Compare commits

683 Commits

Author SHA1 Message Date
Dane Urban
5ac1f12af5 nits 2026-02-12 20:02:34 -08:00
Jessica Singh
6230e36a63 chore(bulk invite): free trial limit (#8378) 2026-02-13 02:03:38 +00:00
Jamison Lahman
7595b54f6b chore(playwright): upload baselines with merge_group jobs (#8410) 2026-02-13 01:41:14 +00:00
Evan Lohn
dc1bb426ee fix: sharepoint cred refresh (#8406)
Co-authored-by: justin-tahara <justintahara@gmail.com>
2026-02-13 01:38:07 +00:00
acaprau
e9a0506183 chore(opensearch): Add profiling information (#8404)
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-02-13 01:18:16 +00:00
Nikolas Garza
4747c43889 feat(auth): enforce seat limits on all user creation paths (#8401) 2026-02-13 00:18:29 +00:00
Jamison Lahman
27e676c48f chore(devtools): ods screenshot-diff for visual regression testing (#8386)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-02-13 00:04:22 +00:00
Justin Tahara
6749f63f09 fix(email): Making sure Email Links go to Default Mail Service (#8395) 2026-02-12 23:43:40 +00:00
Yuhong Sun
e404ffd443 fix: SearXNG works now (#8403) 2026-02-12 23:34:02 +00:00
Justin Tahara
c5b89b86c3 fix(ollama): Passing Context Window through (#8385) 2026-02-12 23:30:33 +00:00
Evan Lohn
84bb3867b2 chore: hide file reader (#8402) 2026-02-12 23:12:42 +00:00
Evan Lohn
92cc1d83b5 fix: flaky no vectordb test (#8400) 2026-02-12 23:12:08 +00:00
Justin Tahara
e92d4a342f fix(ollama): Fixing Content Skipping (#8092) 2026-02-12 22:59:56 +00:00
Wenxi
b4d596c957 fix: remove log error when authtype is not set (#8399) 2026-02-12 22:57:13 +00:00
Nikolas Garza
d76d32003b fix(billing): exclude inactive users from seat counts and allow users page when gated (#8397) 2026-02-12 22:51:24 +00:00
Wenxi
007d2d109f feat(craft): pdf preview and refresh output panel (#8392) 2026-02-12 22:41:11 +00:00
Yuhong Sun
08891b5242 fix: Reminders polluting the query expansion (#8391) 2026-02-12 22:30:35 +00:00
Justin Tahara
846672a843 chore(llm): Additional Model Selection Test (#8389) 2026-02-12 21:53:03 +00:00
roshan
0f362457be fix(craft): craft connector FE nits (#8387) 2026-02-12 21:39:33 +00:00
Wenxi
283e8f4d3f feat(craft): pptx generation, editing, preview (#8383)
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2026-02-12 21:34:22 +00:00
Evan Lohn
fdf19d74bd refactor: github connector (#8384) 2026-02-12 20:33:27 +00:00
roshan
7c702f8932 feat(craft): local file connector (#8304) 2026-02-12 20:23:23 +00:00
Danelegend
3fb06f6e8e feat(search-settings): Add tests + contextual llm validation (#8376) 2026-02-12 20:12:27 +00:00
Jamison Lahman
9fcd999076 chore(devtools): Recommend @playwright/mcp in Cursor (#8380)
Co-authored-by: Evan Lohn <evan@danswer.ai>
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-02-12 11:15:29 -08:00
Wenxi
c937da65c4 chore: make chatbackgrounds local assets for air-gapped envs (#8381) 2026-02-12 18:53:27 +00:00
Raunak Bhagat
abdbe89dd4 fix: Search submission buttons layouts (#8382) 2026-02-12 18:39:43 +00:00
Raunak Bhagat
54f9c67522 feat: Unified Search and Chat (#8106)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-12 17:53:55 +00:00
Raunak Bhagat
31bcdc69ca refactor(opal): migrate IconButton usages to opal Button (#8333)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-12 17:19:23 +00:00
Justin Tahara
b748e08029 chore(llm): Adding Tool Enforcement Tests (#8371) 2026-02-12 14:34:27 +00:00
SubashMohan
11b279ad31 feat(memory): enable memory tool to add or update the memory (#8331) 2026-02-12 09:09:00 +00:00
Yuhong Sun
782082f818 chore: Opensearch tuning (#8374)
Co-authored-by: acaprau <48705707+acaprau@users.noreply.github.com>
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-02-12 09:03:07 +00:00
acaprau
c01b559bc6 feat(opensearch): Admin configuration 2 - Make the retrieval toggle actually do something (#8370) 2026-02-12 09:01:07 +00:00
acaprau
3101a53855 feat(opensearch): Admin configuration 1 - FE migration tab in the admin sidebar, gated by env var (#8365) 2026-02-12 08:32:36 +00:00
acaprau
ce6c210de1 fix(opensearch): Make chunk migration not stop on an exception; also ACL does not raise (#8375) 2026-02-12 08:29:19 +00:00
acaprau
15b372fea9 feat(opensearch): Admin configuration 0 - REST APIs for migration stuff (#8364) 2026-02-12 06:16:23 +00:00
Nikolas Garza
cf523cb467 feat(ee): gate access only when legacy EE flag is set and no license exists (#8368) 2026-02-12 03:36:27 +00:00
Raunak Bhagat
344625b7e0 fix(opal): add padding to Interactive.Container and smooth foldable transitions (#8367) 2026-02-12 02:05:34 +00:00
Justin Tahara
9bf8400cf8 chore(playwright): Setup LLM Provider (#8362) 2026-02-12 02:03:08 +00:00
Evan Lohn
09e86c2fda fix: no vector db tests (#8369) 2026-02-12 01:37:28 +00:00
Justin Tahara
204328d52a chore(llm): Backend Fallback Logic Tests (#8363) 2026-02-12 01:15:15 +00:00
Nikolas Garza
3ce58c8450 fix(ee): follow HTTP→HTTPS redirects in forward_to_control_plane (#8360) 2026-02-12 00:27:49 +00:00
Evan Lohn
67b5df255a feat: minimal deployment mode (#8293) 2026-02-11 23:56:50 +00:00
Raunak Bhagat
33fa29e19f refactor(opal): rename subvariant to prominence, add internal, remove static (#8348)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-11 23:52:44 +00:00
acaprau
787f25a7c8 chore(opensearch): Tuning - Reduce k from 1000 to 50 (#8359) 2026-02-11 23:40:55 +00:00
Justin Tahara
f10b994a27 fix(bedrock): Fixing toolConfig call (#8342) 2026-02-11 23:29:28 +00:00
Danelegend
d4089b1785 chore(search-settings): Remove unused kv search-setting key (#8356) 2026-02-11 23:13:14 +00:00
Jamison Lahman
e122959854 chore(devtools): upgrade ods: 0.5.2->0.5.3 (#8358) 2026-02-11 15:09:25 -08:00
Jamison Lahman
93afb154ee chore(devtools): update ods compose defaults (#8357) 2026-02-11 15:04:16 -08:00
Jamison Lahman
e9be078268 chore(devtools): upgrade ods: 0.5.1->0.5.2 (#8355) 2026-02-11 22:56:32 +00:00
Jamison Lahman
61502751e8 chore(devtools): address missed cubic review (#8353) 2026-02-11 14:40:58 -08:00
Jamison Lahman
cd26893b87 chore(devtools): ods compose defaults ee version (#8351) 2026-02-11 14:35:22 -08:00
Yuhong Sun
90dc6b16fa fix: Metadata file for larger zips (#8327) 2026-02-11 22:16:12 +00:00
Raunak Bhagat
34b48763f4 refactor(opal): update Container height variants, remove paddingVariant (#8350)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-11 21:53:39 +00:00
Jamison Lahman
094d7a2d02 chore(playwright): remove unnecessary global auth checks (#8341) 2026-02-11 21:45:12 +00:00
victoria reese
faa97e92e8 fix: idleReplicaCount should be optional for ScaledObjects (#8344) 2026-02-11 21:09:45 +00:00
Wenxi
358dc32fd2 fix: upgrade plan page nits (#8346) 2026-02-11 21:06:39 +00:00
Justin Tahara
f06465bfb2 chore(admin): Improve Playwright test speeds (#8326) 2026-02-11 20:12:15 +00:00
Raunak Bhagat
8a51b00050 feat(backend): add default_app_mode field to User table (#8291)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-11 19:58:53 +00:00
Justin Tahara
33de6dcd6a fix(anthropic): Model Selection in Multi-Tenant (#8308) 2026-02-11 19:34:49 +00:00
acaprau
fe52f4e6d3 chore(opensearch): Add migration queue to helm chart and launch json (#8336) 2026-02-11 19:16:48 +00:00
Jamison Lahman
51de334732 chore(playwright): remove chromatic (#8339) 2026-02-11 18:50:12 +00:00
dependabot[bot]
cb72f84209 chore(deps): bump pillow from 12.0.0 to 12.1.1 (#8338)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-02-11 18:38:42 +00:00
dependabot[bot]
8b24c08467 chore(deps): bump langchain-core from 0.3.81 to 1.2.11 in /backend/requirements (#8334)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-02-11 18:28:06 +00:00
Wenxi
0a1e043a97 fix(craft): load messages before restore session and feat: timeout restoration operations (#8303) 2026-02-11 18:09:10 +00:00
Raunak Bhagat
466668fed5 feat(opal): add foldable prop to Button + select-variant icon colour (#8300)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-11 16:12:53 +00:00
acaprau
41d105faa0 feat(opensearch): Improved migration task 1 - Completely replace old task logic with new (#8323) 2026-02-11 06:28:59 +00:00
SubashMohan
9e581f48e5 refactor(memory): Refactor memories to use ID-based persistence and new memories UI (#8294) 2026-02-11 06:25:53 +00:00
acaprau
48d8e0955a chore(opensearch): Improved migration task 0 - Schema migrations (#8321) 2026-02-11 04:33:51 +00:00
acaprau
a77780d67e chore(devtools): Add comment in AGENTS.md about the limitations of Celery timeouts with threads (#8257) 2026-02-11 03:38:27 +00:00
Justin Tahara
d13511500c chore(llm): Hardening Fallback Tool Call (#8325) 2026-02-11 02:46:01 +00:00
Wenxi
216d486323 fix: allow basic users to share agents (#8269) 2026-02-11 02:34:07 +00:00
dependabot[bot]
a57d399ba5 chore(deps): bump cryptography from 46.0.3 to 46.0.5 (#8319)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-02-11 02:28:20 +00:00
Nikolas Garza
07324ae0e4 fix(ee): copy license public key into Docker image (#8322) 2026-02-11 02:16:41 +00:00
Wenxi
c8ae07f7c2 feat(craft): narrow file sync to source, prevent concurrent syncs, and use --delete flag on incremental syncs (#8235) 2026-02-11 02:14:40 +00:00
Justin Tahara
f0fd19f110 chore(llm): Adding new Mock LLM Call test (#8290) 2026-02-11 02:07:21 +00:00
Wenxi
6a62406042 chore(craft): bump sandbox limits one last time TM (#8317) 2026-02-11 01:12:52 +00:00
Jamison Lahman
d0be7dd914 chore(deployment): only try to build desktop if semver-like tag (#8316) 2026-02-11 01:03:19 +00:00
Jamison Lahman
6a045db72b chore(devtools): deploy preview frontend builds in CI (#8315) 2026-02-11 00:58:59 +00:00
Wenxi
e5e9dbe2f0 fix: make /health check async (#8314) 2026-02-11 00:54:34 +00:00
Nikolas Garza
50e0a2cf90 feat(slack): add option to include bot messages during indexing (#8309) 2026-02-10 23:10:59 +00:00
Nikolas Garza
50538ce5ac chore(slack): add logging when bot messages are filtered during indexing (#8305) 2026-02-10 22:41:54 +00:00
Raunak Bhagat
6fab7103bf fix(opal): extract interactive container styles to CSS (#8307)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 21:40:40 +00:00
Justin Tahara
be6bd22c4c fix(embedding): Updating Masking Logic (#8301) 2026-02-10 20:01:20 +00:00
Jamison Lahman
47ca5a2733 chore(tests): use CE backend for model_server tests (#8296) 2026-02-10 18:51:08 +00:00
Justin Tahara
ee8ce366c7 fix(vertex): Updating masking workflow (#8299) 2026-02-10 18:36:29 +00:00
Jamison Lahman
a924b49405 chore(playwright): improve preflight checks and setup (#8283) 2026-02-10 08:09:54 -08:00
SubashMohan
2d2d998811 feat(memory): add user preferences and structured user context in system prompt (#8264) 2026-02-10 04:20:42 +00:00
SubashMohan
0925b5fbd4 fix(chatpage): Improve agent message layout, sidebar nesting, and icon fixes (#8224) 2026-02-10 04:06:31 +00:00
Wenxi
a02d8414ee chore(craft): update demo dataset and add sandbox image readme (#8059) 2026-02-10 03:37:00 +00:00
SubashMohan
c8abc4a115 fix(timeline): reduce agent message re-renders with referential stability in usePacedTurnGroups (#8265) 2026-02-10 03:18:53 +00:00
Nikolas Garza
cec37bff6a feat(ee): Enable license enforcement by default (#8270) 2026-02-10 02:48:55 +00:00
Nikolas Garza
06d5d3971b feat(chat): dynamic bottom spacer for fresh-chat push-up effect (#8285) 2026-02-10 02:04:01 +00:00
Justin Tahara
ed287a2fc0 chore(ollama): Sort model names (#8288) 2026-02-10 02:03:32 +00:00
Raunak Bhagat
60857d1e73 refactor(opal): select variant, transient/selected separation, OpenButton chevron fix (#8284)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-10 01:20:17 +00:00
Yuhong Sun
bb5c22104e chore: Better enforcement of masking (#7967)
Co-authored-by: justin-tahara <justintahara@gmail.com>
2026-02-10 00:41:11 +00:00
Jamison Lahman
03d919c918 chore(devtools): upgrade ods: 0.5.0->0.5.1 (#8279) 2026-02-09 23:31:54 +00:00
Justin Tahara
71d2ae563a fix(posthog): Chat metrics for Cloud (#8278) 2026-02-09 22:58:37 +00:00
Jamison Lahman
19f9c7357c chore(devtools): ods logs, ods pull, ods compose --force-recreate (#8277) 2026-02-09 22:51:01 +00:00
acaprau
f8fa5b243c chore(opensearch): Try to create OpenSearchTenantMigrationRecord earlier in check_for_documents_for_opensearch_migration_task (#8260) 2026-02-09 22:13:43 +00:00
dependabot[bot]
5f845c208f chore(deps-dev): bump pytest-xdist from 3.6.1 to 3.8.0 in /backend (#8120)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-02-09 22:06:38 +00:00
Raunak Bhagat
d8595f8de0 refactor(opal): add new Button component built on Interactive.Base (#8263)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-09 21:52:13 +00:00
dependabot[bot]
5b00d1ef9c chore(deps-dev): bump faker from 37.1.0 to 40.1.2 in /backend (#8126)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-02-09 21:46:06 +00:00
dependabot[bot]
41b6ed92a9 chore(deps): bump docker/login-action from 3.6.0 to 3.7.0 (#8275)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 21:26:07 +00:00
dependabot[bot]
07f35336ad chore(deps): bump @modelcontextprotocol/sdk from 1.25.3 to 1.26.0 in /backend/onyx/server/features/build/sandbox/kubernetes/docker/templates/outputs/web (#8166)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 13:20:12 -08:00
dependabot[bot]
4728bb87c7 chore(deps): bump @isaacs/brace-expansion from 5.0.0 to 5.0.1 in /backend/onyx/server/features/build/sandbox/kubernetes/docker/templates/outputs/web (#8139)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 13:20:04 -08:00
dependabot[bot]
adfa2f30af chore(deps): bump actions/cache from 4.3.0 to 5.0.3 (#8273)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 13:19:41 -08:00
dependabot[bot]
9dac4165fb chore(deps): bump actions/setup-python from 6.1.0 to 6.2.0 (#8274)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 13:18:26 -08:00
dependabot[bot]
7d2ede5efc chore(deps): bump protobuf from 6.33.4 to 6.33.5 in /backend/requirements (#8182)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-02-09 21:04:34 +00:00
dependabot[bot]
4592f6885f chore(deps): bump python-multipart from 0.0.21 to 0.0.22 in /backend/requirements (#7831)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-02-09 20:44:54 +00:00
Evan Lohn
9dc14fad79 chore: disable hiernodes when opensearch not available (#8271) 2026-02-09 20:32:47 +00:00
dependabot[bot]
ff6e471cfb chore(deps): bump actions/setup-node from 4.4.0 to 6.2.0 (#8122)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 20:31:35 +00:00
dependabot[bot]
09b9443405 chore(deps): bump bytes from 1.11.0 to 1.11.1 in /desktop/src-tauri (#8138)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 12:34:07 -08:00
dependabot[bot]
14cd6d08e8 chore(deps): bump webpack from 5.102.1 to 5.105.0 in /web (#8199)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-02-09 12:18:29 -08:00
dependabot[bot]
5ee16697ce chore(deps): bump time from 0.3.44 to 0.3.47 in /desktop/src-tauri (#8187)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 12:17:40 -08:00
dependabot[bot]
b794f7e10d chore(deps): bump actions/upload-artifact from 4.6.2 to 6.0.0 (#8121)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 12:14:25 -08:00
dependabot[bot]
bb3275bb75 chore(deps): bump actions/checkout from 6.0.1 to 6.0.2 (#8123)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-02-09 12:13:37 -08:00
roshan
7644e225a5 fix(chrome extension): Simplify NRFPage ChatInputBar layout to use normal flex flow (#8267)
Co-authored-by: Claude <noreply@anthropic.com>
2026-02-09 18:12:54 +00:00
roshan
811600b84a fix(craft): snapshot restore (#8194) 2026-02-09 18:00:07 +00:00
Jamison Lahman
40ce8615ff fix(login): window undefined on login (#8266) 2026-02-09 17:55:05 +00:00
Justin Tahara
0cee3f6960 chore(llm): Introduce Scaffolding for Integration Tests (#8251) 2026-02-09 17:26:15 +00:00
acaprau
8883e5608f chore(chat frontend): Round up in formatDurationSeconds so we don't see "Thought for 0s" (#8259) 2026-02-09 07:54:39 +00:00
acaprau
7c2f3ded44 fix(opensearch): Tighten up task timing (#8256) 2026-02-09 07:53:44 +00:00
Raunak Bhagat
aa094ce1f0 refactor(opal): interactive base variant types + foreground color system (#8255)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-09 07:26:05 +00:00
Evan Lohn
4b0c800db7 feat: postgres file store (#8246) 2026-02-09 04:28:42 +00:00
acaprau
8386742c10 fix(profiling): log_function_time should use time.monotonic not time.time (#8258) 2026-02-09 04:11:50 +00:00
Evan Lohn
f2e5e4f040 feat: jwt-based auth (#8244) 2026-02-09 00:29:56 +00:00
Yuhong Sun
c0498cf2fc fix: Deep Research Agent Cycle Count (#8254) 2026-02-08 22:27:33 +00:00
acaprau
954ee1706a chore(opensearch): Improve ordering of migration records that we query (#8248) 2026-02-08 19:51:52 +00:00
SubashMohan
0745765a56 refactor(chat): agent timeline layout and spacing changes (#8226) 2026-02-07 08:40:16 +00:00
Jessica Singh
10feb6ae77 chore(auth): anon fix (#8222) 2026-02-07 06:58:54 +00:00
Danelegend
f5b170af1e chore(provider config): llm provider config prefers LLMModelFlow (#8064) 2026-02-07 03:37:56 +00:00
Jessica Singh
2d2f252e95 fix(web search): strictly typed provider config (#8022) 2026-02-07 00:35:33 +00:00
Evan Lohn
a05f304960 fix: column overlap typing (#8247) 2026-02-07 00:34:34 +00:00
acaprau
7ce5120302 chore(opensearch): Make indexing use the client's new bulk index API (#8238) 2026-02-06 22:07:27 +00:00
Evan Lohn
2d8f864251 fix: metadata hardening (#8201) 2026-02-06 21:57:28 +00:00
acaprau
3b48c2104b fix(opensearch): Allow update to skip if a doc chunk is not found in OpenSearch, or if chunk count is not known (#8236) 2026-02-06 21:47:01 +00:00
Nikolas Garza
a9ec6a2434 fix(settings): default ee_features_enabled to False (#8237) 2026-02-06 20:59:49 +00:00
Nikolas Garza
e85575c6cc fix: make it more clear how to add channels to fed slack config form (#8227) 2026-02-06 18:35:34 +00:00
Danelegend
c966c81e8a fix(llm): LLM override can fail if admin (#8204) 2026-02-06 18:28:31 +00:00
Jamison Lahman
a0d6ebe66d chore(migrations): database migration runner (#8217) 2026-02-06 18:03:03 +00:00
Wenxi
d75b501a1f fix(craft): upload to s3 before marking docs as indexed in db (#8216) 2026-02-06 17:55:18 +00:00
Nikolas Garza
89dd44bee8 fix(db): null out document set and persona ownership on user deletion (#8219) 2026-02-06 17:08:22 +00:00
Justin Tahara
c5451ffe53 fix(ui): Inconsistent LLM Provider Logo (#8220) 2026-02-06 04:38:44 +00:00
Yuhong Sun
85da1d85ce fix: LiteLLM for OpenAI compatible models not using Responses route (#8215) 2026-02-06 03:58:34 +00:00
acaprau
00d90c5e27 chore(opensearch): Add timing and debug logging in the OpenSearch client; also expand log_function_time (#8209) 2026-02-06 03:31:44 +00:00
Nikolas Garza
ea7654e4b8 feat(ee): block Slack bot for suspended tenants and enforce seat limits (#8202) 2026-02-06 03:28:34 +00:00
Yuhong Sun
eb90775e42 feat: basic langfuse tracing + tracing consolidation (#8207) 2026-02-06 03:28:28 +00:00
Nikolas Garza
75865fcdfd feat: support PEM-style delimiters in license file uploads (#7559) 2026-02-06 02:45:11 +00:00
acaprau
d50dc8fa68 feat(opensearch): Support bulk indexing (#8203) 2026-02-06 02:40:50 +00:00
Justin Tahara
39b96973ec chore(openai): Add test for Chat Models (#8213) 2026-02-06 02:21:03 +00:00
Justin Tahara
a342c4d848 fix(openai): Set Auto Reasoning effort to Medium (#8211) 2026-02-06 01:40:49 +00:00
Yuhong Sun
7c084a35b6 fix: GPT -chat models (#8210) 2026-02-06 00:47:23 +00:00
Raunak Bhagat
946eba5ba5 feat(opal): expand InteractiveBase variant system (#8200)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-06 00:13:41 +00:00
Wenxi
ec4f85f4a4 chore: bump sandbox cpu and memory limits (#8208) 2026-02-05 15:32:39 -08:00
Jamison Lahman
d8fd6d398e chore(ruff): enable flake8's unused arg rules (#8206) 2026-02-05 23:25:07 +00:00
Wenxi
ef85a14b6e fix: mt provisioning rollback and add tests (#8205) 2026-02-05 23:15:36 +00:00
Jamison Lahman
97b44b530e chore(devtools): CLAUDE.md.template -> AGENTS.md (#8197) 2026-02-05 22:46:37 +00:00
Justin Tahara
e05a34cad3 chore(chat): Cleaning Error Codes + Tests (#8186) 2026-02-05 21:13:30 +00:00
Yuhong Sun
d80a4270cb fix: LLM Read Timeout (#8193) 2026-02-05 20:51:37 +00:00
Justin Tahara
a26b4ff888 fix(agents): Removing Label Dependency (#8189) 2026-02-05 20:41:44 +00:00
Wenxi
185d2bb813 feat: recommend opus 4-6 (#8198) 2026-02-05 20:36:12 +00:00
Justin Tahara
d5b64e8472 chore(openai): Add Reasoning Specific Test (#8195) 2026-02-05 20:24:18 +00:00
Justin Tahara
378a216af3 fix(ci): Model Check update (#8196) 2026-02-05 20:17:06 +00:00
Jamison Lahman
2c002c48f7 chore(ruff): move config up a level (#8192) 2026-02-05 20:11:54 +00:00
Jamison Lahman
9c20549e58 chore(devtools): upgrade ods: 0.4.1->0.5.0 (#8190) 2026-02-05 20:08:03 +00:00
Evan Lohn
ffd30ae72a chore: bump default usage limits (#8188) 2026-02-05 19:43:46 +00:00
Wenxi
e18496dfa7 fix: don't run craft setup script unless it exists (#8191) 2026-02-05 19:39:19 +00:00
roshan
560a78a5d0 fix(craft): file upload (#8149) 2026-02-05 19:18:39 +00:00
Wenxi
10bc398746 refactor(craft): chad s5cmd > chud aws cli (mem overhead + speed) (#8170) 2026-02-05 18:58:32 +00:00
Jamison Lahman
9356f79461 chore(devtools): ods compose to start containers (#8185) 2026-02-05 18:44:31 +00:00
Raunak Bhagat
e246b53108 feat(opal): extract Hoverable into Interactive atom (#8173)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-05 18:32:49 +00:00
Justin Tahara
26533d58e2 fix(openai): Fix reasoning (#8183) 2026-02-05 18:27:13 +00:00
SubashMohan
a32f27f4c8 feat(CommandMenu): add comprehensive tests (#8159) 2026-02-05 17:57:47 +00:00
Yuhong Sun
413a96f138 chore: DR Prompt Tuning (#8180) 2026-02-05 07:02:07 +00:00
Yuhong Sun
73a6721886 chore: Slight tweaks of DR (#8179) 2026-02-04 21:07:27 -08:00
Evan Lohn
01872a7196 fix(salesforce): cleanup logic (#8175) 2026-02-05 02:50:08 +00:00
Evan Lohn
0ba1f715f2 feat(filesys): disabled sections (#8153) 2026-02-05 02:27:47 +00:00
Yuhong Sun
94d0dc0ffe chore: DR description for GA (#8178) 2026-02-05 01:22:04 +00:00
Yuhong Sun
039daa0027 chore: Sanitize LLM tool call args (#8177) 2026-02-05 01:14:05 +00:00
Yuhong Sun
62b1c55494 fix: Anthropic DR requires setting reasoning limit if we want to set output limit (#8168) 2026-02-05 00:39:58 +00:00
Nikolas Garza
1800d4b9d7 fix(db): add cascade delete to search_query user_id foreign key (#8176) 2026-02-05 00:34:28 +00:00
Justin Tahara
5ed2d78471 fix(ui): Additional LLM Config update (#8174) 2026-02-04 23:16:01 +00:00
Justin Tahara
ff28dc9c72 fix(ci): Allow for flexible beta tag (#8171) 2026-02-04 22:17:12 +00:00
Raunak Bhagat
e88a7ac868 fix: Fix expansion error inside of TextView (#8151) 2026-02-04 21:50:40 +00:00
Justin Tahara
79c1bbe666 fix(ci): Notification workflow for Slack (#8167) 2026-02-04 21:38:04 +00:00
Jessica Singh
b1168d4526 chore(chat compress): create readme (#8165) 2026-02-04 21:31:18 +00:00
Wenxi
21751b2cf2 fix(craft): bump aws sync concurrent requests 10-->200 (#8163) 2026-02-04 19:54:35 +00:00
Evan Lohn
cb33263ef0 feat(filesys): sorting attached knowledge (#8156) 2026-02-04 18:56:28 +00:00
Justin Tahara
9f9a68f2eb fix(ci): Fix Bedrock Test (#8161) 2026-02-04 17:45:24 +00:00
SubashMohan
9c09c07980 test(timeline): add unit tests for packet processor (#8135) 2026-02-04 09:41:01 +00:00
Yuhong Sun
9aaac7f1ad chore: firecrawl v2 (#8155) 2026-02-03 23:12:34 -08:00
SubashMohan
8b2071a3ae fix(timeline): consolidate header components and visual fixes (#8133) 2026-02-04 06:57:36 +00:00
Evan Lohn
733d55c948 feat(filesys): sharepoint v1 (#8130) 2026-02-04 05:15:26 +00:00
Evan Lohn
1498238c43 feat(filesys): slack connector (#8118) 2026-02-04 04:48:43 +00:00
Evan Lohn
f0657dc1a3 feat(filesys): jira hierarchy v1 (#8113) 2026-02-04 04:48:31 +00:00
SubashMohan
96e71c496b fix(ChatSearchCommandMenu): improve keyboard navigation and search UX (#8134) 2026-02-04 03:20:18 +00:00
Yuhong Sun
db4e1dc1a3 fix: Give more helpful message to LLM on bad tool calls (#8150) 2026-02-04 01:51:39 +00:00
acaprau
bce5f0889f chore(document index): Remove offset (#8148) 2026-02-04 00:55:53 +00:00
acaprau
fa2f4e781a feat(opensearch): Implement admin and random retrieval; fully deprecate update in the old interface; relax update restrictions (#8142) 2026-02-04 00:26:16 +00:00
Yuhong Sun
abdb683584 fix: Back off to basic auth (#8146) 2026-02-03 15:28:53 -08:00
Yuhong Sun
b7b4737b05 chore: Remove auth log (#8145) 2026-02-03 15:22:03 -08:00
Yuhong Sun
3f9b143429 feat: Track reasoning in Braintrust (#8143) 2026-02-03 15:07:26 -08:00
Wenxi
dbf08a3483 fix(craft): pod restoration race, recovery from unexpected state, and updating heartbeat on session creation (#8140) 2026-02-03 22:47:16 +00:00
Nikolas Garza
43e2e7c69c fix(auth): redirect to login page after email verification (#8137) 2026-02-03 22:35:29 +00:00
Yuhong Sun
1da20bc240 fix: DR time based wrap ups (#8141) 2026-02-03 14:39:58 -08:00
acaprau
58b376d7b7 feat(opensearch): Add tenant ID to the document chunk ID (#8129) 2026-02-03 19:41:19 +00:00
Wenxi
23e47a48e1 fix(craft): wrong usage limit string (#8136) 2026-02-03 19:24:16 +00:00
acaprau
cda5b00174 feat(opensearch): String filtering (#8110) 2026-02-03 17:54:54 +00:00
acaprau
6f4ababb11 chore(opensearch): Some small cleanup around update (#8119) 2026-02-03 17:52:06 +00:00
Nikolas Garza
e90656efbe feat(ee): updating billing api (#8073) 2026-02-03 08:26:44 -08:00
Jessica Singh
b3803808e0 feat(chat history): summarize older messages (#7810)
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-02-03 00:35:23 -08:00
SubashMohan
f5415bace6 feat(timeline): QA fixes around renderers (#8132) 2026-02-03 06:34:16 +00:00
Nikolas Garza
b255297365 feat(ee): fe - enable new billing UI (5/5) (#8072) 2026-02-03 05:11:37 +00:00
Yuhong Sun
5463d6aadc feat: LLM history now allows parallel tool call for a single message (#8131) 2026-02-03 04:05:17 +00:00
roshan
b547d487c1 fix(craft): CREEPA? AW MAN (#8115) 2026-02-03 02:47:05 +00:00
Nikolas Garza
18821b612b feat(ee): fe - update billing hooks and services (4/5) (#8071) 2026-02-03 02:12:35 +00:00
Yuhong Sun
2368cef307 fix: LiteLLM in threading context (#8103)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-03 01:58:08 +00:00
Nikolas Garza
668cc71be4 feat(ee): fe - add BillingDetailsView component (3/5) (#8070) 2026-02-03 01:39:27 +00:00
Justin Tahara
09f3ad8985 feat(ui): Selecting all Models toggle (#8125) 2026-02-03 01:28:09 +00:00
Nikolas Garza
38e88c7b5c feat(ee): fe - add CheckoutView component (2/5) (#8069) 2026-02-03 01:15:34 +00:00
Justin Tahara
cc7bfdbcde fix(ui): Model Selection in Place (#8128) 2026-02-03 00:46:46 +00:00
Nikolas Garza
0e3c511974 feat(ee): fe - add new billing page structure with PlansView (#8068) 2026-02-03 00:45:03 +00:00
Wenxi
9606461ba0 fix(craft): phantom pre-provisioned sandboxes and poll for fresh session on welcome page (#8124) 2026-02-03 00:26:45 +00:00
Yuhong Sun
d01fcbbf7a chore: LiteLLM bump version (#8114) 2026-02-03 00:14:39 +00:00
Evan Lohn
325a38e502 feat(filesys): selected info improvements (#8117) 2026-02-03 00:10:28 +00:00
Evan Lohn
3916556397 fix: perm sync group prefixing (#8077) 2026-02-02 23:44:17 +00:00
Danelegend
a7edcd6880 feat(provider): create flow mapping table (#8025) 2026-02-02 23:41:53 +00:00
Justin Tahara
f18f0ffd96 feat(helm): Add Probes for All (#8112) 2026-02-02 23:05:01 +00:00
Justin Tahara
06c060bb1f fix(ui): Search Connectors (#8116) 2026-02-02 23:04:53 +00:00
roshan
94ebe9e221 fix(craft): fix default dockerfile outputs_template_path and venv_template_path (#8102) 2026-02-02 14:28:08 -08:00
Justin Tahara
99c9c378cd chore(no-auth): Clean up Playwright (#8109) 2026-02-02 12:19:54 -08:00
victoria reese
df7ab6841a fix: resolve pod label duplication (#8098)
Co-authored-by: victoria-reese_wwg <victoria.reese@grainger.com>
2026-02-02 18:45:00 +00:00
Raunak Bhagat
2131c86c16 refactor: More app header cleanups (#8097)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-02 18:36:16 +00:00
acaprau
7d1b9e4356 chore(opensearch): Migration 2 - Introduce external dependency tests (#8045)
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-02-02 17:35:49 +00:00
SubashMohan
38e92308ec feat(chat): new agent timeline blocks (#8101) 2026-02-02 14:53:20 +00:00
Danelegend
2444b59070 chore(provider): add more integration tests for provider flow (#8099) 2026-02-01 19:35:55 -08:00
Yuhong Sun
49771945e1 chore: DR to run more than 1 cycle typically (#8100) 2026-02-01 17:17:50 -08:00
Justin Tahara
15f0bc9c3d fix(ui): Agent Saving with other people files (#8095) 2026-02-01 22:38:45 +00:00
Justin Tahara
963b172a09 fix(ui): Cleanup Card Span (#8094) 2026-02-01 22:26:25 +00:00
Justin Tahara
dc2bf20a8d fix(ui): Ollama Model Selection (#8091) 2026-02-01 21:29:57 +00:00
Raunak Bhagat
d29f1efec0 refactor: hooks (#8089)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-01 20:26:40 +00:00
Justin Tahara
13d1c3d86a feat(desktop): Ensure that UI reflects Light/Dark Toggle (#7684) 2026-02-01 20:03:42 +00:00
Wenxi
adc6773f9f fix(craft): attempt to solve hanging with explicit k8s_stream timeout (#8066) 2026-02-01 17:51:25 +00:00
Raunak Bhagat
a819482749 refactor: Update Hoverable to be more faithful to the mocks (#8083) 2026-02-01 08:53:53 +00:00
Raunak Bhagat
f660f9f447 refactor: Update InputSelect implementation (#8076) 2026-02-01 08:53:43 +00:00
Yuhong Sun
26f9574364 chore: Web query sanitize (#8085) 2026-01-31 23:44:55 -08:00
Yuhong Sun
9fa17c7713 chore: remove long term log (#8084) 2026-01-31 23:42:47 -08:00
Yuhong Sun
1af484503e chore: ASCII in docs (#8082) 2026-01-31 23:16:38 -08:00
Yuhong Sun
55276be061 chore: ensure ascii false (#8081) 2026-01-31 23:11:53 -08:00
Yuhong Sun
4bb02459ae chore: DR tool tuning (#8080) 2026-01-31 22:57:58 -08:00
Yuhong Sun
7109aea897 chore: OpenURL sometimes gives too many tokens (#8079) 2026-01-31 22:23:33 -08:00
Yuhong Sun
8ce4cfc302 chore: Tune web search (#8078) 2026-01-31 21:29:12 -08:00
Yuhong Sun
0f75de9687 chore: hopefully help LLM not spam web queries (#8075) 2026-01-31 20:21:22 -08:00
acaprau
9782fcb0b9 feat(opensearch): Migration 1 - Introduce and implement migration tasks (#8014)
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-02-01 03:35:34 +00:00
Yuhong Sun
ec2a35b3a4 chore: DR edge case (#8074) 2026-01-31 19:26:07 -08:00
Raunak Bhagat
9815c2c8d9 refactor: Clean up app page rendering logic (#8060) 2026-02-01 03:15:21 +00:00
acaprau
8c3e3a6e02 chore(tests): Fix name for test_expire_oauth_token, loosen timing bounds a bit (#8067) 2026-02-01 03:04:49 +00:00
acaprau
726c6232a5 feat(opensearch): Migration 0 - Introduce db tables, alembic migration, db model utils (#8013) 2026-02-01 02:36:16 +00:00
Evan Lohn
f9d41ff1da feat(filesys): initial confluence hierarchy impl (#7932) 2026-02-01 02:28:55 +00:00
Evan Lohn
eb3eb83c95 chore: ban chat-tempmail (#8063) 2026-02-01 01:59:49 +00:00
Evan Lohn
e4b9ef176f fix: attaching user files to assistant (#8061) 2026-02-01 01:33:22 +00:00
trial2onyx
d18dd62641 fix(chat): reduce scroll container bottom margin (#8048)
Co-authored-by: Onyx Trialee 2 <onyxtrial2@Onyxs-MBP.attlocal.net>
Co-authored-by: Cursor <cursoragent@cursor.com>
2026-02-01 00:26:48 +00:00
Yuhong Sun
96224164ca chore: Some settings for DR for evals (#8058) 2026-01-31 15:44:39 -08:00
Wenxi
78cec7c9e9 refactor(craft): make usage limit overrides feature flags instead of env vars (#8056) 2026-01-31 23:35:14 +00:00
Wenxi
8fa7002826 chore(craft): bump sandbox image default value (#8055) 2026-01-31 23:10:53 +00:00
Nikolas Garza
921305f8ff feat(billing): add circuit breaker, license re-claim, and seats to checkout (#8005) 2026-01-31 22:18:27 +00:00
Raunak Bhagat
71148dd880 refactor: Consolidate duplicated AppHeader components into one (#8054)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-31 22:07:57 +00:00
Nikolas Garza
ac26ba6c2d chore: remove license cache invalidation from multi-tenant (#8052) 2026-01-31 19:47:34 +00:00
Raunak Bhagat
24584d4067 fix: Consolidate providers into one central location (#8032) 2026-01-31 13:46:10 +00:00
Wenxi
39d8d1db0c fix: optional dependency for /me (#8042) 2026-01-31 03:06:01 +00:00
trial2onyx
17824c5d92 refactor(chat): move loading indicator to content area (#8039)
Co-authored-by: Onyx Trialee 2 <onyxtrial2@Onyxs-MBP.attlocal.net>
2026-01-31 02:23:15 +00:00
roshan
eba89fa635 fix(craft): idle sandbox cleanup (#8041) 2026-01-31 02:20:12 +00:00
Nikolas Garza
53f4025a23 feat(components): add InputNumber with increment/decrement controls (#8003) 2026-01-31 01:17:38 +00:00
Wenxi
9159b159fa fix: troll discord assertion (#8038) 2026-01-31 00:46:48 +00:00
Jamison Lahman
d7a22b916b fix(fe): polish chat UI with custom background (#8016)
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-01-31 00:25:59 +00:00
roshan
97d90a82f8 fix(craft): files stuff (#8037) 2026-01-31 00:16:33 +00:00
Nikolas Garza
d9cf5afee8 fix(ee): use set(ex=) instead of setex() for license cache updates (#8004) 2026-01-30 16:16:40 -08:00
Wenxi
ce43dee20f fix: discord connector tests (#8036) 2026-01-30 23:32:09 +00:00
Justin Tahara
90ac23a564 fix(ui): Updating Dropdown Modal component (#8033) 2026-01-30 23:00:52 +00:00
Jamison Lahman
d9f97090d5 chore(gha): build desktop app in CI (#7996) 2026-01-30 22:54:28 +00:00
Raunak Bhagat
2661e27741 feat: Add new tag icon (#8029) 2026-01-30 22:33:10 +00:00
Wenxi
0481b61f8d refactor: craft onboarding ease (#8030) 2026-01-30 22:28:03 +00:00
roshan
6d12c9c430 fix(craft): clear env vars from all sandboxes in file_sync pods (#8028) 2026-01-30 22:05:57 +00:00
Justin Tahara
b81dd6f4a3 fix(desktop): Remove Global Shortcuts (#7914) 2026-01-30 21:19:55 +00:00
Justin Tahara
f9a648bb5f fix(asana): Workspace Team ID mismatch (#7674) 2026-01-30 20:52:21 +00:00
Raunak Bhagat
e9be9101e5 fix: Add explicit sizings to icons (#8018) 2026-01-30 20:48:14 +00:00
Danelegend
e670bd994b feat(persona): Add default_model_configuration_id column (#8020) 2026-01-30 20:44:03 +00:00
Chris Weaver
a48d74c7fd fix: onboarding model specification (#8019) 2026-01-30 19:57:11 +00:00
Evan Lohn
0e76ae3423 feat: notion connector hierarchynodes (#7931) 2026-01-30 19:28:34 +00:00
Evan Lohn
37bfa5833b fix: race conditions in drive hiernodes (#8017) 2026-01-30 18:30:05 +00:00
Wenxi
6c46fcd651 chore: dev env template defaults (#8015) 2026-01-30 18:05:36 +00:00
roshan
7700674b15 chore: launch.json web server uses .env.web (#7993) 2026-01-30 17:36:32 +00:00
Evan Lohn
4ac6ff633a feat(filesys): working filesys explorer (#7760) 2026-01-30 12:14:56 +00:00
Raunak Bhagat
efd198072e refactor: Update layout components and SettingsPage (#8008)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 11:29:22 +00:00
Evan Lohn
b207a165c7 feat(filesys): UI for selecting hierarchy in assistant creation part 1 (#7721) 2026-01-30 10:36:51 +00:00
Raunak Bhagat
c231d2ec67 refactor: Update hoverable (#8007)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 09:58:26 +00:00
Danelegend
d1a0c75a40 fix(llm): existing custom config not used (#8002) 2026-01-30 07:47:59 +00:00
Evan Lohn
3b737fe311 feat(filesys): filter on assistant info (#7852) 2026-01-30 06:51:00 +00:00
Evan Lohn
e7abbbdc7f feat(filesys): APIs for attaching hiernodes (#7698) 2026-01-30 06:02:25 +00:00
Raunak Bhagat
5d5080e9e1 feat: Add bottomSlot to modal API (#8000) 2026-01-30 04:56:33 +00:00
Jamison Lahman
83b7c5d088 chore(devserver): fix invalid customTheme require (#8001) 2026-01-30 04:53:03 +00:00
Danelegend
f08cdc603b fix(vertex): standardise vertex image config (#7988) 2026-01-30 04:50:54 +00:00
Raunak Bhagat
6932791dd5 refactor: Add a HoverableContainer (#7997) 2026-01-30 03:46:41 +00:00
acaprau
f334b365e0 hygiene(opensearch): Some cleanup (#7999) 2026-01-29 18:42:30 -08:00
Evan Lohn
af58ae5ad9 endpoint clean (#7998) 2026-01-29 18:40:45 -08:00
Raunak Bhagat
bcd8314dd7 refactor: Small tweaks to a few components (#7995) 2026-01-30 01:30:13 +00:00
Raunak Bhagat
cddb26ff19 feat: Add new star icon + rename icon file with invalid naming (#7992) 2026-01-30 01:29:47 +00:00
roshan
c8d38de37f fix(ce): documents sidebar spawns (#7994) 2026-01-30 00:55:07 +00:00
Jamison Lahman
f2e95ee8bb chore(deps): Bump mdast-util-to-hast from 13.2.0 to 13.2.1 in /web (#7991) 2026-01-30 00:50:24 +00:00
Jamison Lahman
94ee45ce64 chore(flags): rm unused NEXT_PUBLIC_ENABLE_CHROME_EXTENSION (#7983) 2026-01-30 00:35:22 +00:00
Jamison Lahman
f36d15d924 chore(flags): remove unused NEXT_PUBLIC_DEFAULT_SIDEBAR_OPEN (#7984) 2026-01-30 00:35:08 +00:00
Jamison Lahman
ec866debc0 chore(deps): Bump @sentry/nextjs from 10.23.0 to 10.27.0 in /web (#7990) 2026-01-30 00:19:29 +00:00
dependabot[bot]
08f80b4abf chore(deps): bump starlette from 0.47.2 to 0.49.3 in /backend/requirements (#5964)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-30 00:07:31 +00:00
Raunak Bhagat
e559a4925a refactor: Add expandable card layouts with smooth animations (#7981) 2026-01-29 15:37:45 -08:00
Justin Tahara
1f63a23238 fix(helm): Fixing PSQL Operator Labeling (#7985) 2026-01-29 23:13:20 +00:00
Evan Lohn
658c76dd0a fix: custom config (#7987) 2026-01-29 23:01:16 +00:00
Jamison Lahman
00828af63f chore(fe): update baseline-browser-mapping (#7986) 2026-01-29 22:55:22 +00:00
victoria reese
71c6e40d5e feat: enable optional host setting (#7979)
Co-authored-by: victoria-reese_wwg <victoria.reese@grainger.com>
2026-01-29 21:36:59 +00:00
Jessica Singh
f3ff4b57bd feat(auth): update default auth (#7443)
Co-authored-by: Dane Urban <danelegend13@gmail.com>
2026-01-29 12:57:24 -08:00
Jamison Lahman
bf1752552b chore(tests): add retries to azure embeddings daily test (#7978) 2026-01-29 20:42:10 +00:00
Raunak Bhagat
5a9f9e28dc refactor: Consolidate Label component (#7974) 2026-01-29 19:52:39 +00:00
Wenxi
655cfc4858 fix: input masking (#7977) 2026-01-29 18:10:29 +00:00
Wenxi
b26c2e27b2 fix: don't show intro anim with new tenant modal + usage (#7976) 2026-01-29 17:57:45 +00:00
Evan Lohn
305a667bf9 test(filesys): drive hierarchynodes (#7676) 2026-01-29 17:45:03 +00:00
Wenxi
6bc5b083d5 feat(craft): make last name optional in user info form (#7973)
Co-authored-by: Claude <noreply@anthropic.com>
2026-01-29 16:06:34 +00:00
Raunak Bhagat
31213d43b3 refactor: Edit SimpleCollapsible API and update stylings for Modal (#7971) 2026-01-29 00:51:57 -08:00
roshan
a9e79b45cc feat(craft): README (#7970)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-01-28 22:59:12 -08:00
Evan Lohn
936ce0535d fix: llm provider upserts (#7969) 2026-01-29 06:33:42 +00:00
Raunak Bhagat
165710b5d6 fix: Edit styling (#7968) 2026-01-28 22:18:56 -08:00
roshan
c2ab9ca2a2 fix(craft): RESTORING WORKS (#7966) 2026-01-28 20:06:51 -08:00
roshan
3bcdeea560 fix(craft): PROMPT IMPROVEMENTS (#7961) 2026-01-28 19:16:58 -08:00
Yuhong Sun
31200a1b41 chore: Remove Reranking (#7946) 2026-01-29 01:26:26 +00:00
Nikolas Garza
a6261d57fd feat(ee): fe - add billing hooks and actions (#7858) 2026-01-29 01:19:44 +00:00
Wenxi
4c5e65e6dd fix(craft): auto set best model instead of checking for visibility (#7962) 2026-01-29 00:29:05 +00:00
Chris Weaver
e70115d359 fix: improve termination (#7964) 2026-01-28 16:19:36 -08:00
Raunak Bhagat
eec188f9d3 refactor: Make AgentCard use LineItemLayout for its information instead (#7958) 2026-01-29 00:10:18 +00:00
Chris Weaver
0504335a7b fix: local indexing for craft (#7959) 2026-01-28 16:12:25 -08:00
Wenxi
f5186b5e44 refactor: craft onboarding nit and connector docs (#7960) 2026-01-28 23:49:33 +00:00
Wenxi
8e3d4e1474 refactor(craft): fix pre-provisioning state management, fix demo data state management (#7955) 2026-01-28 22:59:21 +00:00
dependabot[bot]
474fb028b0 chore(deps): bump lodash-es from 4.17.21 to 4.17.23 in /web (#7652)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-28 22:51:58 +00:00
dependabot[bot]
d25e773b0e chore(deps): Bump mistune from 0.8.4 to 3.1.4 in /backend (#6407)
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-28 22:48:06 +00:00
dependabot[bot]
c5df9d8863 chore(deps): bump lodash from 4.17.21 to 4.17.23 in /web (#7670)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-28 22:32:52 +00:00
dependabot[bot]
28eabdc885 chore(deps): bump esbuild and vite in /widget (#7543)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-28 14:33:37 -08:00
dependabot[bot]
72f34e403c chore(deps): bump astral-sh/setup-uv from 7.1.5 to 7.2.0 (#7528)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-28 14:31:13 -08:00
dependabot[bot]
8037dd2420 chore(deps): bump actions/checkout from 6.0.1 to 6.0.2 (#7802)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-28 14:30:52 -08:00
Justin Tahara
d29a384da6 chore(braintrust): Removing indexing_pipeline logs (#7957) 2026-01-28 22:25:33 +00:00
Jamison Lahman
fe7e5d3c55 chore(deps): add pytest-repeat to dev (#7956) 2026-01-28 22:10:49 +00:00
dependabot[bot]
91185f80c4 chore(deps): bump j178/prek-action from 1.0.11 to 1.0.12 (#7529)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-28 14:19:33 -08:00
dependabot[bot]
1244df1176 chore(deps): bump next from 16.1.2 to 16.1.5 in /examples/widget (#7885)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-28 14:12:11 -08:00
dependabot[bot]
080e58d875 chore(deps): bump pypdf from 6.6.0 to 6.6.2 (#7834)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-28 14:11:47 -08:00
roshan
420f46ce48 chore(craft): more craft logging (#7954)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-01-28 14:11:04 -08:00
dependabot[bot]
50835b4fd0 chore(deps): bump hono from 4.11.5 to 4.11.7 in /backend/onyx/server/features/build/sandbox/kubernetes/docker/templates/outputs/web (#7880)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-28 14:10:38 -08:00
dependabot[bot]
b08a3f2195 chore(deps): bump next from 16.1.4 to 16.1.5 in /backend/onyx/server/features/build/sandbox/kubernetes/docker/templates/outputs/web (#7887)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-28 14:10:25 -08:00
dependabot[bot]
dbf0c10632 chore(deps): bump next from 16.0.10 to 16.1.5 in /web (#7882)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-28 21:48:33 +00:00
Jamison Lahman
04433f8d44 chore(hygiene): remove linux kernel (#7953) 2026-01-28 21:31:22 +00:00
Raunak Bhagat
e426ca627f refactor: rename /chat route to /app (#7711)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-28 21:03:04 +00:00
roshan
6c9651eb97 feat(craft): onyx craft upsell upgrade modal when you run out of free messages (#7943) 2026-01-28 20:55:57 +00:00
roshan
02140eed98 fix(craft): hide session limit (#7947) 2026-01-28 20:55:47 +00:00
Jamison Lahman
93f316fa8a chore(devtools): upgrade ods: v0.4.0->v0.4.1 (#7952) 2026-01-28 20:39:03 +00:00
Wenxi
e02a60ddc7 fix: exceptions trace modal (#7951) 2026-01-28 20:25:45 +00:00
Raunak Bhagat
aa413e93d1 refactor: New sections/cards directory to host all feature-specific cards. (#7949) 2026-01-28 20:23:50 +00:00
roshan
2749e9dd6d fix(craft): install script for craft will force pull latest image for any craft-* image tags (#7950) 2026-01-28 20:08:42 +00:00
Jamison Lahman
decca26a71 chore(devtools): ods cherry-pick QOL (#7708) 2026-01-28 19:03:54 +00:00
Justin Tahara
1c490735b1 chore(api): Cleanup (#7945) 2026-01-28 18:51:31 +00:00
Yuhong Sun
87da107a03 fix: Cloud Embedding Keys (#7944) 2026-01-28 18:31:08 +00:00
Evan Lohn
f8b56098cc feat(filesys): hierarchynodes carry permission info (#7669) 2026-01-28 09:12:47 +00:00
Evan Lohn
a3a43173f7 feat(filesys): drive hierarchynodes (#7560) 2026-01-28 08:15:35 +00:00
Evan Lohn
aea924119d feat(filesys): hierarchyfetching task impl (#7557) 2026-01-28 06:40:41 +00:00
Chris Weaver
a79e581465 fix: attachment prompt tweak (#7929) 2026-01-27 22:44:43 -08:00
Chris Weaver
6a02ff9922 fix: kubernetes freezing (#7928) 2026-01-27 22:32:07 -08:00
Wenxi
71b8746a34 fix: z index for output panel (#7926) 2026-01-27 22:12:36 -08:00
Evan Lohn
7080b3d966 feat(filesys): creation of hierarchyfetching job (#7555) 2026-01-28 06:03:15 +00:00
Wenxi
adc3c86b16 feat(craft): allow closing LLM setup modal (#7925) 2026-01-28 05:58:09 +00:00
roshan
b110621b13 fix(craft): install script for craft-latest image (#7918) 2026-01-27 21:40:30 -08:00
Evan Lohn
a2dc752d14 feat(filesys): implement hierarchy injection into vector db chunks (#7548) 2026-01-28 05:29:15 +00:00
Wenxi
f7d47a6ca9 refactor: build/v1 to craft/v1 (#7924) 2026-01-28 05:07:50 +00:00
roshan
9cc71b71ee fix(craft): allow more lenient tag names (for versioning) (#7921) 2026-01-27 21:13:35 -08:00
Wenxi
f2bafd113a refactor: packet type processing and path sanitization (#7920) 2026-01-28 04:33:54 +00:00
roshan
bb00ebd4a8 fix(craft): block opencode.json read (#7846) 2026-01-28 04:29:07 +00:00
Evan Lohn
fda04aa6d2 feat(filesys): opensearch integration with hierarchy (#7429) 2026-01-28 04:04:30 +00:00
Yuhong Sun
285aae6f2f chore: README (#7919) 2026-01-27 19:45:13 -08:00
Yuhong Sun
b75b1019f3 chore: kg stuff in celery (#7908) 2026-01-28 03:36:31 +00:00
Evan Lohn
bbba32b989 feat(filesys): connect hierarchynode and assistant (#7428) 2026-01-28 03:28:47 +00:00
joachim-danswer
f06bf69956 fix(craft): new demo data & change of eng IC demo persona (#7917) 2026-01-28 03:10:54 +00:00
roshan
7d4fe480cc fix(craft): files directory works locally + kube (#7913) 2026-01-27 19:01:08 -08:00
Chris Weaver
7f5b512856 feat: craft ui improvements (#7916) 2026-01-28 02:52:39 +00:00
Wenxi
844a01f751 fix(craft): allow initializing non-visible models (#7915) 2026-01-28 02:49:51 +00:00
Evan Lohn
d64be385db feat(filesys): Connectors know about hierarchynodes (#7404) 2026-01-28 02:39:43 +00:00
roshan
d0518388d6 feat(craft): update github action for craft latest (#7910)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-01-27 18:45:44 -08:00
Justin Tahara
a7f6d5f535 chore(tracing): Adding more explicit Tracing to our callsites (#7911) 2026-01-28 01:44:09 +00:00
Wenxi
059e2869e6 feat: md preview scrollbar (#7909) 2026-01-28 01:35:43 +00:00
Chris Weaver
04d90fd496 fix: improve session recovery (#7912) 2026-01-28 01:30:49 +00:00
Nikolas Garza
7cd29f4892 feat(ee): improve license enforcement middleware (#7853) 2026-01-28 01:26:02 +00:00
roshan
c2b86efebf fix(craft): delete session ui (#7847) 2026-01-27 17:30:35 -08:00
Nikolas Garza
bc5835967e feat(ee): Add unified billing API (#7857) 2026-01-27 17:02:08 -08:00
Evan Lohn
c2b11cae01 feat(filesys): data models and migration (#7402) 2026-01-28 00:03:52 +00:00
Chris Weaver
cf17ba6a1c fix: db connection closed for craft (#7905) 2026-01-27 15:46:46 -08:00
Jamison Lahman
b03634ecaa chore(mypy): fix mypy cache issues switching between HEAD and release (#7732) 2026-01-27 23:29:51 +00:00
Wenxi
9a7e92464f fix: demo data toggle race condition (#7902) 2026-01-27 23:06:17 +00:00
Wenxi
09b2a69c82 chore: remove pyproject config for pypandoc mypy (#7894) 2026-01-27 22:31:41 +00:00
Jamison Lahman
c5c027c168 fix: sidebar items are title case (#7893) 2026-01-27 22:05:06 +00:00
Wenxi
882163a4ea feat: md rendering, docx conversion and download, output panel refresh refactor for all artifacts (#7892) 2026-01-27 21:58:06 +00:00
roshan
de83a9a6f0 feat(craft): better output formats (#7889) 2026-01-27 21:48:08 +00:00
Jamison Lahman
f73ce0632f fix(citations): enable citation sidebar w/ web_search-only assistants (#7888) 2026-01-27 20:55:12 +00:00
Justin Tahara
0b10b11af3 fix(redis): Adding more TTLs (#7886) 2026-01-27 20:31:54 +00:00
roshan
d9e3b657d0 fix(craft): only include org_info/ when demo data enabled (#7845) 2026-01-27 19:48:48 +00:00
Justin Tahara
f6e9928dc1 fix(llm): Hide private models from Agent Creation (#7873) 2026-01-27 19:44:13 +00:00
Justin Tahara
ca3179ad8d chore(pr): Add Cherry-pick check (#7805) 2026-01-27 19:31:10 +00:00
Nikolas Garza
5529829ff5 feat(ee): update api to claim license via cloud proxy (#7840) 2026-01-27 18:46:39 +00:00
Chris Weaver
bdc7f6c100 chore: specify sandbox version (#7870) 2026-01-27 10:49:39 -08:00
Wenxi
90f8656afa fix: connector details back button should nav back (#7869) 2026-01-27 18:36:41 +00:00
Wenxi
3c7d35a6e8 fix: remove posthog debug logs and adjust gitignore (#7868) 2026-01-27 18:36:14 +00:00
Nikolas Garza
40d58a37e3 feat(ee): enforce seat limits on user operations (#7504) 2026-01-27 18:12:09 +00:00
Justin Tahara
be3ecd9640 fix(helm): Updating Ingress Templates (#7864) 2026-01-27 17:21:01 +00:00
Chris Weaver
a6da511490 fix: pass in correct region to allow IRSA usage (#7865) 2026-01-27 17:20:25 +00:00
roshan
c7577ebe58 fix(craft): only insert onyx user context when demo data not enabled (#7841) 2026-01-27 17:13:33 +00:00
SubashMohan
b87078a4f5 feat(chat): Search over chats and projects (#7788) 2026-01-27 16:57:00 +00:00
Yuhong Sun
8a408e7023 fix: Project Creation (#7851) 2026-01-27 05:27:19 +00:00
Nikolas Garza
4c7b73a355 feat(ee): add proxy endpoints for self-hosted billing operations (#7819) 2026-01-27 03:57:04 +00:00
Wenxi
8e9cb94d4f fix: processing mode enum (#7849) 2026-01-26 19:09:04 -08:00
Wenxi
a21af4b906 fix: type ignore unrelated mypy for onyx craft head (#7843) 2026-01-26 18:26:53 -08:00
Chris Weaver
7f0ce0531f feat: Onyx Craft (#7484)
Co-authored-by: Wenxi <wenxi@onyx.app>
Co-authored-by: joachim-danswer <joachim@danswer.ai>
Co-authored-by: rohoswagger <roshan@onyx.app>
2026-01-26 17:12:42 -08:00
acaprau
b631bfa656 feat(opensearch): Add separate index settings for AWS-managed OpenSearch; Add function for disabling index auto-creation (#7814) 2026-01-27 00:40:46 +00:00
Nikolas Garza
eca6b6bef2 feat(ee): add license public key file and improve signature verification (#7806) 2026-01-26 23:44:16 +00:00
Wenxi
51ef28305d fix: user count check (#7811) 2026-01-26 13:21:33 -08:00
Jamison Lahman
144030c5ca chore(vscode): add non-clean seeded db restore (#7795) 2026-01-26 08:55:19 -08:00
SubashMohan
a557d76041 feat(ui): add new icons and enhance FadeDiv, Modal, Tabs, ExpandableTextDisplay (#7563)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-26 10:26:09 +00:00
SubashMohan
605e808158 fix(layout): adjust footer margin and prevent page refresh on chatsession drop (#7759) 2026-01-26 04:45:40 +00:00
roshan
8fec88c90d chore(deployment): remove no auth option from setup script (#7784) 2026-01-26 04:42:45 +00:00
Yuhong Sun
e54969a693 fix: LiteLLM Azure models don't stream (#7761) 2026-01-25 07:46:51 +00:00
Raunak Bhagat
1da2b2f28f fix: Some new fixes that were discovered by AI reviewers during 2.9-hotfixing (#7757)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-25 04:44:30 +00:00
Nikolas Garza
eb7b91e08e fix(tests): use crawler-friendly search query in Exa integration test (#7746) 2026-01-24 20:58:02 +00:00
Yuhong Sun
3339000968 fix: Spacing issue on Feedback (#7747) 2026-01-24 12:59:00 -08:00
Nikolas Garza
d9db849e94 fix(chat): prevent streaming text from appearing in bursts after citations (#7745) 2026-01-24 11:48:34 -08:00
Yuhong Sun
046408359c fix: Azure OpenAI Tool Calls (#7727) 2026-01-24 01:47:03 +00:00
acaprau
4b8cca190f feat(opensearch): Implement complete retrieval filtering (#7691) 2026-01-23 23:27:42 +00:00
Justin Tahara
52a312a63b feat: onyx discord bot - supervisord and kube deployment (#7706) 2026-01-23 20:55:06 +00:00
Danelegend
0594fd17de chore(tests): add more packet tests (#7677) 2026-01-23 19:49:41 +00:00
Jamison Lahman
fded81dc28 chore(extensions): pull in chrome extension (#7703) 2026-01-23 10:17:05 -08:00
Danelegend
31db112de9 feat(url): Open url around snippet (#7488) 2026-01-23 17:02:38 +00:00
Jamison Lahman
a3e2da2c51 chore(vscode): add useful database operations (#7702) 2026-01-23 08:49:59 -08:00
Evan Lohn
f4d33bcc0d feat: basic user MCP action attaching (#7681) 2026-01-23 05:50:49 +00:00
Jamison Lahman
464d957494 chore(devtools): upgrade ods v0.4.0; vscode to restore seeded db (#7696) 2026-01-23 05:21:46 +00:00
Jamison Lahman
be12de9a44 chore(devtools): ods db restore --fetch-seeded (#7689) 2026-01-22 20:41:28 -08:00
Yuhong Sun
3e4a1f8a09 feat: Maintain correct docs on replay (#7683) 2026-01-22 19:24:10 -08:00
Raunak Bhagat
af9b7826ab fix: Remove cursor pointer from view-only field (#7688)
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-01-23 02:47:08 +00:00
Danelegend
cb16eb13fc chore(tests): Mock LLM (#7590) 2026-01-23 01:48:54 +00:00
Jamison Lahman
20a73bdd2e chore(desktop): make artifact filename version-agnostic (#7679) 2026-01-22 15:15:52 -08:00
Justin Tahara
85cc2b99b7 fix(fastapi): Resolve CVE-2025-68481 (#7661) 2026-01-22 20:07:25 +00:00
Jamison Lahman
1208a3ee2b chore(fe): disable blur when there is not a custom background (#7673)
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-01-22 11:26:16 -08:00
Justin Tahara
900fcef9dd feat(desktop): Domain Configuration (#7655) 2026-01-22 18:15:44 +00:00
Justin Tahara
d4ed25753b fix(ui): Coda Logo (#7656) 2026-01-22 10:10:02 -08:00
Justin Tahara
0ee58333b4 fix(ui): User Groups Connectors Fix (#7658) 2026-01-22 17:59:12 +00:00
Justin Tahara
11b7e0d571 fix(ui): First Connector Result (#7657) 2026-01-22 17:52:02 +00:00
acaprau
a35831f328 fix(opensearch): Release Onyx Helm Charts was failing (#7672) 2026-01-22 17:41:47 +00:00
Justin Tahara
048a6d5259 fix(ui): Fix Token Rate Limits Page (#7659) 2026-01-22 17:20:21 +00:00
Ciaran Sweet
e4bdb15910 docs: enhance send-chat-message docs to also show ChatFullResponse (#7430) 2026-01-22 16:48:26 +00:00
Jamison Lahman
3517d59286 chore(fe): add custom backgrounds to the settings page (#7668) 2026-01-21 21:32:56 -08:00
Jamison Lahman
4bc08e5d88 chore(fe): remove Text pseudo-element padding (#7665) 2026-01-21 19:50:42 -08:00
Yuhong Sun
4bd080cf62 chore: Redirect user to create account (#7654) 2026-01-22 02:44:58 +00:00
Raunak Bhagat
b0a8625ffc feat: Add confirmation modal for connector disconnect (#7637)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-22 02:08:19 +00:00
Yuhong Sun
f94baf6143 fix: DR Language Tuning (#7660) 2026-01-21 17:36:50 -08:00
Wenxi
9e1867638a feat: onyx discord bot - frontend (#7497) 2026-01-22 00:00:12 +00:00
Yuhong Sun
5b6d7c9f0d chore: Onboarding Image Generation (#7653) 2026-01-21 15:49:15 -08:00
Danelegend
e5dcf31f10 fix(image): Emit error to user (#7644) 2026-01-21 22:50:12 +00:00
Nikolas Garza
8ca06ef3e7 fix: deflake chat user journey test (#7646) 2026-01-21 22:33:30 +00:00
Justin Tahara
6897dbd610 feat(desktop): Properly Sign Mac App (#7608) 2026-01-21 22:17:45 +00:00
Evan Lohn
7f3cb77466 chore: remove prompt caching from chat history (#7636) 2026-01-21 21:35:11 +00:00
acaprau
267042a5aa fix(opensearch): Use the same method for getting title that the title embedding logic uses; small cleanup for content embedding (#7638) 2026-01-21 21:34:38 +00:00
Yuhong Sun
d02b3ae6ac chore: Remove default prompt shortcuts (#7639) 2026-01-21 21:28:53 +00:00
Yuhong Sun
683c3f7a7e fix: color mode and memories (#7642) 2026-01-21 13:29:33 -08:00
Nikolas Garza
008b4d2288 fix(slack): Extract person names and filter garbage in query expansion (#7632) 2026-01-21 21:09:50 +00:00
Jamison Lahman
8be261405a chore(deployments): fix region (#7640) 2026-01-21 13:14:42 -08:00
acaprau
61f2c48ebc feat(opensearch): Add helm charts (#7606) 2026-01-21 19:34:18 +00:00
acaprau
dbde2e6d6d chore(opensearch): Create OpenSearch docker compose, enabling test_opensearch_client.py to run in CI (#7611) 2026-01-21 18:41:23 +00:00
Raunak Bhagat
2860136214 feat: Refreshed user settings page (#7455)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 16:41:56 +00:00
Raunak Bhagat
49ec5994d3 refactor: Improve refresh-components with cleanup and truncation (#7622)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 00:29:25 -08:00
Raunak Bhagat
8d5fb67f0f feat: improve prompt shortcuts with uniqueness constraints and enhancements (#7619)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 07:31:35 +00:00
Raunak Bhagat
15d02f6e3c fix: Prevent description duplication in Modal header (#7609)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 04:32:22 +00:00
Jamison Lahman
e58974c419 chore(fe): move chatpage footer inside background element (#7618) 2026-01-21 04:21:49 +00:00
Yuhong Sun
6b66c07952 chore: Delete multilingual docker compose file (#7616) 2026-01-20 19:50:01 -08:00
Jamison Lahman
cae058a3ac chore(extensions): simplify and de-dupe NRFPage (#7607) 2026-01-21 03:42:19 +00:00
Nikolas Garza
aa3b21a191 fix: scroll to bottom when loading existing conversations (#7614) 2026-01-20 19:19:18 -08:00
Raunak Bhagat
7a07a78696 fix: Set width to fit for rightChildren section in LineItem (#7604)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 01:55:03 +00:00
Nikolas Garza
a8db236e37 feat(billing): fetch Stripe publishable key from S3 (#7595) 2026-01-21 01:32:57 +00:00
Raunak Bhagat
8a2e4ed36f fix: Fix flashing in progress-circle icon (#7605)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 01:03:52 +00:00
Evan Lohn
216f2c95a7 chore: add dialog description to modal (#7603) 2026-01-21 00:41:35 +00:00
Evan Lohn
67081efe08 fix: modal header in index attempt errors (#7601) 2026-01-21 00:37:23 +00:00
Yuhong Sun
9d40b8336f feat: Allow no system prompt (#7600) 2026-01-20 16:16:39 -08:00
Evan Lohn
23f0033302 chore: bg services launch.json (#7597) 2026-01-21 00:05:20 +00:00
Raunak Bhagat
9011b76eb0 refactor: Add new layout component (#7588)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 23:36:18 +00:00
Yuhong Sun
45e436bafc fix: prompt tunings (#7594) 2026-01-20 15:13:05 -08:00
Justin Tahara
010bc36d61 Revert "chore(deps): Bump fastapi-users from 14.0.1 to 15.0.2 in /backend/requirements" (#7593) 2026-01-20 14:44:21 -08:00
dependabot[bot]
468e488bdb chore(deps): bump docker/setup-buildx-action from 3.11.1 to 3.12.0 (#7527)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-20 22:36:39 +00:00
dependabot[bot]
9104c0ffce chore(deps): Bump fastapi-users from 14.0.1 to 15.0.2 in /backend/requirements (#6897)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: justin-tahara <justintahara@gmail.com>
2026-01-20 22:31:02 +00:00
Jamison Lahman
d36a6bd0b4 feat(fe): custom chat backgrounds (#7486)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-01-20 14:29:06 -08:00
Jamison Lahman
a3603c498c chore(deployments): fetch secrets from AWS (#7584) 2026-01-20 22:10:19 +00:00
Jamison Lahman
8f274e34c9 chore(blame): unignore checked in .vscode/ files (#7592) 2026-01-20 14:07:27 -08:00
Justin Tahara
5c256760ff fix(vertex ai): Extra Args for Opus 4.5 (#7586) 2026-01-20 14:07:14 -08:00
Nikolas Garza
258e1372b3 fix(billing): remove grandfathered pricing option when subscription lapses (#7583) 2026-01-20 21:55:37 +00:00
Yuhong Sun
83a543a265 chore: NLTK and stopwords (#7587) 2026-01-20 13:36:04 -08:00
Evan Lohn
f9719d199d fix: drive connector creation ui (#7578) 2026-01-20 21:10:06 +00:00
Raunak Bhagat
1c7bb6e56a fix: Input variant refactor (#7579)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 13:04:16 -08:00
acaprau
982ad7d329 feat(opensearch): Add dual document indices (#7539) 2026-01-20 20:53:24 +00:00
Jamison Lahman
f94292808b chore(vscode): launch.template.jsonc -> launch.json (#7440) 2026-01-20 20:32:46 +00:00
Justin Tahara
293553a2e2 fix(tests): Anthropic Prompt Caching Test (#7585) 2026-01-20 20:32:24 +00:00
Justin Tahara
ba906ae6fa chore(llm): Removing Claude Haiku 3.5 (#7577) 2026-01-20 19:06:14 +00:00
Raunak Bhagat
c84c7a354e refactor: refactor to use string-enum props instead of boolean props (#7575)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 18:59:54 +00:00
Jamison Lahman
2187b0dd82 chore(pre-commit): disallow large files (#7576) 2026-01-20 11:02:00 -08:00
acaprau
d88a417bf9 feat(opensearch): Formally disable secondary indices in the backend (#7541) 2026-01-20 18:21:47 +00:00
Jamison Lahman
f2d32b0b3b fix(fe): inline code text wraps (#7574) 2026-01-20 17:11:42 +00:00
Nikolas Garza
f89432009f fix(fe): show scroll-down button when user scrolls up during streaming (#7562) 2026-01-20 07:07:55 +00:00
Jamison Lahman
8ab2bab34e chore(fe): fix sticky header parent height (#7561) 2026-01-20 06:18:32 +00:00
Jamison Lahman
59e0d62512 chore(fe): align assistant icon with chat bar (#7537) 2026-01-19 19:47:18 -08:00
Jamison Lahman
a1471b16a5 fix(fe): chat header is sticky and transparent (#7487) 2026-01-19 19:20:03 -08:00
Yuhong Sun
9d3811cb58 fix: prompt tuning (#7550) 2026-01-19 19:04:18 -08:00
Yuhong Sun
3cd9505383 feat: Memory initial (#7547) 2026-01-19 18:57:13 -08:00
Nikolas Garza
d11829b393 refactor: proxy customer portal session through control plane (#7544) 2026-01-20 01:24:30 +00:00
Nikolas Garza
f6e068e914 feat(billing): add annual pricing support to subscription checkout (#7506) 2026-01-20 00:17:18 +00:00
roshan
0c84edd980 feat: onyx embeddable widget (#7427)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-01-20 00:01:10 +00:00
Wenxi
2b274a7683 feat: onyx discord bot - discord client (#7496) 2026-01-20 00:00:20 +00:00
Wenxi
ddd91f2d71 feat: onyx discord bot - api client and cache manager (#7495) 2026-01-19 23:15:17 +00:00
Yuhong Sun
a7c7da0dfc fix: tool call handling for weak models (#7538) 2026-01-19 13:37:00 -08:00
Evan Lohn
b00a3e8b5d fix(test): confluence group sync (#7536) 2026-01-19 21:20:48 +00:00
Raunak Bhagat
d77d1a48f1 fix: Line item fixes (#7513) 2026-01-19 20:25:35 +00:00
Raunak Bhagat
7b4fc6729c fix: Popover size fix (#7521) 2026-01-19 18:44:29 +00:00
Nikolas Garza
1f113c86ef feat(ee): license enforcement middleware (#7483) 2026-01-19 18:03:39 +00:00
Raunak Bhagat
8e38ba3e21 refactor: Fix some onboarding inaccuracies (#7511) 2026-01-19 04:33:27 +00:00
Raunak Bhagat
bb9708a64f refactor: Small styling / prop-naming refactors (#7503) 2026-01-19 02:49:27 +00:00
Raunak Bhagat
8cae97e145 fix: Fix connector-setup modal (#7502) 2026-01-19 00:29:36 +00:00
Wenxi
7e4abca224 feat: onyx discord bot - backend, crud, and apis (#7494) 2026-01-18 23:13:58 +00:00
Yuhong Sun
233a91ea65 chore: drop dead table (#7500) 2026-01-17 20:05:22 -08:00
Yuhong Sun
b30737b6b2 fix: memory leak possibility (#7493) 2026-01-18 02:00:09 +00:00
Yuhong Sun
caf8b85ec2 feat: LLM filter on query endpoint (#7492) 2026-01-17 15:56:07 -08:00
Yuhong Sun
1d13580b63 feat: Keyword Expansions (#7485) 2026-01-17 02:08:53 +00:00
acaprau
00390c53e0 fix(vespa): Make ID retrieval always check for tenant ID; Add additional tenant ID checks in the new interface (#7480) 2026-01-17 01:58:13 +00:00
Raunak Bhagat
66656df9e6 refactor: Layout fixes (#7475) 2026-01-17 01:49:45 +00:00
Jamison Lahman
51d26d7e4c chore(git): git rm plans/ -r (#7482) 2026-01-16 17:03:32 -08:00
Yuhong Sun
198ac8ccbc feat: Doc search optionally returns contents (#7481) 2026-01-16 16:33:01 -08:00
Jamison Lahman
ee6d33f484 refactor(fe): remove redundant as="span" usage (#7479) 2026-01-16 23:57:39 +00:00
Danelegend
7bcb72d055 feat(image-gen): nano banana addition on fe (#7375) 2026-01-16 23:48:43 +00:00
Danelegend
876894e097 feat(img-gen): Add nanobanana to backend (#7403) 2026-01-16 23:35:15 +00:00
Yuhong Sun
7215f56b25 chore: reenable some tests (#7476) 2026-01-16 15:26:18 -08:00
dependabot[bot]
0fd1c34014 chore(deps): bump distributed from 2025.11.0 to 2026.1.1 in /backend/requirements (#7462)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-16 22:08:04 +00:00
Jamison Lahman
9e24b41b7b fix(db): ensure migrations are atomic (#7474) 2026-01-16 21:40:19 +00:00
Jamison Lahman
ab3853578b chore(fe): fix WelcomeMessage hydration issue (#7473) 2026-01-16 20:25:48 +00:00
dependabot[bot]
7db969d36a chore(deps): bump pyasn1 from 0.6.1 to 0.6.2 (#7472)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-16 20:19:32 +00:00
Nikolas Garza
6cdeb71656 fix(playwright): waitFor instead of waitForSelector for action popover button (#7464) 2026-01-16 20:08:29 +00:00
Yuhong Sun
2c4b2c68b4 enhancement: prompt tuning (#7469) 2026-01-16 11:50:59 -08:00
Yuhong Sun
5301ee7cef Contribution Guidelines (#7468) 2026-01-16 11:24:09 -08:00
Wenxi
f8e6716875 feat: override tenant usage limits for dev mode (#7463) 2026-01-16 18:09:44 +00:00
Wenxi
755c65fd8a feat: url builder for api server http requests (#7442) 2026-01-16 17:52:47 +00:00
Wenxi
90cf5f49e3 fix: delete old notifications first in migration (#7454) 2026-01-16 17:52:10 +00:00
Nikolas Garza
d4068c2b07 fix: improve scroll behavior (#7364) 2026-01-16 16:32:09 +00:00
dependabot[bot]
fd6fa43fe1 chore(deps): bump langchain-text-splitters from 0.3.8 to 0.3.9 (#7459)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-16 10:31:39 +00:00
dependabot[bot]
8d5013bf01 chore(deps): bump langchain-core from 0.3.51 to 0.3.81 (#7456)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-16 10:06:08 +00:00
dependabot[bot]
dabd7c6263 chore(deps-dev): Bump storybook from 8.6.14 to 8.6.15 in /web (#6847)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-16 10:00:52 +00:00
dependabot[bot]
c8c0389675 chore(deps-dev): bump js-yaml from 3.14.1 to 3.14.2 in /web (#7458)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-16 09:59:28 +00:00
dependabot[bot]
9cfcfb12e1 chore(deps): remove diff and npm in /web (#7422)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-16 09:30:03 +00:00
Jamison Lahman
786a0c2bd0 chore(deps): upgrade widget deps (#7457) 2026-01-16 01:02:51 -08:00
dependabot[bot]
0cd8d3402b chore(deps): bump torch from 2.6.0 to 2.9.1 in /backend/requirements (#5667)
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-16 08:41:58 +00:00
Raunak Bhagat
3fa397b24d fix: Fix notifications popover that broke with some modal/popover changes (#7453) 2026-01-16 03:48:40 +00:00
acaprau
e0a97230b8 feat(opensearch): Fix some stuff around metadata to improve code and match what we store in Vespa (#7448) 2026-01-16 03:46:22 +00:00
Raunak Bhagat
7f1272117a fix: Update modal sizings (#7452) 2026-01-16 03:12:20 +00:00
Evan Lohn
79302f19be fix: bedrock non-anthropic prompt caching (#7435) 2026-01-16 02:02:41 +00:00
Raunak Bhagat
4a91e644d4 refactor: User settings hooks (#7445) 2026-01-16 01:41:04 +00:00
Jamison Lahman
ca0318f16e fix(fe): assistant icon is inline with chat (#7449) 2026-01-16 01:40:54 +00:00
Jamison Lahman
be8e0b3a98 refactor(fe): simplify AIMessage render (#7447) 2026-01-16 01:02:15 +00:00
Raunak Bhagat
49c4814c70 fix: Fix invite buttons (#7444)
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-16 00:13:00 +00:00
Yuhong Sun
2f945613a2 feat: Backend Search APIs (#7431)
Co-authored-by: acaprau <48705707+acaprau@users.noreply.github.com>
2026-01-15 23:53:56 +00:00
acaprau
e9242ca3a8 feat(opensearch): Implement match highlighting (#7437) 2026-01-15 23:05:07 +00:00
Jamison Lahman
a150de761a chore(devtools): upgrade ods -> v0.3.2 (#7438) 2026-01-15 12:36:06 -08:00
Jamison Lahman
0e792ca6c9 chore(devtools): fix ods README typo (#7441) 2026-01-15 12:27:17 -08:00
Jamison Lahman
6be467a4ac chore(devtools): #7432 follow ups (#7436) 2026-01-15 11:50:11 -08:00
Jamison Lahman
dd91bfcfe6 chore(devtools): ods run-ci (#7432) 2026-01-15 11:10:24 -08:00
SubashMohan
8a72291781 feat(chat): enable Slack federated search based on user preference (#7355) 2026-01-15 17:47:48 +00:00
roshan
b2d71da4eb feat(citations): Add include_citations parameter to control citation processing (#7412)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-01-15 17:09:59 +00:00
Jamison Lahman
6e2f851c62 chore(tests): fix nightly model-server tests (#7421) 2026-01-15 08:08:14 -08:00
Yuhong Sun
be078edcb4 feat: Search Backend (#7426) 2026-01-15 02:22:30 +00:00
acaprau
194c54aca3 feat(opensearch): Propagate search scores (#7425) 2026-01-15 01:44:15 +00:00
Raunak Bhagat
9fa7221e24 feat: Agent deletion (#7361)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 01:15:58 +00:00
Raunak Bhagat
3a5c7ef8ee feat: Agent sharing (#7359)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-15 00:42:14 +00:00
Evan Lohn
84458aa0bf chore: default usage limits off2 (#7424) 2026-01-14 23:54:03 +00:00
Danelegend
de57bfa35f refactor(img-gen): encapsulate provider quirks (#7386) 2026-01-14 23:19:08 +00:00
Yuhong Sun
386f8f31ed chore: Turn off reasoning for chat naming (#7423) 2026-01-14 14:06:04 -08:00
Evan Lohn
376f04caea chore: usage limit defaults to off (#7420) 2026-01-14 21:05:51 +00:00
Raunak Bhagat
4b0a3c2b04 fix: Agent editor fix (#7419) 2026-01-14 20:38:11 +00:00
Yuhong Sun
1bd9f9d9a6 chore: Cleanup dead code (#7418) 2026-01-14 20:05:41 +00:00
acaprau
4ac10abaea feat(OpenSearch): Implement update (#7416) 2026-01-14 20:00:08 +00:00
Raunak Bhagat
a66a283af4 fix: Fix small UI rendering bugs in AgentEditorPage (#7417) 2026-01-14 19:52:14 +00:00
Yuhong Sun
bf5da04166 fix: Chat naming for long messages (#7415) 2026-01-14 19:51:10 +00:00
roshan
693487f855 feat(mcp): add support for passing custom headers through send-chat-message API (#7390) 2026-01-14 19:36:49 +00:00
Jamison Lahman
d02a76d7d1 chore(docs): fix is_creation description (#7414) 2026-01-14 19:34:58 +00:00
Danelegend
28e05c6e90 refactor(llm): replace credential_file w/ custom_config in llmconfig (#7401) 2026-01-14 17:52:38 +00:00
Danelegend
a18f546921 fix(chat): Internal search enablement matches source enablement (#7338) 2026-01-14 17:20:38 +00:00
Yuhong Sun
e98dea149e feat: Deep Research Multilingual (#7405) 2026-01-14 05:13:15 +00:00
Yuhong Sun
027c165794 chore: Refactor pre search UI backend (#7399)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-14 03:08:48 +00:00
Nikolas Garza
14ebe912c8 feat(tools): auto-pin internal search when sources change (#7376) 2026-01-14 02:48:51 +00:00
Evan Lohn
a63b906789 fix(mcp): per-user auth (#7400) 2026-01-14 02:01:47 +00:00
Yuhong Sun
92a68a3c22 fix: LLM failing to give answer on tool call (#7398) 2026-01-14 00:28:01 +00:00
Chris Weaver
95db4ed9c7 feat: add back indexed slack (#7392) 2026-01-14 00:06:35 +00:00
Yuhong Sun
5134d60d48 fix: _url_lookup_variants swallows all non-url document ids (#7387) 2026-01-13 23:38:29 +00:00
Evan Lohn
651a54470d fix: prevent updates from overwriting perm syncing (#7384) 2026-01-13 23:36:01 +00:00
dependabot[bot]
269d243b67 chore(deps): Bump pandas from 2.2.3 to 2.3.3 in /backend (#6670)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-13 22:33:53 +00:00
dependabot[bot]
0286dd7da9 chore(deps): Bump dask from 2023.8.1 to 2025.11.0 in /backend (#6671)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-13 22:15:33 +00:00
dependabot[bot]
f3a0710d69 chore(deps): Bump docker/metadata-action from 5.9.0 to 5.10.0 (#6669)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-13 14:09:33 -08:00
Jamison Lahman
037c2aee3a chore(playwright): skip dall-e test (#7395) 2026-01-13 13:58:20 -08:00
dependabot[bot]
9b2f3d234d chore(deps): bump filelock from 3.20.1 to 3.20.3 in /backend/requirements (#7389)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-13 13:22:07 -08:00
Jamison Lahman
7646399cd4 revert: "feat: Enable triple click on content in the chat" (#7393) 2026-01-13 13:21:30 -08:00
dependabot[bot]
d913b93d10 chore(deps-dev): bump virtualenv from 20.35.4 to 20.36.1 in /backend/requirements (#7388)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-13 20:41:37 +00:00
Raunak Bhagat
8a0ce4c294 feat: Feedback modals update (#7380) 2026-01-13 19:48:45 +00:00
Wenxi
862c140763 chore: move public tag constant and tag ingestion api (#7383) 2026-01-13 19:36:05 +00:00
Jamison Lahman
47487f1940 chore(fe): fix undefined className in tooltip (#7324) 2026-01-13 19:19:16 +00:00
Jamison Lahman
e3471df940 chore(devtools): upgrade ods to v0.2.2 (#7282) 2026-01-13 11:22:09 -08:00
acaprau
fb33c815b3 feat(opensearch): Refactor and implement chunk content enrichment and cleanup (#7385) 2026-01-13 19:04:49 +00:00
Jamison Lahman
5c6594be73 chore(pre-commit): run npm install after web/package.json changes (#7382) 2026-01-13 18:35:49 +00:00
SubashMohan
8d30a03d7f fix(chat): prevent adding chat sessions to recents that belong to a project (#7377) 2026-01-13 17:57:29 +00:00
Raunak Bhagat
277428f579 refactor: consolidate tabs components into single Tabs.tsx (#7370) 2026-01-13 03:51:48 +00:00
acaprau
9f8c0d4237 feat(opensearch): Even more feature parity, more strict tenant ID checks, OpenSearch client test improvements (#7372)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-01-13 03:39:02 +00:00
Jessica Singh
9ccbb6a04b feat(web search): exa crawler (#7326) 2026-01-13 01:42:16 +00:00
Danelegend
58a943f782 fix(tools): Tool name should align with what llm knows (#7352) 2026-01-13 01:04:20 +00:00
roshan
9021c607f2 chore(dr): finer grained tracing for clarification step, research plan step, and orchestration step (#7374)
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-01-12 23:58:27 +00:00
Jamison Lahman
c03b0d80fd chore(deps): remove requires-python < 3.13 (#7367) 2026-01-12 23:21:02 +00:00
acaprau
fcf0b316a4 feat(opensearch): More feature parity (#7286) 2026-01-12 23:01:55 +00:00
Jamison Lahman
157f672b4b chore(deps): upgrade numpy, unstructured, unstructured-client (#7369) 2026-01-12 22:58:11 +00:00
dependabot[bot]
51b9484b96 chore(deps): bump actions/upload-artifact from 5.0.0 to 6.0.0 (#6964)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-12 21:53:48 +00:00
Danelegend
0c8f55c049 fix(tools): persist enabled tools in ui (#7347) 2026-01-12 21:47:29 +00:00
dependabot[bot]
c7be2571d1 chore(deps): bump tauri-apps/tauri-action from 0.6.0 to 0.6.1 (#7371)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-12 13:48:46 -08:00
dependabot[bot]
4948b6cca9 chore(deps): bump actions/stale from 10.1.0 to 10.1.1 (#6965)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-12 13:12:24 -08:00
Jamison Lahman
638ea5f316 chore(deps): fix uv-lock hook (#7368) 2026-01-12 12:52:17 -08:00
dependabot[bot]
6e3268ca75 chore(deps): bump pypdf from 6.1.3 to 6.6.0 (#7319)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jamison Lahman <jamison@lahman.dev>
2026-01-12 20:36:47 +00:00
Wenxi
d8921df60c fix: onboarding modal styling (#7363) 2026-01-12 20:29:23 +00:00
Yuhong Sun
693d9f5f69 fix: Editing First Message (#7366) 2026-01-12 19:45:01 +00:00
Jamison Lahman
02e17871cc chore(devtools): recommend starting dev dockers with --wait (#7365) 2026-01-12 19:13:00 +00:00
Wenxi
209cfd00b0 fix: only show latest release notification for nightly versions (#7362) 2026-01-12 11:10:28 -08:00
Jessica Singh
cd36baa484 fix(web search): removing site: operator from exa query (#7248) 2026-01-12 18:22:18 +00:00
Raunak Bhagat
c78fe275af refactor: Popover cleanup (#7356) 2026-01-12 12:08:30 +00:00
Raunak Bhagat
c935c4808f fix: More actions cards fixes (#7358) 2026-01-12 03:27:42 -08:00
Raunak Bhagat
4ebcfef541 fix: Fix actions cards (#7357) 2026-01-12 10:57:22 +00:00
SubashMohan
e320ef9d9c Fix/agent creation files (#7346) 2026-01-12 07:00:47 +00:00
Nikolas Garza
9e02438af5 chore: standardize password/secret inputs and update per design docs (#7316) 2026-01-12 06:26:09 +00:00
Danelegend
177e097ddb fix(chat): newly created chats being marked as failed (#7310)
Co-authored-by: Dane Urban <durban@Danes-MacBook-Pro.local>
2026-01-12 02:02:49 +00:00
Wenxi
9ecd47ec31 feat: in app notifications for changelog (#7253) 2026-01-12 01:09:04 +00:00
Nikolas Garza
83f3d29b10 fix: stop federated OAuth modal from appearing permanently after skips (#7351) 2026-01-11 22:20:13 +00:00
Yuhong Sun
12e668cc0f feat: Deep Research Replay (#7340) 2026-01-11 22:17:09 +00:00
SubashMohan
afe8376d5e feat: Exclude image generation providers from LLM fetch in API calls (#7348) 2026-01-11 21:13:25 +00:00
Wenxi
082ef3e096 fix: always start onboarding at first step and track by user (#7315) 2026-01-11 21:03:17 +00:00
Nikolas Garza
cb2951a1c0 perf: switch BeautifulSoup parser from html.parser to lxml for web crawler (#7350) 2026-01-11 20:46:35 +00:00
Corey Auger
eda5598af5 fix: update docs link (#7349)
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2026-01-11 12:44:48 -08:00
Justin Tahara
0bbb4b6988 fix(ui): Action Strikethrough when not configured (#7273) 2026-01-11 11:21:17 +00:00
Jamison Lahman
4768aadb20 refactor(fe): WelcomeMessage nits (#7344) 2026-01-10 22:01:48 -08:00
Jamison Lahman
e05e85e782 fix(fe): "Pick a date range" button wrapping (#7343) 2026-01-10 21:22:20 -08:00
Jamison Lahman
6408f61307 fix(fe): avoid internal table scroll on query history page (#7342) 2026-01-10 20:39:17 -08:00
Jamison Lahman
5a5cd51e4f fix(fe): SidebarTabs are Links (#7341) 2026-01-10 20:01:31 -08:00
Danelegend
7c047c47a0 fix(chat): Chat in-progress messages (#7318)
Co-authored-by: Dane Urban <durban@Danes-MacBook-Pro.local>
2026-01-11 00:29:39 +00:00
Evan Lohn
22138bbb33 fix: vertex prompt caching (#7339)
Co-authored-by: Weves <chrisweaver101@gmail.com>
2026-01-11 00:23:39 +00:00
Chris Weaver
7cff1064a8 chore: reenable auto update test (#7146) 2026-01-10 16:00:48 -08:00
Wenxi
deeb6fdcd2 fix: anonymous users cookie and admin panel config (#7321) 2026-01-10 15:12:27 -08:00
Chris Weaver
3e7f4e0aa5 fix: auto-sync (#7337) 2026-01-10 13:43:40 -08:00
Raunak Bhagat
ac73671e35 refactor: Components updates (#7308) 2026-01-10 06:30:39 +00:00
Raunak Bhagat
3c20d132e0 feat: Modal updates (#7306) 2026-01-10 05:13:09 +00:00
Yuhong Sun
0e3e7eb4a2 feat: Create new chat session button after msg send (#7332)
Co-authored-by: Raunak Bhagat <r@rabh.io>
2026-01-10 04:56:54 +00:00
Yuhong Sun
c85aebe8ab Tables (#7333) 2026-01-09 20:40:15 -08:00
Yuhong Sun
a47e6a3146 feat: Enable triple click on content in the chat (#7331)
Co-authored-by: Raunak Bhagat <r@rabh.io>
2026-01-09 20:37:36 -08:00
Jamison Lahman
1e61737e03 fix(fe): Tags have consistent height on hover (#7328) 2026-01-09 20:20:36 -08:00
Wenxi
c7fc1cd5ae chore: allow tenant cleanup script to skip control plane if tenant not found (#7290) 2026-01-10 00:17:26 +00:00
roshan
e2b60bf67c feat(posthog): track message origin analytics in posthog (#7313) 2026-01-10 00:11:17 +00:00
Danelegend
f4d4d14286 fix(chat): post llm loop callback (#7309)
Co-authored-by: Dane Urban <durban@Danes-MacBook-Pro.local>
2026-01-09 23:53:22 +00:00
Yuhong Sun
1c24bc6ea2 Opensearch README (#7327) 2026-01-09 15:53:22 -08:00
Yuhong Sun
cacbd18dcd feat: Opensearch README (#7325) 2026-01-09 15:28:08 -08:00
Nikolas Garza
8527b83b15 fix(sidebar): Allow unpinning all agents and fix icon flicker (#7241) 2026-01-09 14:20:46 -08:00
Nikolas Garza
33e37a1846 fix: make autocomplete opt in (#7317) 2026-01-09 20:04:22 +00:00
Jamison Lahman
d454d8a878 fix(chat): wide tables can be scrolled (#7311) 2026-01-09 19:07:40 +00:00
roshan
00ad65a6a8 feat: chrome extension (#6704) 2026-01-09 18:45:23 +00:00
Nikolas Garza
dac60d403c fix(chat): show "User has stopped generation" indicator when user cancels (#7312) 2026-01-09 18:14:35 +00:00
Evan Lohn
6256b2854d chore: bump indexing usage (#7307) 2026-01-09 17:46:27 +00:00
Danelegend
8acb8e191d fix(chat): use url when name unknown (#7278)
Co-authored-by: Dane Urban <durban@Danes-MacBook-Pro.local>
2026-01-09 17:16:20 +00:00
Evan Lohn
8c4cbddc43 fix: minor perm sync improvements (#7296) 2026-01-09 05:46:23 +00:00
Yuhong Sun
f6cd006bd6 chore: Refactor tool exceptions (#7280) 2026-01-09 04:01:12 +00:00
Jamison Lahman
0033934319 chore(perf): remove isEqual memoization check (#7304)
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
2026-01-09 03:20:37 +00:00
Raunak Bhagat
ff87b79d14 fix: Section layout component fix (#7305) 2026-01-08 19:25:33 -08:00
Raunak Bhagat
ebf18af7c9 refactor: UI components cleanup (#7301)
Co-authored-by: Nikolas Garza <90273783+nmgarza5@users.noreply.github.com>
2026-01-09 03:09:20 +00:00
Raunak Bhagat
cf67ae962c feat: Add a new GeneralLayouts file and update layout components (#7297)
Co-authored-by: Nikolas Garza <90273783+nmgarza5@users.noreply.github.com>
2026-01-09 02:50:21 +00:00
2096 changed files with 224414 additions and 47371 deletions

.cursor/mcp.json

@@ -0,0 +1,16 @@
{
"mcpServers": {
"Playwright": {
"command": "npx",
"args": [
"@playwright/mcp"
]
},
"Linear": {
"url": "https://mcp.linear.app/mcp"
},
"Figma": {
"url": "https://mcp.figma.com/mcp"
}
}
}
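
For local verification, the Playwright entry from this config can be launched directly, since it is just a stdio command; a minimal sketch, assuming npx and the @playwright/mcp package resolve on your machine:

    # Start the same MCP server the config points Cursor at (stdio transport):
    npx @playwright/mcp

The Linear and Figma entries are remote servers reached over HTTP at the listed URLs, so they need no local process.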

.github/CODEOWNERS

@@ -6,5 +6,5 @@
/web/STANDARDS.md @raunakab @Weves
# Agent context files
/CLAUDE.md.template @Weves
/AGENTS.md.template @Weves
/CLAUDE.md @Weves
/AGENTS.md @Weves


@@ -8,4 +8,5 @@
## Additional Options
- [ ] [Required] I have considered whether this PR needs to be cherry-picked to the latest beta branch.
- [ ] [Optional] Override Linear Check

File diff suppressed because it is too large


@@ -21,10 +21,10 @@ jobs:
timeout-minutes: 45
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}


@@ -21,10 +21,10 @@ jobs:
timeout-minutes: 45
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}


@@ -15,7 +15,7 @@ jobs:
timeout-minutes: 45
steps:
- name: Checkout
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
fetch-depth: 0
persist-credentials: false
@@ -29,6 +29,7 @@ jobs:
run: |
helm repo add ingress-nginx https://kubernetes.github.io/ingress-nginx
helm repo add onyx-vespa https://onyx-dot-app.github.io/vespa-helm-charts
helm repo add opensearch https://opensearch-project.github.io/helm-charts
helm repo add cloudnative-pg https://cloudnative-pg.github.io/charts
helm repo add ot-container-kit https://ot-container-kit.github.io/helm-charts
helm repo add minio https://charts.min.io/


@@ -13,7 +13,7 @@ jobs:
runs-on: ubuntu-latest
timeout-minutes: 45
steps:
- uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # ratchet:actions/stale@v10
- uses: actions/stale@997185467fa4f803885201cee163a9f38240193d # ratchet:actions/stale@v10
with:
stale-issue-message: 'This issue is stale because it has been open 75 days with no activity. Remove stale label or comment or this will be closed in 15 days.'
stale-pr-message: 'This PR is stale because it has been open 75 days with no activity. Remove stale label or comment or this will be closed in 15 days.'


@@ -28,12 +28,12 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # ratchet:actions/setup-python@v6
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # ratchet:actions/setup-python@v6
with:
python-version: '3.11'
cache: 'pip'
@@ -94,10 +94,10 @@ jobs:
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}


@@ -0,0 +1,28 @@
name: Require beta cherry-pick consideration
concurrency:
group: Require-Beta-Cherrypick-Consideration-${{ github.workflow }}-${{ github.head_ref || github.event.workflow_run.head_branch || github.run_id }}
cancel-in-progress: true
on:
pull_request:
types: [opened, edited, reopened, synchronize]
permissions:
contents: read
jobs:
beta-cherrypick-check:
runs-on: ubuntu-latest
timeout-minutes: 45
steps:
- name: Check PR body for beta cherry-pick consideration
env:
PR_BODY: ${{ github.event.pull_request.body }}
run: |
if echo "$PR_BODY" | grep -qiE "\\[x\\][[:space:]]*\\[Required\\][[:space:]]*I have considered whether this PR needs to be cherry[- ]picked to the latest beta branch"; then
echo "Cherry-pick consideration box is checked. Check passed."
exit 0
fi
echo "::error::Please check the 'I have considered whether this PR needs to be cherry-picked to the latest beta branch' box in the PR description."
exit 1
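
The same check can be exercised locally before pushing; a minimal sketch, assuming a bash shell (the sample PR body below is hypothetical):

    # Checked box: the grep matches case-insensitively and accepts "cherry-picked" or "cherry picked".
    PR_BODY='- [x] [Required] I have considered whether this PR needs to be cherry-picked to the latest beta branch.'
    if echo "$PR_BODY" | grep -qiE "\[x\][[:space:]]*\[Required\][[:space:]]*I have considered whether this PR needs to be cherry[- ]picked to the latest beta branch"; then
      echo "Check passed."
    else
      echo "Check failed." >&2
    fi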


@@ -27,7 +27,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -40,13 +40,16 @@ jobs:
- name: Generate OpenAPI schema and Python client
shell: bash
# TODO(Nik): https://linear.app/onyx-app/issue/ENG-1/update-test-infra-to-use-test-license
env:
LICENSE_ENFORCEMENT_ENABLED: "false"
run: |
ods openapi all
# needed for pulling external images otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}

.github/workflows/pr-desktop-build.yml

@@ -0,0 +1,114 @@
name: Build Desktop App
concurrency:
group: Build-Desktop-App-${{ github.workflow }}-${{ github.head_ref || github.event.workflow_run.head_branch || github.run_id }}
cancel-in-progress: true
on:
merge_group:
pull_request:
branches:
- main
- "release/**"
paths:
- "desktop/**"
- ".github/workflows/pr-desktop-build.yml"
push:
tags:
- "v*.*.*"
permissions:
contents: read
jobs:
build-desktop:
name: Build Desktop (${{ matrix.platform }})
runs-on: ${{ matrix.os }}
timeout-minutes: 60
strategy:
fail-fast: false
matrix:
include:
- platform: linux
os: ubuntu-latest
target: x86_64-unknown-linux-gnu
args: "--bundles deb,rpm"
# TODO: Fix and enable the macOS build.
#- platform: macos
# os: macos-latest
# target: universal-apple-darwin
# args: "--target universal-apple-darwin"
# TODO: Fix and enable the Windows build.
#- platform: windows
# os: windows-latest
# target: x86_64-pc-windows-msvc
# args: ""
steps:
- name: Checkout code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
with:
persist-credentials: false
- name: Setup node
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238
with:
node-version: 24
cache: "npm" # zizmor: ignore[cache-poisoning]
cache-dependency-path: ./desktop/package-lock.json
- name: Setup Rust
uses: dtolnay/rust-toolchain@4be9e76fd7c4901c61fb841f559994984270fce7
with:
toolchain: stable
targets: ${{ matrix.target }}
- name: Cache Cargo registry and build
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # zizmor: ignore[cache-poisoning]
with:
path: |
~/.cargo/bin/
~/.cargo/registry/index/
~/.cargo/registry/cache/
~/.cargo/git/db/
desktop/src-tauri/target/
key: ${{ runner.os }}-cargo-${{ hashFiles('desktop/src-tauri/Cargo.lock') }}
restore-keys: |
${{ runner.os }}-cargo-
- name: Install Linux dependencies
if: matrix.platform == 'linux'
run: |
sudo apt-get update
sudo apt-get install -y \
build-essential \
libglib2.0-dev \
libgirepository1.0-dev \
libgtk-3-dev \
libjavascriptcoregtk-4.1-dev \
libwebkit2gtk-4.1-dev \
libayatana-appindicator3-dev \
gobject-introspection \
pkg-config \
curl \
xdg-utils
- name: Install npm dependencies
working-directory: ./desktop
run: npm ci
- name: Build desktop app
working-directory: ./desktop
run: npx tauri build ${{ matrix.args }}
env:
TAURI_SIGNING_PRIVATE_KEY: ""
TAURI_SIGNING_PRIVATE_KEY_PASSWORD: ""
- name: Upload build artifacts
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
with:
name: desktop-build-${{ matrix.platform }}-${{ github.run_id }}
path: |
desktop/src-tauri/target/release/bundle/
retention-days: 7
if-no-files-found: ignore
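
Outside CI, the Linux leg of this workflow reduces to a few commands; a sketch of the local equivalent, assuming the apt packages listed above are already installed:

    # Build the Tauri desktop bundles the same way the workflow does:
    cd desktop
    npm ci
    npx tauri build --bundles deb,rpm   # bundles land under src-tauri/target/release/bundle/

The empty TAURI_SIGNING_* variables in the workflow skip updater signing, which is why these unsigned PR builds still succeed.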


@@ -45,6 +45,9 @@ env:
# TODO: debug why this is failing and enable
CODE_INTERPRETER_BASE_URL: http://localhost:8000
# OpenSearch
OPENSEARCH_ADMIN_PASSWORD: "StrongPassword123!"
jobs:
discover-test-dirs:
# NOTE: Github-hosted runners have about 20s faster queue times and are preferred here.
@@ -54,7 +57,7 @@ jobs:
test-dirs: ${{ steps.set-matrix.outputs.test-dirs }}
steps:
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -88,7 +91,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -107,7 +110,7 @@ jobs:
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -115,6 +118,7 @@ jobs:
- name: Create .env file for Docker Compose
run: |
cat <<EOF > deployment/docker_compose/.env
COMPOSE_PROFILES=s3-filestore
CODE_INTERPRETER_BETA_ENABLED=true
DISABLE_TELEMETRY=true
EOF
@@ -125,11 +129,13 @@ jobs:
docker compose \
-f docker-compose.yml \
-f docker-compose.dev.yml \
-f docker-compose.opensearch.yml \
up -d \
minio \
relational_db \
cache \
index \
opensearch \
code-interpreter
- name: Run migrations
@@ -158,7 +164,7 @@ jobs:
cd deployment/docker_compose
# Get list of running containers
containers=$(docker compose -f docker-compose.yml -f docker-compose.dev.yml ps -q)
containers=$(docker compose -f docker-compose.yml -f docker-compose.dev.yml -f docker-compose.opensearch.yml ps -q)
# Collect logs from each container
for container in $containers; do
@@ -172,7 +178,7 @@ jobs:
- name: Upload Docker logs
if: failure()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # ratchet:actions/upload-artifact@v5
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
with:
name: docker-logs-${{ matrix.test-dir }}
path: docker-logs/


@@ -30,7 +30,7 @@ jobs:
# fetch-depth 0 is required for helm/chart-testing-action
steps:
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
fetch-depth: 0
persist-credentials: false
@@ -88,6 +88,7 @@ jobs:
echo "=== Adding Helm repositories ==="
helm repo add ingress-nginx https://kubernetes.github.io/ingress-nginx
helm repo add vespa https://onyx-dot-app.github.io/vespa-helm-charts
helm repo add opensearch https://opensearch-project.github.io/helm-charts
helm repo add cloudnative-pg https://cloudnative-pg.github.io/charts
helm repo add ot-container-kit https://ot-container-kit.github.io/helm-charts
helm repo add minio https://charts.min.io/
@@ -180,6 +181,11 @@ jobs:
trap cleanup EXIT
# Run the actual installation with detailed logging
# Note that opensearch.enabled is true whereas others in this install
# are false. There is some work that needs to be done to get this
# entire step working in CI, enabling opensearch here is a small step
# in that direction. If this is causing issues, disabling it in this
# step should be ok in the short term.
echo "=== Starting ct install ==="
set +e
ct install --all \
@@ -187,9 +193,10 @@ jobs:
--set=nginx.enabled=false \
--set=minio.enabled=false \
--set=vespa.enabled=false \
--set=opensearch.enabled=true \
--set=auth.opensearch.enabled=true \
--set=slackbot.enabled=false \
--set=postgresql.enabled=true \
--set=postgresql.nameOverride=cloudnative-pg \
--set=postgresql.cluster.storage.storageClass=standard \
--set=redis.enabled=true \
--set=redis.storageSpec.volumeClaimTemplate.spec.storageClassName=standard \
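
For chart debugging outside CI, the new repository and flags can be replayed by hand; a sketch, assuming helm and ct (chart-testing) are installed and run from the chart directory:

    # Remaining --set flags follow the hunk above (postgresql, redis, etc.).
    helm repo add opensearch https://opensearch-project.github.io/helm-charts
    ct install --all \
      --set=nginx.enabled=false \
      --set=minio.enabled=false \
      --set=vespa.enabled=false \
      --set=opensearch.enabled=true \
      --set=auth.opensearch.enabled=true \
      --set=slackbot.enabled=false

Per the comment in the diff, opensearch.enabled=true is deliberately ahead of the other components here and can be flipped back to false if it destabilizes the step.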


@@ -48,7 +48,7 @@ jobs:
test-dirs: ${{ steps.set-matrix.outputs.test-dirs }}
steps:
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -56,7 +56,7 @@ jobs:
id: set-matrix
run: |
# Find all leaf-level directories in both test directories
tests_dirs=$(find backend/tests/integration/tests -mindepth 1 -maxdepth 1 -type d ! -name "__pycache__" ! -name "mcp" -exec basename {} \; | sort)
tests_dirs=$(find backend/tests/integration/tests -mindepth 1 -maxdepth 1 -type d ! -name "__pycache__" ! -name "mcp" ! -name "no_vectordb" -exec basename {} \; | sort)
connector_dirs=$(find backend/tests/integration/connector_job_tests -mindepth 1 -maxdepth 1 -type d ! -name "__pycache__" -exec basename {} \; | sort)
# Create JSON array with directory info
@@ -84,7 +84,7 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -103,13 +103,13 @@ jobs:
echo "cache-suffix=${CACHE_SUFFIX}" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
# needed for pulling Vespa, Redis, Postgres, and Minio images
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -144,7 +144,7 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -163,13 +163,13 @@ jobs:
echo "cache-suffix=${CACHE_SUFFIX}" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
# needed for pulling Vespa, Redis, Postgres, and Minio images
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -203,18 +203,18 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
# needed for pulling openapitools/openapi-generator-cli
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -279,7 +279,7 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -287,7 +287,7 @@ jobs:
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -300,7 +300,10 @@ jobs:
RUN_ID: ${{ github.run_id }}
run: |
cat <<EOF > deployment/docker_compose/.env
COMPOSE_PROFILES=s3-filestore
ENABLE_PAID_ENTERPRISE_EDITION_FEATURES=true
# TODO(Nik): https://linear.app/onyx-app/issue/ENG-1/update-test-infra-to-use-test-license
LICENSE_ENFORCEMENT_ENABLED=false
AUTH_TYPE=basic
POSTGRES_POOL_PRE_PING=true
POSTGRES_USE_NULL_POOL=true
@@ -310,8 +313,9 @@ jobs:
ONYX_MODEL_SERVER_IMAGE=${ECR_CACHE}:integration-test-model-server-test-${RUN_ID}
INTEGRATION_TESTS_MODE=true
CHECK_TTL_MANAGEMENT_TASK_FREQUENCY_IN_HOURS=0.001
AUTO_LLM_UPDATE_INTERVAL_SECONDS=1
AUTO_LLM_UPDATE_INTERVAL_SECONDS=10
MCP_SERVER_ENABLED=true
USE_LIGHTWEIGHT_BACKGROUND_WORKER=false
EOF
- name: Start Docker containers
@@ -438,12 +442,145 @@ jobs:
- name: Upload logs
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # ratchet:actions/upload-artifact@v4
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
with:
name: docker-all-logs-${{ matrix.test-dir.name }}
path: ${{ github.workspace }}/docker-compose.log
# ------------------------------------------------------------
no-vectordb-tests:
needs: [build-backend-image, build-integration-image]
runs-on:
[
runs-on,
runner=4cpu-linux-arm64,
"run-id=${{ github.run_id }}-no-vectordb-tests",
"extras=ecr-cache",
]
timeout-minutes: 45
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Login to Docker Hub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Create .env file for no-vectordb Docker Compose
env:
ECR_CACHE: ${{ env.RUNS_ON_ECR_CACHE }}
RUN_ID: ${{ github.run_id }}
run: |
cat <<EOF > deployment/docker_compose/.env
COMPOSE_PROFILES=s3-filestore
ENABLE_PAID_ENTERPRISE_EDITION_FEATURES=true
LICENSE_ENFORCEMENT_ENABLED=false
AUTH_TYPE=basic
POSTGRES_POOL_PRE_PING=true
POSTGRES_USE_NULL_POOL=true
REQUIRE_EMAIL_VERIFICATION=false
DISABLE_TELEMETRY=true
DISABLE_VECTOR_DB=true
ONYX_BACKEND_IMAGE=${ECR_CACHE}:integration-test-backend-test-${RUN_ID}
INTEGRATION_TESTS_MODE=true
USE_LIGHTWEIGHT_BACKGROUND_WORKER=true
EOF
# Start only the services needed for no-vectordb mode (no Vespa, no model servers)
- name: Start Docker containers (no-vectordb)
run: |
cd deployment/docker_compose
docker compose -f docker-compose.yml -f docker-compose.no-vectordb.yml -f docker-compose.dev.yml up \
relational_db \
cache \
minio \
api_server \
background \
-d
id: start_docker_no_vectordb
- name: Wait for services to be ready
run: |
echo "Starting wait-for-service script (no-vectordb)..."
start_time=$(date +%s)
timeout=300
while true; do
current_time=$(date +%s)
elapsed_time=$((current_time - start_time))
if [ $elapsed_time -ge $timeout ]; then
echo "Timeout reached. Service did not become ready in $timeout seconds."
exit 1
fi
response=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:8080/health || echo "curl_error")
if [ "$response" = "200" ]; then
echo "API server is ready!"
break
elif [ "$response" = "curl_error" ]; then
echo "Curl encountered an error; retrying..."
else
echo "Service not ready yet (HTTP $response). Retrying in 5 seconds..."
fi
sleep 5
done
- name: Run No-VectorDB Integration Tests
uses: nick-fields/retry@ce71cc2ab81d554ebbe88c79ab5975992d79ba08 # ratchet:nick-fields/retry@v3
with:
timeout_minutes: 20
max_attempts: 3
retry_wait_seconds: 10
command: |
echo "Running no-vectordb integration tests..."
docker run --rm --network onyx_default \
--name test-runner \
-e POSTGRES_HOST=relational_db \
-e POSTGRES_USER=postgres \
-e POSTGRES_PASSWORD=password \
-e POSTGRES_DB=postgres \
-e DB_READONLY_USER=db_readonly_user \
-e DB_READONLY_PASSWORD=password \
-e POSTGRES_POOL_PRE_PING=true \
-e POSTGRES_USE_NULL_POOL=true \
-e REDIS_HOST=cache \
-e API_SERVER_HOST=api_server \
-e OPENAI_API_KEY=${OPENAI_API_KEY} \
-e TEST_WEB_HOSTNAME=test-runner \
${{ env.RUNS_ON_ECR_CACHE }}:integration-test-${{ github.run_id }} \
/app/tests/integration/tests/no_vectordb
- name: Dump API server logs (no-vectordb)
if: always()
run: |
cd deployment/docker_compose
docker compose -f docker-compose.yml -f docker-compose.no-vectordb.yml -f docker-compose.dev.yml \
logs --no-color api_server > $GITHUB_WORKSPACE/api_server_no_vectordb.log || true
- name: Dump all-container logs (no-vectordb)
if: always()
run: |
cd deployment/docker_compose
docker compose -f docker-compose.yml -f docker-compose.no-vectordb.yml -f docker-compose.dev.yml \
logs --no-color > $GITHUB_WORKSPACE/docker-compose-no-vectordb.log || true
- name: Upload logs (no-vectordb)
if: always()
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
with:
name: docker-all-logs-no-vectordb
path: ${{ github.workspace }}/docker-compose-no-vectordb.log
- name: Stop Docker containers (no-vectordb)
if: always()
run: |
cd deployment/docker_compose
docker compose -f docker-compose.yml -f docker-compose.no-vectordb.yml -f docker-compose.dev.yml down -v
multitenant-tests:
needs:
[build-backend-image, build-model-server-image, build-integration-image]
@@ -459,12 +596,12 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -476,6 +613,7 @@ jobs:
run: |
cd deployment/docker_compose
ENABLE_PAID_ENTERPRISE_EDITION_FEATURES=true \
LICENSE_ENFORCEMENT_ENABLED=false \
MULTI_TENANT=true \
AUTH_TYPE=cloud \
REQUIRE_EMAIL_VERIFICATION=false \
@@ -567,7 +705,7 @@ jobs:
- name: Upload logs (multi-tenant)
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # ratchet:actions/upload-artifact@v4
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
with:
name: docker-all-logs-multitenant
path: ${{ github.workspace }}/docker-compose-multitenant.log
@@ -582,7 +720,7 @@ jobs:
# NOTE: Github-hosted runners have about 20s faster queue times and are preferred here.
runs-on: ubuntu-slim
timeout-minutes: 45
needs: [integration-tests, multitenant-tests]
needs: [integration-tests, no-vectordb-tests, multitenant-tests]
if: ${{ always() }}
steps:
- name: Check job status


@@ -23,12 +23,12 @@ jobs:
timeout-minutes: 45
steps:
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Setup node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # ratchet:actions/setup-node@v4
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # ratchet:actions/setup-node@v4
with:
node-version: 22
cache: "npm"
@@ -44,7 +44,7 @@ jobs:
- name: Upload coverage reports
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # ratchet:actions/upload-artifact@v4
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
with:
name: jest-coverage-${{ github.run_id }}
path: ./web/coverage


@@ -40,7 +40,7 @@ jobs:
test-dirs: ${{ steps.set-matrix.outputs.test-dirs }}
steps:
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -76,7 +76,7 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -95,13 +95,13 @@ jobs:
echo "cache-suffix=${CACHE_SUFFIX}" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
# needed for pulling Vespa, Redis, Postgres, and Minio images
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -136,7 +136,7 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -155,13 +155,13 @@ jobs:
echo "cache-suffix=${CACHE_SUFFIX}" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
# needed for pulling Vespa, Redis, Postgres, and Minio images
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -195,7 +195,7 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -214,13 +214,13 @@ jobs:
echo "cache-suffix=${CACHE_SUFFIX}" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
# needed for pulling openapitools/openapi-generator-cli
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -271,7 +271,7 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -279,7 +279,7 @@ jobs:
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -292,6 +292,7 @@ jobs:
RUN_ID: ${{ github.run_id }}
run: |
cat <<EOF > deployment/docker_compose/.env
COMPOSE_PROFILES=s3-filestore
AUTH_TYPE=basic
POSTGRES_POOL_PRE_PING=true
POSTGRES_USE_NULL_POOL=true
@@ -301,7 +302,7 @@ jobs:
ONYX_MODEL_SERVER_IMAGE=${ECR_CACHE}:integration-test-model-server-test-${RUN_ID}
INTEGRATION_TESTS_MODE=true
MCP_SERVER_ENABLED=true
AUTO_LLM_UPDATE_INTERVAL_SECONDS=1
AUTO_LLM_UPDATE_INTERVAL_SECONDS=10
EOF
- name: Start Docker containers
@@ -424,7 +425,7 @@ jobs:
- name: Upload logs
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # ratchet:actions/upload-artifact@v4
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
with:
name: docker-all-logs-${{ matrix.test-dir.name }}
path: ${{ github.workspace }}/docker-compose.log


@@ -52,6 +52,9 @@ env:
MCP_SERVER_PUBLIC_HOST: host.docker.internal
MCP_SERVER_PUBLIC_URL: http://host.docker.internal:8004/mcp
# Visual regression S3 bucket (shared across all jobs)
PLAYWRIGHT_S3_BUCKET: onyx-playwright-artifacts
jobs:
build-web-image:
runs-on:
@@ -66,7 +69,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -85,12 +88,12 @@ jobs:
echo "cache-suffix=${CACHE_SUFFIX}" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
# needed for pulling external images otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -127,7 +130,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -146,12 +149,12 @@ jobs:
echo "cache-suffix=${CACHE_SUFFIX}" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
# needed for pulling external images otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -188,7 +191,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -207,12 +210,12 @@ jobs:
echo "cache-suffix=${CACHE_SUFFIX}" >> $GITHUB_OUTPUT
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # ratchet:docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
# needed for pulling external images otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -239,6 +242,9 @@ jobs:
playwright-tests:
needs: [build-web-image, build-backend-image, build-model-server-image]
name: Playwright Tests (${{ matrix.project }})
permissions:
id-token: write # Required for OIDC-based AWS credential exchange (S3 access)
contents: read
runs-on:
- runs-on
- runner=8cpu-linux-arm64
@@ -249,17 +255,17 @@ jobs:
strategy:
fail-fast: false
matrix:
project: [admin, no-auth, exclusive]
project: [admin, exclusive]
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Setup node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # ratchet:actions/setup-node@v4
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # ratchet:actions/setup-node@v4
with:
node-version: 22
cache: "npm"
@@ -289,7 +295,10 @@ jobs:
RUN_ID: ${{ github.run_id }}
run: |
cat <<EOF > deployment/docker_compose/.env
COMPOSE_PROFILES=s3-filestore
ENABLE_PAID_ENTERPRISE_EDITION_FEATURES=true
# TODO(Nik): https://linear.app/onyx-app/issue/ENG-1/update-test-infra-to-use-test-license
LICENSE_ENFORCEMENT_ENABLED=false
AUTH_TYPE=basic
GEN_AI_API_KEY=${OPENAI_API_KEY_VALUE}
EXA_API_KEY=${EXA_API_KEY_VALUE}
@@ -299,15 +308,12 @@ jobs:
ONYX_MODEL_SERVER_IMAGE=${ECR_CACHE}:playwright-test-model-server-${RUN_ID}
ONYX_WEB_SERVER_IMAGE=${ECR_CACHE}:playwright-test-web-${RUN_ID}
EOF
if [ "${{ matrix.project }}" = "no-auth" ]; then
echo "PLAYWRIGHT_FORCE_EMPTY_LLM_PROVIDERS=true" >> deployment/docker_compose/.env
fi
# needed for pulling Vespa, Redis, Postgres, and Minio images
# otherwise, we hit the "Unauthenticated users" limit
# https://docs.docker.com/docker-hub/usage/
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
@@ -428,21 +434,141 @@ jobs:
env:
PROJECT: ${{ matrix.project }}
run: |
# Create test-results directory to ensure it exists for artifact upload
mkdir -p test-results
if [ "${PROJECT}" = "no-auth" ]; then
export PLAYWRIGHT_FORCE_EMPTY_LLM_PROVIDERS=true
fi
npx playwright test --project ${PROJECT}
- uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # ratchet:actions/upload-artifact@v4
- uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
if: always()
with:
# Includes test results and trace.zip files
name: playwright-test-results-${{ matrix.project }}-${{ github.run_id }}
path: ./web/test-results/
path: ./web/output/playwright/
retention-days: 30
- name: Upload screenshots
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
if: always()
with:
name: playwright-screenshots-${{ matrix.project }}-${{ github.run_id }}
path: ./web/output/screenshots/
retention-days: 30
# --- Visual Regression Diff ---
- name: Configure AWS credentials
if: always()
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
- name: Install the latest version of uv
if: always()
uses: astral-sh/setup-uv@61cb8a9741eeb8a550a1b8544337180c0fc8476b # ratchet:astral-sh/setup-uv@v7
with:
enable-cache: false
version: "0.9.9"
- name: Determine baseline revision
if: always()
id: baseline-rev
env:
EVENT_NAME: ${{ github.event_name }}
BASE_REF: ${{ github.event.pull_request.base.ref }}
MERGE_GROUP_BASE_REF: ${{ github.event.merge_group.base_ref }}
GH_REF: ${{ github.ref }}
REF_NAME: ${{ github.ref_name }}
run: |
if [ "${EVENT_NAME}" = "pull_request" ]; then
# PRs compare against the base branch (e.g. main, release/2.5)
echo "rev=${BASE_REF}" >> "$GITHUB_OUTPUT"
elif [ "${EVENT_NAME}" = "merge_group" ]; then
# Merge queue compares against the target branch (e.g. refs/heads/main -> main)
echo "rev=${MERGE_GROUP_BASE_REF#refs/heads/}" >> "$GITHUB_OUTPUT"
elif [[ "${GH_REF}" == refs/tags/* ]]; then
# Tag builds compare against the tag name
echo "rev=${REF_NAME}" >> "$GITHUB_OUTPUT"
else
# Push builds (main, release/*) compare against the branch name
echo "rev=${REF_NAME}" >> "$GITHUB_OUTPUT"
fi
- name: Generate screenshot diff report
if: always()
env:
PROJECT: ${{ matrix.project }}
PLAYWRIGHT_S3_BUCKET: ${{ env.PLAYWRIGHT_S3_BUCKET }}
BASELINE_REV: ${{ steps.baseline-rev.outputs.rev }}
run: |
uv run --no-sync --with onyx-devtools ods screenshot-diff compare \
--project "${PROJECT}" \
--rev "${BASELINE_REV}"
- name: Upload visual diff report to S3
if: always()
env:
PROJECT: ${{ matrix.project }}
PR_NUMBER: ${{ github.event.pull_request.number }}
RUN_ID: ${{ github.run_id }}
run: |
SUMMARY_FILE="web/output/screenshot-diff/${PROJECT}/summary.json"
if [ ! -f "${SUMMARY_FILE}" ]; then
echo "No summary file found — skipping S3 upload."
exit 0
fi
HAS_DIFF=$(jq -r '.has_differences' "${SUMMARY_FILE}")
if [ "${HAS_DIFF}" != "true" ]; then
echo "No visual differences for ${PROJECT} — skipping S3 upload."
exit 0
fi
aws s3 sync "web/output/screenshot-diff/${PROJECT}/" \
"s3://${PLAYWRIGHT_S3_BUCKET}/reports/pr-${PR_NUMBER}/${RUN_ID}/${PROJECT}/"
- name: Upload visual diff summary
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
if: always()
with:
name: screenshot-diff-summary-${{ matrix.project }}
path: ./web/output/screenshot-diff/${{ matrix.project }}/summary.json
if-no-files-found: ignore
retention-days: 5
- name: Upload visual diff report artifact
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
if: always()
with:
name: screenshot-diff-report-${{ matrix.project }}-${{ github.run_id }}
path: ./web/output/screenshot-diff/${{ matrix.project }}/
if-no-files-found: ignore
retention-days: 30
- name: Update S3 baselines
if: >-
success() && (
github.ref == 'refs/heads/main' ||
startsWith(github.ref, 'refs/heads/release/') ||
startsWith(github.ref, 'refs/tags/v') ||
(
github.event_name == 'merge_group' && (
github.event.merge_group.base_ref == 'refs/heads/main' ||
startsWith(github.event.merge_group.base_ref, 'refs/heads/release/')
)
)
)
env:
PROJECT: ${{ matrix.project }}
PLAYWRIGHT_S3_BUCKET: ${{ env.PLAYWRIGHT_S3_BUCKET }}
BASELINE_REV: ${{ steps.baseline-rev.outputs.rev }}
run: |
if [ -d "web/output/screenshots/" ] && [ "$(ls -A web/output/screenshots/)" ]; then
uv run --no-sync --with onyx-devtools ods screenshot-diff upload-baselines \
--project "${PROJECT}" \
--rev "${BASELINE_REV}" \
--delete
else
echo "No screenshots to upload for ${PROJECT} — skipping baseline update."
fi
# save before stopping the containers so the logs can be captured
- name: Save Docker logs
if: success() || failure()
@@ -455,11 +581,100 @@ jobs:
- name: Upload logs
if: success() || failure()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # ratchet:actions/upload-artifact@v4
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
with:
name: docker-logs-${{ matrix.project }}-${{ github.run_id }}
path: ${{ github.workspace }}/docker-compose.log
# Post a single combined visual regression comment after all matrix jobs finish
visual-regression-comment:
needs: [playwright-tests]
if: always() && github.event_name == 'pull_request'
runs-on: ubuntu-slim
timeout-minutes: 5
permissions:
pull-requests: write
steps:
- name: Download visual diff summaries
uses: actions/download-artifact@95815c38cf2ff2164869cbab79da8d1f422bc89e # ratchet:actions/download-artifact@v4
with:
pattern: screenshot-diff-summary-*
path: summaries/
- name: Post combined PR comment
env:
GH_TOKEN: ${{ github.token }}
PR_NUMBER: ${{ github.event.pull_request.number }}
RUN_ID: ${{ github.run_id }}
REPO: ${{ github.repository }}
S3_BUCKET: ${{ env.PLAYWRIGHT_S3_BUCKET }}
run: |
MARKER="<!-- visual-regression-report -->"
# Build the markdown table from all summary files
TABLE_HEADER="| Project | Changed | Added | Removed | Unchanged | Report |"
TABLE_DIVIDER="|---------|---------|-------|---------|-----------|--------|"
TABLE_ROWS=""
HAS_ANY_SUMMARY=false
for SUMMARY_DIR in summaries/screenshot-diff-summary-*/; do
SUMMARY_FILE="${SUMMARY_DIR}summary.json"
if [ ! -f "${SUMMARY_FILE}" ]; then
continue
fi
HAS_ANY_SUMMARY=true
PROJECT=$(jq -r '.project' "${SUMMARY_FILE}")
CHANGED=$(jq -r '.changed' "${SUMMARY_FILE}")
ADDED=$(jq -r '.added' "${SUMMARY_FILE}")
REMOVED=$(jq -r '.removed' "${SUMMARY_FILE}")
UNCHANGED=$(jq -r '.unchanged' "${SUMMARY_FILE}")
TOTAL=$(jq -r '.total' "${SUMMARY_FILE}")
HAS_DIFF=$(jq -r '.has_differences' "${SUMMARY_FILE}")
if [ "${TOTAL}" = "0" ]; then
REPORT_LINK="_No screenshots_"
elif [ "${HAS_DIFF}" = "true" ]; then
REPORT_URL="https://${S3_BUCKET}.s3.us-east-2.amazonaws.com/reports/pr-${PR_NUMBER}/${RUN_ID}/${PROJECT}/index.html"
REPORT_LINK="[View Report](${REPORT_URL})"
else
REPORT_LINK="✅ No changes"
fi
TABLE_ROWS="${TABLE_ROWS}| \`${PROJECT}\` | ${CHANGED} | ${ADDED} | ${REMOVED} | ${UNCHANGED} | ${REPORT_LINK} |\n"
done
if [ "${HAS_ANY_SUMMARY}" = "false" ]; then
echo "No visual diff summaries found — skipping PR comment."
exit 0
fi
BODY=$(printf '%s\n' \
"${MARKER}" \
"### 🖼️ Visual Regression Report" \
"" \
"${TABLE_HEADER}" \
"${TABLE_DIVIDER}" \
"$(printf '%b' "${TABLE_ROWS}")")
# Upsert: find existing comment with the marker, or create a new one
EXISTING_COMMENT_ID=$(gh api \
"repos/${REPO}/issues/${PR_NUMBER}/comments" \
--jq ".[] | select(.body | startswith(\"${MARKER}\")) | .id" \
2>/dev/null | head -1)
if [ -n "${EXISTING_COMMENT_ID}" ]; then
gh api \
--method PATCH \
"repos/${REPO}/issues/comments/${EXISTING_COMMENT_ID}" \
-f body="${BODY}"
else
gh api \
--method POST \
"repos/${REPO}/issues/${PR_NUMBER}/comments" \
-f body="${BODY}"
fi
playwright-required:
# NOTE: Github-hosted runners have about 20s faster queue times and are preferred here.
runs-on: ubuntu-slim
@@ -470,48 +685,3 @@ jobs:
- name: Check job status
if: ${{ contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled') || contains(needs.*.result, 'skipped') }}
run: exit 1
# NOTE: Chromatic UI diff testing is currently disabled.
# We are using Playwright for local and CI testing without visual regression checks.
# Chromatic may be reintroduced in the future for UI diff testing if needed.
# chromatic-tests:
# name: Chromatic Tests
# needs: playwright-tests
# runs-on:
# [
# runs-on,
# runner=32cpu-linux-x64,
# disk=large,
# "run-id=${{ github.run_id }}",
# ]
# steps:
# - name: Checkout code
# uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
# with:
# fetch-depth: 0
# - name: Setup node
# uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # ratchet:actions/setup-node@v4
# with:
# node-version: 22
# - name: Install node dependencies
# working-directory: ./web
# run: npm ci
# - name: Download Playwright test results
# uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # ratchet:actions/download-artifact@v4
# with:
# name: test-results
# path: ./web/test-results
# - name: Run Chromatic
# uses: chromaui/action@latest
# with:
# playwright: true
# projectToken: ${{ secrets.CHROMATIC_PROJECT_TOKEN }}
# workingDir: ./web
# env:
# CHROMATIC_ARCHIVE_LOCATION: ./test-results


@@ -27,7 +27,7 @@ jobs:
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
@@ -42,6 +42,9 @@ jobs:
- name: Generate OpenAPI schema and Python client
shell: bash
# TODO(Nik): https://linear.app/onyx-app/issue/ENG-1/update-test-infra-to-use-test-license
env:
LICENSE_ENFORCEMENT_ENABLED: "false"
run: |
ods openapi all
@@ -50,8 +53,9 @@ jobs:
uses: runs-on/cache@50350ad4242587b6c8c2baa2e740b1bc11285ff4 # ratchet:runs-on/cache@v4
with:
path: backend/.mypy_cache
key: mypy-${{ runner.os }}-${{ hashFiles('**/*.py', '**/*.pyi', 'backend/pyproject.toml') }}
key: mypy-${{ runner.os }}-${{ github.base_ref || github.event.merge_group.base_ref || 'main' }}-${{ hashFiles('**/*.py', '**/*.pyi', 'backend/pyproject.toml') }}
restore-keys: |
mypy-${{ runner.os }}-${{ github.base_ref || github.event.merge_group.base_ref || 'main' }}-
mypy-${{ runner.os }}-
- name: Run MyPy


@@ -65,7 +65,7 @@ env:
ZENDESK_TOKEN: ${{ secrets.ZENDESK_TOKEN }}
# Salesforce
SF_USERNAME: ${{ secrets.SF_USERNAME }}
SF_USERNAME: ${{ vars.SF_USERNAME }}
SF_PASSWORD: ${{ secrets.SF_PASSWORD }}
SF_SECURITY_TOKEN: ${{ secrets.SF_SECURITY_TOKEN }}
@@ -110,6 +110,9 @@ env:
# Slack
SLACK_BOT_TOKEN: ${{ secrets.SLACK_BOT_TOKEN }}
# Discord
DISCORD_CONNECTOR_BOT_TOKEN: ${{ secrets.DISCORD_CONNECTOR_BOT_TOKEN }}
# Teams
TEAMS_APPLICATION_ID: ${{ secrets.TEAMS_APPLICATION_ID }}
TEAMS_DIRECTORY_ID: ${{ secrets.TEAMS_DIRECTORY_ID }}
@@ -139,7 +142,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false


@@ -5,11 +5,6 @@ on:
# This cron expression runs the job daily at 16:00 UTC (9am PT)
- cron: "0 16 * * *"
workflow_dispatch:
inputs:
branch:
description: 'Branch to run the workflow on'
required: false
default: 'main'
permissions:
contents: read
@@ -31,7 +26,11 @@ env:
jobs:
model-check:
# See https://runs-on.com/runners/linux/
runs-on: [runs-on,runner=8cpu-linux-x64,"run-id=${{ github.run_id }}-model-check"]
runs-on:
- runs-on
- runner=4cpu-linux-arm64
- "run-id=${{ github.run_id }}-model-check"
- "extras=ecr-cache"
timeout-minutes: 45
env:
@@ -39,112 +38,91 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Setup Python and Install Dependencies
uses: ./.github/actions/setup-python-and-install-dependencies
with:
requirements: |
backend/requirements/default.txt
backend/requirements/dev.txt
- name: Format branch name for cache
id: format-branch
env:
PR_NUMBER: ${{ github.event.pull_request.number }}
REF_NAME: ${{ github.ref_name }}
run: |
if [ -n "${PR_NUMBER}" ]; then
CACHE_SUFFIX="${PR_NUMBER}"
else
# shellcheck disable=SC2001
CACHE_SUFFIX=$(echo "${REF_NAME}" | sed 's/[^A-Za-z0-9._-]/-/g')
fi
echo "cache-suffix=${CACHE_SUFFIX}" >> $GITHUB_OUTPUT
- name: Login to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # ratchet:docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
# tag every docker image with "test" so that we can spin up the correct set
# of images during testing
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f
# We don't need to build the Web Docker image since it's not yet used
# in the integration tests. We have a separate action to verify that it builds
# successfully.
- name: Pull Model Server Docker image
run: |
docker pull onyxdotapp/onyx-model-server:latest
docker tag onyxdotapp/onyx-model-server:latest onyxdotapp/onyx-model-server:test
- name: Set up Python
uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # ratchet:actions/setup-python@v6
- name: Build and load
uses: docker/bake-action@5be5f02ff8819ecd3092ea6b2e6261c31774f2b4 # ratchet:docker/bake-action@v6
env:
TAG: model-server-${{ github.run_id }}
with:
python-version: "3.11"
cache: "pip"
cache-dependency-path: |
backend/requirements/default.txt
backend/requirements/dev.txt
- name: Install Dependencies
run: |
python -m pip install --upgrade pip
pip install --retries 5 --timeout 30 -r backend/requirements/default.txt
pip install --retries 5 --timeout 30 -r backend/requirements/dev.txt
load: true
targets: model-server
set: |
model-server.cache-from=type=registry,ref=${{ env.RUNS_ON_ECR_CACHE }}:model-server-cache-${{ github.event.pull_request.head.sha || github.sha }}
model-server.cache-from=type=registry,ref=${{ env.RUNS_ON_ECR_CACHE }}:model-server-cache-${{ steps.format-branch.outputs.cache-suffix }}
model-server.cache-from=type=registry,ref=${{ env.RUNS_ON_ECR_CACHE }}:model-server-cache
model-server.cache-from=type=registry,ref=onyxdotapp/onyx-model-server:latest
model-server.cache-to=type=registry,ref=${{ env.RUNS_ON_ECR_CACHE }}:model-server-cache-${{ github.event.pull_request.head.sha || github.sha }},mode=max
model-server.cache-to=type=registry,ref=${{ env.RUNS_ON_ECR_CACHE }}:model-server-cache-${{ steps.format-branch.outputs.cache-suffix }},mode=max
model-server.cache-to=type=registry,ref=${{ env.RUNS_ON_ECR_CACHE }}:model-server-cache,mode=max
- name: Start Docker containers
id: start_docker
env:
IMAGE_TAG: model-server-${{ github.run_id }}
run: |
cd deployment/docker_compose
ENABLE_PAID_ENTERPRISE_EDITION_FEATURES=true \
AUTH_TYPE=basic \
REQUIRE_EMAIL_VERIFICATION=false \
DISABLE_TELEMETRY=true \
IMAGE_TAG=test \
docker compose -f docker-compose.model-server-test.yml up -d indexing_model_server
id: start_docker
- name: Wait for service to be ready
run: |
echo "Starting wait-for-service script..."
start_time=$(date +%s)
timeout=300 # 5 minutes in seconds
while true; do
current_time=$(date +%s)
elapsed_time=$((current_time - start_time))
if [ $elapsed_time -ge $timeout ]; then
echo "Timeout reached. Service did not become ready in 5 minutes."
exit 1
fi
# Use curl with error handling to ignore specific exit code 56
response=$(curl -s -o /dev/null -w "%{http_code}" http://localhost:9000/api/health || echo "curl_error")
if [ "$response" = "200" ]; then
echo "Service is ready!"
break
elif [ "$response" = "curl_error" ]; then
echo "Curl encountered an error, possibly exit code 56. Continuing to retry..."
else
echo "Service not ready yet (HTTP status $response). Retrying in 5 seconds..."
fi
sleep 5
done
echo "Finished waiting for service."
docker compose \
-f docker-compose.yml \
-f docker-compose.dev.yml \
up -d --wait \
inference_model_server
- name: Run Tests
shell: script -q -e -c "bash --noprofile --norc -eo pipefail {0}"
run: |
py.test -o junit_family=xunit2 -xv --ff backend/tests/daily/llm
py.test -o junit_family=xunit2 -xv --ff backend/tests/daily/embedding
- name: Alert on Failure
if: failure() && github.event_name == 'schedule'
env:
SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
REPO: ${{ github.repository }}
RUN_ID: ${{ github.run_id }}
run: |
curl -X POST \
-H 'Content-type: application/json' \
--data "{\"text\":\"Scheduled Model Tests failed! Check the run at: https://github.com/${REPO}/actions/runs/${RUN_ID}\"}" \
$SLACK_WEBHOOK
uses: ./.github/actions/slack-notify
with:
webhook-url: ${{ secrets.SLACK_WEBHOOK }}
failed-jobs: model-check
title: "🚨 Scheduled Model Tests failed!"
ref-name: ${{ github.ref_name }}
- name: Dump all-container logs (optional)
if: always()
run: |
cd deployment/docker_compose
docker compose -f docker-compose.model-server-test.yml logs --no-color > $GITHUB_WORKSPACE/docker-compose.log || true
docker compose logs --no-color > $GITHUB_WORKSPACE/docker-compose.log || true
- name: Upload logs
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # ratchet:actions/upload-artifact@v4
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f
with:
name: docker-all-logs
path: ${{ github.workspace }}/docker-compose.log


@@ -27,12 +27,14 @@ jobs:
PYTHONPATH: ./backend
REDIS_CLOUD_PYTEST_PASSWORD: ${{ secrets.REDIS_CLOUD_PYTEST_PASSWORD }}
DISABLE_TELEMETRY: "true"
# TODO(Nik): https://linear.app/onyx-app/issue/ENG-1/update-test-infra-to-use-test-license
LICENSE_ENFORCEMENT_ENABLED: "false"
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false


@@ -20,17 +20,17 @@ jobs:
runs-on: ubuntu-latest
timeout-minutes: 45
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
fetch-depth: 0
persist-credentials: false
- uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # ratchet:actions/setup-python@v6
- uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # ratchet:actions/setup-python@v6
with:
python-version: "3.11"
- name: Setup Terraform
uses: hashicorp/setup-terraform@b9cd54a3c349d3f38e8881555d616ced269862dd # ratchet:hashicorp/setup-terraform@v3
- name: Setup node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # ratchet:actions/setup-node@v6
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # ratchet:actions/setup-node@v6
with: # zizmor: ignore[cache-poisoning]
node-version: 22
cache: "npm"
@@ -38,7 +38,7 @@ jobs:
- name: Install node dependencies
working-directory: ./web
run: npm ci
- uses: j178/prek-action@91fd7d7cf70ae1dee9f4f44e7dfa5d1073fe6623 # ratchet:j178/prek-action@v1
- uses: j178/prek-action@9d6a3097e0c1865ecce00cfb89fe80f2ee91b547 # ratchet:j178/prek-action@v1
with:
prek-version: '0.2.21'
extra-args: ${{ github.event_name == 'pull_request' && format('--from-ref {0} --to-ref {1}', github.event.pull_request.base.sha, github.event.pull_request.head.sha) || github.event_name == 'merge_group' && format('--from-ref {0} --to-ref {1}', github.event.merge_group.base_sha, github.event.merge_group.head_sha) || github.ref_name == 'main' && '--all-files' || '' }}

.github/workflows/preview.yml (new file)

@@ -0,0 +1,73 @@
name: Preview Deployment
env:
VERCEL_ORG_ID: ${{ secrets.VERCEL_ORG_ID }}
VERCEL_PROJECT_ID: ${{ secrets.VERCEL_PROJECT_ID }}
VERCEL_CLI: vercel@50.14.1
on:
push:
branches-ignore:
- main
paths:
- "web/**"
permissions:
contents: read
pull-requests: write
jobs:
Deploy-Preview:
runs-on: ubuntu-latest
timeout-minutes: 30
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd
with:
persist-credentials: false
- name: Setup node
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # ratchet:actions/setup-node@v4
with:
node-version: 22
cache: "npm"
cache-dependency-path: ./web/package-lock.json
- name: Pull Vercel Environment Information
run: npx --yes ${{ env.VERCEL_CLI }} pull --yes --environment=preview --token=${{ secrets.VERCEL_TOKEN }}
- name: Build Project Artifacts
run: npx --yes ${{ env.VERCEL_CLI }} build --token=${{ secrets.VERCEL_TOKEN }}
- name: Deploy Project Artifacts to Vercel
id: deploy
run: |
DEPLOYMENT_URL=$(npx --yes ${{ env.VERCEL_CLI }} deploy --prebuilt --token=${{ secrets.VERCEL_TOKEN }})
echo "url=$DEPLOYMENT_URL" >> "$GITHUB_OUTPUT"
- name: Update PR comment with deployment URL
if: always() && steps.deploy.outputs.url
env:
GH_TOKEN: ${{ github.token }}
DEPLOYMENT_URL: ${{ steps.deploy.outputs.url }}
run: |
# Find the PR for this branch
PR_NUMBER=$(gh pr list --head "$GITHUB_REF_NAME" --json number --jq '.[0].number')
if [ -z "$PR_NUMBER" ]; then
echo "No open PR found for branch $GITHUB_REF_NAME, skipping comment."
exit 0
fi
COMMENT_MARKER="<!-- preview-deployment -->"
COMMENT_BODY="$COMMENT_MARKER
**Preview Deployment**
| Status | Preview | Commit | Updated |
| --- | --- | --- | --- |
| ✅ | $DEPLOYMENT_URL | \`${GITHUB_SHA::7}\` | $(date -u '+%Y-%m-%d %H:%M:%S UTC') |"
# Find existing comment by marker
EXISTING_COMMENT_ID=$(gh api "repos/$GITHUB_REPOSITORY/issues/$PR_NUMBER/comments" \
--jq ".[] | select(.body | startswith(\"$COMMENT_MARKER\")) | .id" | head -1)
if [ -n "$EXISTING_COMMENT_ID" ]; then
gh api "repos/$GITHUB_REPOSITORY/issues/comments/$EXISTING_COMMENT_ID" \
--method PATCH --field body="$COMMENT_BODY"
else
gh pr comment "$PR_NUMBER" --body "$COMMENT_BODY"
fi


@@ -24,11 +24,11 @@ jobs:
- { goos: "darwin", goarch: "arm64" }
- { goos: "", goarch: "" }
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
fetch-depth: 0
- uses: astral-sh/setup-uv@ed21f2f24f8dd64503750218de024bcf64c7250a # ratchet:astral-sh/setup-uv@v7
- uses: astral-sh/setup-uv@61cb8a9741eeb8a550a1b8544337180c0fc8476b # ratchet:astral-sh/setup-uv@v7
with:
enable-cache: false
version: "0.9.9"

.github/workflows/sandbox-deployment.yml (new file)

@@ -0,0 +1,290 @@
name: Build and Push Sandbox Image on Tag
on:
push:
tags:
- "experimental-cc4a.*"
# Restrictive defaults; jobs declare what they need.
permissions: {}
jobs:
check-sandbox-changes:
runs-on: ubuntu-slim
timeout-minutes: 10
permissions:
contents: read
outputs:
sandbox-changed: ${{ steps.check.outputs.sandbox-changed }}
new-version: ${{ steps.version.outputs.new-version }}
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
fetch-depth: 0
- name: Check for sandbox-relevant file changes
id: check
run: |
# Get the previous tag to diff against
CURRENT_TAG="${GITHUB_REF_NAME}"
PREVIOUS_TAG=$(git tag --sort=-creatordate | grep '^experimental-cc4a\.' | grep -v "^${CURRENT_TAG}$" | head -n 1)
if [ -z "$PREVIOUS_TAG" ]; then
echo "No previous experimental-cc4a tag found, building unconditionally"
echo "sandbox-changed=true" >> "$GITHUB_OUTPUT"
exit 0
fi
echo "Comparing ${PREVIOUS_TAG}..${CURRENT_TAG}"
# Check if any sandbox-relevant files changed
SANDBOX_PATHS=(
"backend/onyx/server/features/build/sandbox/"
)
CHANGED=false
for path in "${SANDBOX_PATHS[@]}"; do
if git diff --name-only "${PREVIOUS_TAG}..${CURRENT_TAG}" -- "$path" | grep -q .; then
echo "Changes detected in: $path"
CHANGED=true
break
fi
done
echo "sandbox-changed=$CHANGED" >> "$GITHUB_OUTPUT"
- name: Determine new sandbox version
id: version
if: steps.check.outputs.sandbox-changed == 'true'
run: |
# Query Docker Hub for the latest versioned tag
LATEST_TAG=$(curl -s "https://hub.docker.com/v2/repositories/onyxdotapp/sandbox/tags?page_size=100" \
| jq -r '.results[].name' \
| grep -E '^v[0-9]+\.[0-9]+\.[0-9]+$' \
| sort -V \
| tail -n 1)
if [ -z "$LATEST_TAG" ]; then
echo "No existing version tags found on Docker Hub, starting at 0.1.1"
NEW_VERSION="0.1.1"
else
CURRENT_VERSION="${LATEST_TAG#v}"
echo "Latest version on Docker Hub: $CURRENT_VERSION"
# Increment patch version
MAJOR=$(echo "$CURRENT_VERSION" | cut -d. -f1)
MINOR=$(echo "$CURRENT_VERSION" | cut -d. -f2)
PATCH=$(echo "$CURRENT_VERSION" | cut -d. -f3)
NEW_PATCH=$((PATCH + 1))
NEW_VERSION="${MAJOR}.${MINOR}.${NEW_PATCH}"
fi
echo "New version: $NEW_VERSION"
echo "new-version=$NEW_VERSION" >> "$GITHUB_OUTPUT"
build-sandbox-amd64:
needs: check-sandbox-changes
if: needs.check-sandbox-changes.outputs.sandbox-changed == 'true'
runs-on:
- runs-on
- runner=4cpu-linux-x64
- run-id=${{ github.run_id }}-sandbox-amd64
- extras=ecr-cache
timeout-minutes: 90
environment: release
permissions:
contents: read
id-token: write
outputs:
digest: ${{ steps.build.outputs.digest }}
env:
REGISTRY_IMAGE: onyxdotapp/sandbox
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
- name: Get AWS Secrets
uses: aws-actions/aws-secretsmanager-get-secrets@a9a7eb4e2f2871d30dc5b892576fde60a2ecc802
with:
secret-ids: |
DOCKER_USERNAME, deploy/docker-username
DOCKER_TOKEN, deploy/docker-token
parse-json-secrets: true
- name: Docker meta
id: meta
uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # ratchet:docker/metadata-action@v5
with:
images: ${{ env.REGISTRY_IMAGE }}
flavor: |
latest=false
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
- name: Login to Docker Hub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ env.DOCKER_USERNAME }}
password: ${{ env.DOCKER_TOKEN }}
- name: Build and push AMD64
id: build
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # ratchet:docker/build-push-action@v6
with:
context: ./backend/onyx/server/features/build/sandbox/kubernetes/docker
file: ./backend/onyx/server/features/build/sandbox/kubernetes/docker/Dockerfile
platforms: linux/amd64
labels: ${{ steps.meta.outputs.labels }}
cache-from: |
type=registry,ref=${{ env.REGISTRY_IMAGE }}:latest
cache-to: |
type=inline
outputs: type=image,name=${{ env.REGISTRY_IMAGE }},push-by-digest=true,name-canonical=true,push=true
build-sandbox-arm64:
needs: check-sandbox-changes
if: needs.check-sandbox-changes.outputs.sandbox-changed == 'true'
runs-on:
- runs-on
- runner=4cpu-linux-arm64
- run-id=${{ github.run_id }}-sandbox-arm64
- extras=ecr-cache
timeout-minutes: 90
environment: release
permissions:
contents: read
id-token: write
outputs:
digest: ${{ steps.build.outputs.digest }}
env:
REGISTRY_IMAGE: onyxdotapp/sandbox
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
- name: Get AWS Secrets
uses: aws-actions/aws-secretsmanager-get-secrets@a9a7eb4e2f2871d30dc5b892576fde60a2ecc802
with:
secret-ids: |
DOCKER_USERNAME, deploy/docker-username
DOCKER_TOKEN, deploy/docker-token
parse-json-secrets: true
- name: Docker meta
id: meta
uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # ratchet:docker/metadata-action@v5
with:
images: ${{ env.REGISTRY_IMAGE }}
flavor: |
latest=false
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
- name: Login to Docker Hub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ env.DOCKER_USERNAME }}
password: ${{ env.DOCKER_TOKEN }}
- name: Build and push ARM64
id: build
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # ratchet:docker/build-push-action@v6
with:
context: ./backend/onyx/server/features/build/sandbox/kubernetes/docker
file: ./backend/onyx/server/features/build/sandbox/kubernetes/docker/Dockerfile
platforms: linux/arm64
labels: ${{ steps.meta.outputs.labels }}
cache-from: |
type=registry,ref=${{ env.REGISTRY_IMAGE }}:latest
cache-to: |
type=inline
outputs: type=image,name=${{ env.REGISTRY_IMAGE }},push-by-digest=true,name-canonical=true,push=true
merge-sandbox:
needs:
- check-sandbox-changes
- build-sandbox-amd64
- build-sandbox-arm64
runs-on:
- runs-on
- runner=2cpu-linux-x64
- run-id=${{ github.run_id }}-merge-sandbox
- extras=ecr-cache
timeout-minutes: 30
environment: release
permissions:
id-token: write
env:
REGISTRY_IMAGE: onyxdotapp/sandbox
steps:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
- name: Get AWS Secrets
uses: aws-actions/aws-secretsmanager-get-secrets@a9a7eb4e2f2871d30dc5b892576fde60a2ecc802
with:
secret-ids: |
DOCKER_USERNAME, deploy/docker-username
DOCKER_TOKEN, deploy/docker-token
parse-json-secrets: true
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # ratchet:docker/setup-buildx-action@v3
- name: Login to Docker Hub
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # ratchet:docker/login-action@v3
with:
username: ${{ env.DOCKER_USERNAME }}
password: ${{ env.DOCKER_TOKEN }}
- name: Docker meta
id: meta
uses: docker/metadata-action@c299e40c65443455700f0fdfc63efafe5b349051 # ratchet:docker/metadata-action@v5
with:
images: ${{ env.REGISTRY_IMAGE }}
flavor: |
latest=false
tags: |
type=raw,value=v${{ needs.check-sandbox-changes.outputs.new-version }}
type=raw,value=latest
- name: Create and push manifest
env:
IMAGE_REPO: ${{ env.REGISTRY_IMAGE }}
AMD64_DIGEST: ${{ needs.build-sandbox-amd64.outputs.digest }}
ARM64_DIGEST: ${{ needs.build-sandbox-arm64.outputs.digest }}
META_TAGS: ${{ steps.meta.outputs.tags }}
run: |
IMAGES="${IMAGE_REPO}@${AMD64_DIGEST} ${IMAGE_REPO}@${ARM64_DIGEST}"
docker buildx imagetools create \
$(printf '%s\n' "${META_TAGS}" | xargs -I {} echo -t {}) \
$IMAGES


@@ -14,7 +14,7 @@ jobs:
contents: read
steps:
- name: Checkout main Onyx repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
fetch-depth: 0
persist-credentials: false


@@ -18,7 +18,7 @@ jobs:
# see https://github.com/orgs/community/discussions/27028#discussioncomment-3254367 for the workaround we
# implement here which needs an actual user's deploy key
- name: Checkout code
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
ssh-key: "${{ secrets.DEPLOY_KEY }}"
persist-credentials: true


@@ -17,7 +17,7 @@ jobs:
security-events: write # needed for SARIF uploads
steps:
- name: Checkout repository
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6.0.1
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6.0.2
with:
persist-credentials: false
@@ -31,7 +31,7 @@ jobs:
- name: Install the latest version of uv
if: steps.filter.outputs.zizmor == 'true' || github.ref_name == 'main'
uses: astral-sh/setup-uv@ed21f2f24f8dd64503750218de024bcf64c7250a # ratchet:astral-sh/setup-uv@v7
uses: astral-sh/setup-uv@61cb8a9741eeb8a550a1b8544337180c0fc8476b # ratchet:astral-sh/setup-uv@v7
with:
enable-cache: false
version: "0.9.9"

.gitignore

@@ -1,7 +1,12 @@
# editors
.vscode
.vscode/*
!/.vscode/env_template.txt
!/.vscode/env.web_template.txt
!/.vscode/launch.json
!/.vscode/tasks.template.jsonc
.zed
.cursor
!/.cursor/mcp.json
# macos
.DS_store
@@ -21,6 +26,7 @@ backend/tests/regression/search_quality/*.json
backend/onyx/evals/data/
backend/onyx/evals/one_off/*.json
*.log
*.csv
# secret files
.env
@@ -35,10 +41,6 @@ settings.json
/backend/tests/regression/answer_quality/search_test_config.yaml
*.egg-info
# Claude
AGENTS.md
CLAUDE.md
# Local .terraform directories
**/.terraform/*


@@ -11,7 +11,6 @@ repos:
- id: uv-sync
args: ["--locked", "--all-extras"]
- id: uv-lock
files: ^pyproject\.toml$
- id: uv-export
name: uv-export default.txt
args:
@@ -67,7 +66,8 @@ repos:
- id: uv-run
name: Check lazy imports
args: ["--active", "--with=onyx-devtools", "ods", "check-lazy-imports"]
files: ^backend/(?!\.venv/).*\.py$
pass_filenames: true
files: ^backend/(?!\.venv/|scripts/).*\.py$
# NOTE: This takes ~6s on a single, large module which is prohibitively slow.
# - id: uv-run
# name: mypy
@@ -75,6 +75,13 @@ repos:
# pass_filenames: true
# files: ^backend/.*\.py$
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: 3e8a8703264a2f4a69428a0aa4dcb512790b2c8c # frozen: v6.0.0
hooks:
- id: check-added-large-files
name: Check for added large files
args: ["--maxkb=1500"]
- repo: https://github.com/rhysd/actionlint
rev: a443f344ff32813837fa49f7aa6cbc478d770e62 # frozen: v1.7.9
hooks:
@@ -147,6 +154,22 @@ repos:
pass_filenames: false
files: \.tf$
- id: npm-install
name: npm install
description: "Automatically run 'npm install' after a checkout, pull or rebase"
language: system
entry: bash -c 'cd web && npm install --no-save'
pass_filenames: false
files: ^web/package(-lock)?\.json$
stages: [post-checkout, post-merge, post-rewrite]
- id: npm-install-check
name: npm install --package-lock-only
description: "Check the 'web/package-lock.json' is updated"
language: system
entry: bash -c 'cd web && npm install --package-lock-only'
pass_filenames: false
files: ^web/package(-lock)?\.json$
# Uses tsgo (TypeScript's native Go compiler) for ~10x faster type checking.
# This is a preview package - if it breaks:
# 1. Try updating: cd web && npm update @typescript/native-preview

.vscode/env.web_template.txt (new file)

@@ -0,0 +1,16 @@
# Copy this file to .env.web in the .vscode folder.
# Fill in the <REPLACE THIS> values as needed
# Web Server specific environment variables
# Minimal set needed for Next.js dev server
# Auth
AUTH_TYPE=basic
DEV_MODE=true
# Enable the full set of Danswer Enterprise Edition features.
# NOTE: DO NOT ENABLE THIS UNLESS YOU HAVE A PAID ENTERPRISE LICENSE (or if you
# are using this for local testing/development).
ENABLE_PAID_ENTERPRISE_EDITION_FEATURES=false
# Enable Onyx Craft
ENABLE_CRAFT=true


@@ -6,23 +6,17 @@
# processes.
# For local dev, often user Authentication is not needed.
AUTH_TYPE=disabled
AUTH_TYPE=basic
DEV_MODE=true
# Always keep these on for Dev.
# Logs model prompts, reasoning, and answer to stdout.
LOG_ONYX_MODEL_INTERACTIONS=True
LOG_ONYX_MODEL_INTERACTIONS=False
# More verbose logging
LOG_LEVEL=debug
# This passes top N results to LLM an additional time for reranking prior to
# answer generation.
# This step is quite heavy on token usage so we disable it for dev generally.
DISABLE_LLM_DOC_RELEVANCE=False
# Useful if you want to toggle auth on/off (google_oauth/OIDC specifically).
OAUTH_CLIENT_ID=<REPLACE THIS>
OAUTH_CLIENT_SECRET=<REPLACE THIS>
@@ -41,7 +35,6 @@ GEN_AI_API_KEY=<REPLACE THIS>
OPENAI_API_KEY=<REPLACE THIS>
# If answer quality isn't important for dev, use gpt-4o-mini since it's cheaper.
GEN_AI_MODEL_VERSION=gpt-4o
FAST_GEN_AI_MODEL_VERSION=gpt-4o
# Python stuff


@@ -1,5 +1,3 @@
/* Copy this file into '.vscode/launch.json' or merge its contents into your existing configurations. */
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
@@ -24,9 +22,10 @@
"Slack Bot",
"Celery primary",
"Celery light",
"Celery background",
"Celery heavy",
"Celery docfetching",
"Celery docprocessing",
"Celery user_file_processing",
"Celery beat"
],
"presentation": {
@@ -88,7 +87,7 @@
"request": "launch",
"cwd": "${workspaceRoot}/web",
"runtimeExecutable": "npm",
"envFile": "${workspaceFolder}/.vscode/.env",
"envFile": "${workspaceFolder}/.vscode/.env.web",
"runtimeArgs": ["run", "dev"],
"presentation": {
"group": "2"
@@ -123,7 +122,6 @@
"cwd": "${workspaceFolder}/backend",
"envFile": "${workspaceFolder}/.vscode/.env",
"env": {
"LOG_ONYX_MODEL_INTERACTIONS": "True",
"LOG_LEVEL": "DEBUG",
"PYTHONUNBUFFERED": "1"
},
@@ -151,6 +149,24 @@
},
"consoleTitle": "Slack Bot Console"
},
{
"name": "Discord Bot",
"consoleName": "Discord Bot",
"type": "debugpy",
"request": "launch",
"program": "onyx/onyxbot/discord/client.py",
"cwd": "${workspaceFolder}/backend",
"envFile": "${workspaceFolder}/.vscode/.env",
"env": {
"LOG_LEVEL": "DEBUG",
"PYTHONUNBUFFERED": "1",
"PYTHONPATH": "."
},
"presentation": {
"group": "2"
},
"consoleTitle": "Discord Bot Console"
},
{
"name": "MCP Server",
"consoleName": "MCP Server",
@@ -230,7 +246,7 @@
"--loglevel=INFO",
"--hostname=light@%n",
"-Q",
"vespa_metadata_sync,connector_deletion,doc_permissions_upsert,index_attempt_cleanup"
"vespa_metadata_sync,connector_deletion,doc_permissions_upsert,index_attempt_cleanup,opensearch_migration"
],
"presentation": {
"group": "2"
@@ -259,7 +275,7 @@
"--loglevel=INFO",
"--hostname=background@%n",
"-Q",
"vespa_metadata_sync,connector_deletion,doc_permissions_upsert,checkpoint_cleanup,index_attempt_cleanup,docprocessing,connector_doc_fetching,user_files_indexing,connector_pruning,connector_doc_permissions_sync,connector_external_group_sync,csv_generation,kg_processing,monitoring,user_file_processing,user_file_project_sync,user_file_delete"
"vespa_metadata_sync,connector_deletion,doc_permissions_upsert,checkpoint_cleanup,index_attempt_cleanup,docprocessing,connector_doc_fetching,user_files_indexing,connector_pruning,connector_doc_permissions_sync,connector_external_group_sync,csv_generation,kg_processing,monitoring,user_file_processing,user_file_project_sync,user_file_delete,opensearch_migration"
],
"presentation": {
"group": "2"
@@ -399,7 +415,6 @@
"onyx.background.celery.versioned_apps.docfetching",
"worker",
"--pool=threads",
"--concurrency=1",
"--prefetch-multiplier=1",
"--loglevel=INFO",
"--hostname=docfetching@%n",
@@ -430,7 +445,6 @@
"onyx.background.celery.versioned_apps.docprocessing",
"worker",
"--pool=threads",
"--concurrency=6",
"--prefetch-multiplier=1",
"--loglevel=INFO",
"--hostname=docprocessing@%n",
@@ -558,7 +572,6 @@
"cwd": "${workspaceFolder}/backend",
"envFile": "${workspaceFolder}/.vscode/.env",
"env": {
"LOG_ONYX_MODEL_INTERACTIONS": "True",
"LOG_LEVEL": "DEBUG",
"PYTHONUNBUFFERED": "1",
"PYTHONPATH": "."
@@ -579,6 +592,137 @@
"group": "3"
}
},
{
"name": "Build Sandbox Templates",
"type": "debugpy",
"request": "launch",
"module": "onyx.server.features.build.sandbox.build_templates",
"cwd": "${workspaceFolder}/backend",
"envFile": "${workspaceFolder}/.vscode/.env",
"env": {
"PYTHONUNBUFFERED": "1",
"PYTHONPATH": "."
},
"console": "integratedTerminal",
"presentation": {
"group": "3"
},
"consoleTitle": "Build Sandbox Templates"
},
{
// Dummy entry used to label the group
"name": "--- Database ---",
"type": "node",
"request": "launch",
"presentation": {
"group": "4",
"order": 0
}
},
{
"name": "Restore seeded database dump",
"type": "node",
"request": "launch",
"runtimeExecutable": "uv",
"runtimeArgs": [
"run",
"--with",
"onyx-devtools",
"ods",
"db",
"restore",
"--fetch-seeded",
"--yes"
],
"cwd": "${workspaceFolder}",
"console": "integratedTerminal",
"presentation": {
"group": "4"
}
},
{
"name": "Clean restore seeded database dump (destructive)",
"type": "node",
"request": "launch",
"runtimeExecutable": "uv",
"runtimeArgs": [
"run",
"--with",
"onyx-devtools",
"ods",
"db",
"restore",
"--fetch-seeded",
"--clean",
"--yes"
],
"cwd": "${workspaceFolder}",
"console": "integratedTerminal",
"presentation": {
"group": "4"
}
},
{
"name": "Create database snapshot",
"type": "node",
"request": "launch",
"runtimeExecutable": "uv",
"runtimeArgs": [
"run",
"--with",
"onyx-devtools",
"ods",
"db",
"dump",
"backup.dump"
],
"cwd": "${workspaceFolder}",
"console": "integratedTerminal",
"presentation": {
"group": "4"
}
},
{
"name": "Clean restore database snapshot (destructive)",
"type": "node",
"request": "launch",
"runtimeExecutable": "uv",
"runtimeArgs": [
"run",
"--with",
"onyx-devtools",
"ods",
"db",
"restore",
"--clean",
"--yes",
"backup.dump"
],
"cwd": "${workspaceFolder}",
"console": "integratedTerminal",
"presentation": {
"group": "4"
}
},
{
"name": "Upgrade database to head revision",
"type": "node",
"request": "launch",
"runtimeExecutable": "uv",
"runtimeArgs": [
"run",
"--with",
"onyx-devtools",
"ods",
"db",
"upgrade"
],
"cwd": "${workspaceFolder}",
"console": "integratedTerminal",
"presentation": {
"group": "4"
}
},
{
// script to generate the openapi schema
"name": "Onyx OpenAPI Schema Generator",


@@ -1,26 +1,25 @@
# CLAUDE.md
# PROJECT KNOWLEDGE BASE
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
This file provides guidance to AI agents when working with code in this repository.
## KEY NOTES
- If you run into any missing python dependency errors, try running your command with `source .venv/bin/activate` \
to assume the python venv.
- To make tests work, check the `.env` file at the root of the project to find an OpenAI key.
- If using `playwright` to explore the frontend, you can usually log in with username `a@example.com` and password
`a`. The app can be accessed at `http://localhost:3000`.
- You should assume that all Onyx services are running. To verify, you can check the `backend/log` directory to
make sure we see logs coming out from the relevant service.
- To connect to the Postgres database, use: `docker exec -it onyx-relational_db-1 psql -U postgres -c "<SQL>"`
- When making calls to the backend, always go through the frontend. E.g. make a call to `http://localhost:3000/api/persona` not `http://localhost:8080/api/persona`
- Put ALL db operations under the `backend/onyx/db` / `backend/ee/onyx/db` directories. Don't run queries
outside of those directories.
## Project Overview
**Onyx** (formerly Danswer) is an open-source Gen-AI and Enterprise Search platform that connects to company documents, apps, and people. It features a modular architecture with both Community Edition (MIT licensed) and Enterprise Edition offerings.
### Background Workers (Celery)
Onyx uses Celery for asynchronous task processing with multiple specialized workers:
@@ -92,6 +91,7 @@ Onyx uses Celery for asynchronous task processing with multiple specialized work
Onyx supports two deployment modes for background workers, controlled by the `USE_LIGHTWEIGHT_BACKGROUND_WORKER` environment variable:
**Lightweight Mode** (default, `USE_LIGHTWEIGHT_BACKGROUND_WORKER=true`):
- Runs a single consolidated `background` worker that handles all background tasks:
- Light worker tasks (Vespa operations, permissions sync, deletion)
- Document processing (indexing pipeline)
@@ -105,12 +105,14 @@ Onyx supports two deployment modes for background workers, controlled by the `US
- Default concurrency: 20 threads (increased to handle combined workload)
**Standard Mode** (`USE_LIGHTWEIGHT_BACKGROUND_WORKER=false`):
- Runs separate specialized workers as documented above (light, docprocessing, docfetching, heavy, kg_processing, monitoring, user_file_processing)
- Better isolation and scalability
- Can scale individual workers independently based on workload
- Suitable for production deployments with higher load
The deployment mode affects:
- **Backend**: Worker processes spawned by supervisord or dev scripts
- **Helm**: Which Kubernetes deployments are created
- **Dev Environment**: Which workers `dev_run_background_jobs.py` spawns
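As a rough illustration of the switch (the real selection logic lives in supervisord, the Helm charts, and `dev_run_background_jobs.py`; this sketch only mirrors the behavior described above):

```python
import os

# Hedged sketch: which worker set a launcher might spawn based on the flag.
# Worker names match the two modes documented above; everything else is illustrative.
LIGHTWEIGHT = (
    os.environ.get("USE_LIGHTWEIGHT_BACKGROUND_WORKER", "true").lower() == "true"
)

if LIGHTWEIGHT:
    workers = ["background"]  # one consolidated worker handles everything
else:
    workers = [
        "light",
        "docprocessing",
        "docfetching",
        "heavy",
        "kg_processing",
        "monitoring",
        "user_file_processing",
    ]
```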
@@ -119,18 +121,18 @@ The deployment mode affects:
- **Thread-based Workers**: All workers use thread pools (not processes) for stability
- **Tenant Awareness**: Multi-tenant support with per-tenant task isolation. There is a
middleware layer that automatically finds the appropriate tenant ID when sending tasks
via Celery Beat.
- **Task Prioritization**: High, Medium, Low priority queues
- **Monitoring**: Built-in heartbeat and liveness checking
- **Failure Handling**: Automatic retry and failure recovery mechanisms
- **Redis Coordination**: Inter-process communication via Redis
- **PostgreSQL State**: Task state and metadata stored in PostgreSQL
#### Important Notes
**Defining Tasks**:
- Always use `@shared_task` rather than `@celery_app`
- Put tasks under `background/celery/tasks/` or `ee/background/celery/tasks`
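A minimal sketch of a conforming task definition (the task name and body are hypothetical, not taken from the codebase):

```python
# Hypothetical task module that would live under backend/onyx/background/celery/tasks/.
from celery import shared_task


@shared_task(bind=True, max_retries=3)
def sync_document_permissions(self, document_id: str) -> None:
    """Illustrative body only; shows the @shared_task pattern, not real logic."""
    try:
        ...  # perform the actual permission sync here
    except Exception as exc:
        # bind=True makes `self` available for Celery's retry mechanism.
        raise self.retry(exc=exc, countdown=30)
```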
@@ -142,7 +144,12 @@ function.
If you make any updates to a celery worker and you want to test these changes, you will need
to ask me to restart the celery worker. There is no auto-restart on code-change mechanism.
**Task Time Limits**:
Since all tasks are executed in thread pools, the time limit features of Celery are silently
disabled and won't work. Timeout logic must be implemented within the task itself.
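Since the worker cannot enforce a limit for you, a task has to bound itself. One minimal pattern (an assumption, not taken from the codebase) is to push the work onto a helper thread and cap how long the task waits for it:

```python
# Hedged sketch: in-task timeout enforcement for thread-pool Celery workers.
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout


def run_with_deadline(fn, *args, timeout_s: float = 300.0):
    # Run fn in a helper thread and stop waiting after timeout_s seconds.
    # NOTE: the helper thread is not killed on timeout, so fn should also
    # check its own deadline and exit cooperatively.
    executor = ThreadPoolExecutor(max_workers=1)
    try:
        future = executor.submit(fn, *args)
        return future.result(timeout=timeout_s)
    except FutureTimeout:
        raise TimeoutError(f"task exceeded its {timeout_s}s budget")
    finally:
        executor.shutdown(wait=False)
```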
### Code Quality
```bash
# Install and run pre-commit hooks
pre-commit install
@@ -154,6 +161,7 @@ NOTE: Always make sure everything is strictly typed (both in Python and Typescri
## Architecture Overview
### Technology Stack
- **Backend**: Python 3.11, FastAPI, SQLAlchemy, Alembic, Celery
- **Frontend**: Next.js 15+, React 18, TypeScript, Tailwind CSS
- **Database**: PostgreSQL with Redis caching
@@ -435,6 +443,7 @@ function ContactForm() {
**Reason:** Our custom color system uses CSS variables that automatically handle dark mode and maintain design consistency across the app. Standard Tailwind colors bypass this system.
**Available color categories:**
- **Text:** `text-01` through `text-05`, `text-inverted-XX`
- **Backgrounds:** `background-neutral-XX`, `background-tint-XX` (and inverted variants)
- **Borders:** `border-01` through `border-05`, `border-inverted-XX`
@@ -467,6 +476,7 @@ function ContactForm() {
## Database & Migrations
### Running Migrations
```bash
# Standard migrations
alembic upgrade head
@@ -476,6 +486,7 @@ alembic -n schema_private upgrade head
```
### Creating Migrations
```bash
# Create migration
alembic revision -m "description"
@@ -488,13 +499,14 @@ Write the migration manually and place it in the file that alembic creates when
## Testing Strategy
First, you must activate the virtual environment with `source .venv/bin/activate`.
There are 4 main types of tests within Onyx:
### Unit Tests
These should not assume any Onyx/external services are available to be called.
Interactions with the outside world should be mocked using `unittest.mock`. Generally, only
write these for complex, isolated modules e.g. `citation_processing.py`.
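A minimal sketch of this mocking style (the `invoke` method and citation format below are illustrative, not a real Onyx interface):
```python
from unittest.mock import MagicMock


def test_citation_count_with_mocked_llm() -> None:
    # The LLM client is replaced with a mock so the test never touches the
    # outside world; `invoke` is an illustrative method name.
    mock_llm = MagicMock()
    mock_llm.invoke.return_value = "mocked answer [1]"

    answer = mock_llm.invoke("some prompt")
    assert answer.count("[1]") == 1
    mock_llm.invoke.assert_called_once()
```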
To run them:
@@ -504,13 +516,14 @@ pytest -xv backend/tests/unit
```
### External Dependency Unit Tests
These tests assume that all external dependencies of Onyx are available and callable (e.g. Postgres, Redis,
MinIO/S3, Vespa are running + OpenAI can be called + any request to the internet is fine + etc.).
However, the actual Onyx containers are not running; with these tests we call the function under test directly.
We can also mock components/calls at will.
The goal with these tests is to minimize mocking while giving some flexibility to mock things that are flaky,
need strictly controlled behavior, or need to have their internal behavior validated (e.g. verify a function is called
with certain args, something that would be impossible with proper integration tests).
@@ -523,15 +536,16 @@ python -m dotenv -f .vscode/.env run -- pytest backend/tests/external_dependency
```
### Integration Tests
Standard integration tests. Every test in `backend/tests/integration` runs against a real Onyx deployment. We cannot
mock anything in these tests. Prefer writing integration tests (or External Dependency Unit Tests if mocking/internal
verification is necessary) over any other type of test.
Tests are parallelized at a directory level.
When writing integration tests, make sure to check the root `conftest.py` for useful fixtures + the `backend/tests/integration/common_utils` directory for utilities. Prefer calling the appropriate Manager
class in the utils (if one exists) over directly calling the APIs with a library like `requests`. Prefer using fixtures rather than
calling the utilities directly (e.g. do NOT create admin users with
`admin_user = UserManager.create(name="admin_user")`, instead use the `admin_user` fixture).
A great example of this type of test is `backend/tests/integration/dev_apis/test_simple_chat_api.py`.
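A minimal sketch of the preferred shape (the assertion is trivial on purpose; the point is taking the fixture instead of creating users yourself):
```python
# Hedged sketch: take `admin_user` from the root conftest.py rather than
# calling UserManager.create(...) directly. The fixture's exact type is
# omitted here since it isn't shown in this doc.
def test_example_flow(admin_user) -> None:
    assert admin_user is not None
    # ... drive the scenario via a Manager class from common_utils ...
```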
@@ -543,8 +557,9 @@ python -m dotenv -f .vscode/.env run -- pytest backend/tests/integration
```
### Playwright (E2E) Tests
These tests are an even more complete version of the Integration Tests mentioned above. They run all services of Onyx,
_including_ the Web Server.
Use these tests for anything that requires significant frontend <-> backend coordination.
@@ -556,13 +571,11 @@ To run them:
npx playwright test <TEST_NAME>
```
## Logs
When (1) writing integration tests or (2) doing live tests (e.g. curl / playwright) you can get access
to logs via the `backend/log/<service_name>_debug.log` file. All Onyx services (api_server, web_server, celery_X)
will be tailing their logs to this file.
## Security Considerations
@@ -581,6 +594,7 @@ will be tailing their logs to this file.
- Custom prompts and agent actions
## Creating a Plan
When creating a plan in the `plans` directory, make sure to include at least these elements:
**Issues to Address**
@@ -593,10 +607,10 @@ Things you come across in your research that are important to the implementation
How you are going to make the changes happen. High level approach.
**Tests**
What unit (use rarely), external dependency unit, integration, and playwright tests you plan to write to
verify the correct behavior. Don't overtest. Usually, a given change only needs one type of test.
Do NOT include these: _Timeline_, _Rollback plan_
This is a minimal list - feel free to include more. Do NOT write code as part of your plan.
Keep it high level. You can reference certain files or functions though.
@@ -1,599 +0,0 @@
# AGENTS.md
This file provides guidance to AI agents when working with code in this repository.
## KEY NOTES
- If you run into any missing python dependency errors, try running your command with `source .venv/bin/activate` \
to activate the Python venv.
- To make tests work, check the `.env` file at the root of the project to find an OpenAI key.
- If using `playwright` to explore the frontend, you can usually log in with username `a@example.com` and password
`a`. The app can be accessed at `http://localhost:3000`.
- You should assume that all Onyx services are running. To verify, you can check the `backend/log` directory to
make sure we see logs coming out from the relevant service.
- To connect to the Postgres database, use: `docker exec -it onyx-relational_db-1 psql -U postgres -c "<SQL>"`
- When making calls to the backend, always go through the frontend. E.g. make a call to `http://localhost:3000/api/persona` not `http://localhost:8080/api/persona`
- Put ALL db operations under the `backend/onyx/db` / `backend/ee/onyx/db` directories. Don't run queries
outside of those directories.
## Project Overview
**Onyx** (formerly Danswer) is an open-source Gen-AI and Enterprise Search platform that connects to company documents, apps, and people. It features a modular architecture with both Community Edition (MIT licensed) and Enterprise Edition offerings.
### Background Workers (Celery)
Onyx uses Celery for asynchronous task processing with multiple specialized workers:
#### Worker Types
1. **Primary Worker** (`celery_app.py`)
- Coordinates core background tasks and system-wide operations
- Handles connector management, document sync, pruning, and periodic checks
- Runs with 4 threads concurrency
- Tasks: connector deletion, vespa sync, pruning, LLM model updates, user file sync
2. **Docfetching Worker** (`docfetching`)
- Fetches documents from external data sources (connectors)
- Spawns docprocessing tasks for each document batch
- Implements watchdog monitoring for stuck connectors
- Configurable concurrency (default from env)
3. **Docprocessing Worker** (`docprocessing`)
- Processes fetched documents through the indexing pipeline:
- Upserts documents to PostgreSQL
- Chunks documents and adds contextual information
- Embeds chunks via model server
- Writes chunks to Vespa vector database
- Updates document metadata
- Configurable concurrency (default from env)
4. **Light Worker** (`light`)
- Handles lightweight, fast operations
- Tasks: vespa operations, document permissions sync, external group sync
- Higher concurrency for quick tasks
5. **Heavy Worker** (`heavy`)
- Handles resource-intensive operations
- Primary task: document pruning operations
- Runs with 4 threads concurrency
6. **KG Processing Worker** (`kg_processing`)
- Handles Knowledge Graph processing and clustering
- Builds relationships between documents
- Runs clustering algorithms
- Configurable concurrency
7. **Monitoring Worker** (`monitoring`)
- System health monitoring and metrics collection
- Monitors Celery queues, process memory, and system status
- Single thread (monitoring doesn't need parallelism)
- Cloud-specific monitoring tasks
8. **User File Processing Worker** (`user_file_processing`)
- Processes user-uploaded files
- Handles user file indexing and project synchronization
- Configurable concurrency
9. **Beat Worker** (`beat`)
- Celery's scheduler for periodic tasks
- Uses DynamicTenantScheduler for multi-tenant support
- Schedules tasks like:
- Indexing checks (every 15 seconds)
- Connector deletion checks (every 20 seconds)
- Vespa sync checks (every 20 seconds)
- Pruning checks (every 20 seconds)
- KG processing (every 60 seconds)
- Monitoring tasks (every 5 minutes)
- Cleanup tasks (hourly)
#### Worker Deployment Modes
Onyx supports two deployment modes for background workers, controlled by the `USE_LIGHTWEIGHT_BACKGROUND_WORKER` environment variable:
**Lightweight Mode** (default, `USE_LIGHTWEIGHT_BACKGROUND_WORKER=true`):
- Runs a single consolidated `background` worker that handles all background tasks:
- Pruning operations (from `heavy` worker)
- Knowledge graph processing (from `kg_processing` worker)
- Monitoring tasks (from `monitoring` worker)
- User file processing (from `user_file_processing` worker)
- Lower resource footprint (single worker process)
- Suitable for smaller deployments or development environments
- Default concurrency: 6 threads
**Standard Mode** (`USE_LIGHTWEIGHT_BACKGROUND_WORKER=false`):
- Runs separate specialized workers as documented above (heavy, kg_processing, monitoring, user_file_processing)
- Better isolation and scalability
- Can scale individual workers independently based on workload
- Suitable for production deployments with higher load
The deployment mode affects:
- **Backend**: Worker processes spawned by supervisord or dev scripts
- **Helm**: Which Kubernetes deployments are created
- **Dev Environment**: Which workers `dev_run_background_jobs.py` spawns
#### Key Features
- **Thread-based Workers**: All workers use thread pools (not processes) for stability
- **Tenant Awareness**: Multi-tenant support with per-tenant task isolation. There is a
middleware layer that automatically finds the appropriate tenant ID when sending tasks
via Celery Beat.
- **Task Prioritization**: High, Medium, Low priority queues
- **Monitoring**: Built-in heartbeat and liveness checking
- **Failure Handling**: Automatic retry and failure recovery mechanisms
- **Redis Coordination**: Inter-process communication via Redis
- **PostgreSQL State**: Task state and metadata stored in PostgreSQL
#### Important Notes
**Defining Tasks**:
- Always use `@shared_task` rather than `@celery_app`
- Put tasks under `background/celery/tasks/` or `ee/background/celery/tasks`
**Defining APIs**:
When creating new FastAPI APIs, do NOT use the `response_model` field. Instead, just type the
function.
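For illustration, a minimal endpoint in this style (the router path and schema are hypothetical):
```python
from fastapi import APIRouter
from pydantic import BaseModel

router = APIRouter()


class ExampleSnapshot(BaseModel):  # hypothetical response schema
    id: int
    name: str


# Return type annotation only; FastAPI infers the response schema from it,
# so no response_model=... argument is needed.
@router.get("/example/{example_id}")
def get_example(example_id: int) -> ExampleSnapshot:
    return ExampleSnapshot(id=example_id, name="example")
```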
**Testing Updates**:
If you make any updates to a celery worker and you want to test these changes, you will need
to ask me to restart the celery worker. There is no auto-restart on code-change mechanism.
### Code Quality
```bash
# Install and run pre-commit hooks
pre-commit install
pre-commit run --all-files
```
NOTE: Always make sure everything is strictly typed (both in Python and Typescript).
## Architecture Overview
### Technology Stack
- **Backend**: Python 3.11, FastAPI, SQLAlchemy, Alembic, Celery
- **Frontend**: Next.js 15+, React 18, TypeScript, Tailwind CSS
- **Database**: PostgreSQL with Redis caching
- **Search**: Vespa vector database
- **Auth**: OAuth2, SAML, multi-provider support
- **AI/ML**: LangChain, LiteLLM, multiple embedding models
### Directory Structure
```
backend/
├── onyx/
│ ├── auth/ # Authentication & authorization
│ ├── chat/ # Chat functionality & LLM interactions
│ ├── connectors/ # Data source connectors
│ ├── db/ # Database models & operations
│ ├── document_index/ # Vespa integration
│ ├── federated_connectors/ # External search connectors
│ ├── llm/ # LLM provider integrations
│ └── server/ # API endpoints & routers
├── ee/ # Enterprise Edition features
├── alembic/ # Database migrations
└── tests/ # Test suites
web/
├── src/app/ # Next.js app router pages
├── src/components/ # Reusable React components
└── src/lib/ # Utilities & business logic
```
## Frontend Standards
### 1. Import Standards
**Always use absolute imports with the `@` prefix.**
**Reason:** Moving files around becomes easier since you don't also have to update those import statements. This makes modifications to the codebase much nicer.
```typescript
// ✅ Good
import { Button } from "@/components/ui/button";
import { useAuth } from "@/hooks/useAuth";
import { Text } from "@/refresh-components/texts/Text";
// ❌ Bad
import { Button } from "../../../components/ui/button";
import { useAuth } from "./hooks/useAuth";
```
### 2. React Component Functions
**Prefer regular functions over arrow functions for React components.**
**Reason:** Functions just become easier to read.
```typescript
// ✅ Good
function UserProfile({ userId }: UserProfileProps) {
return <div>User Profile</div>
}
// ❌ Bad
const UserProfile = ({ userId }: UserProfileProps) => {
return <div>User Profile</div>
}
```
### 3. Props Interface Extraction
**Extract prop types into their own interface definitions.**
**Reason:** Functions just become easier to read.
```typescript
// ✅ Good
interface UserCardProps {
user: User
showActions?: boolean
onEdit?: (userId: string) => void
}
function UserCard({ user, showActions = false, onEdit }: UserCardProps) {
return <div>User Card</div>
}
// ❌ Bad
function UserCard({
user,
showActions = false,
onEdit
}: {
user: User
showActions?: boolean
onEdit?: (userId: string) => void
}) {
return <div>User Card</div>
}
```
### 4. Spacing Guidelines
**Prefer padding over margins for spacing.**
**Reason:** We want to consolidate usage to paddings instead of margins.
```typescript
// ✅ Good
<div className="p-4 space-y-2">
<div className="p-2">Content</div>
</div>
// ❌ Bad
<div className="m-4 space-y-2">
<div className="m-2">Content</div>
</div>
```
### 5. Tailwind Dark Mode
**Strictly forbid using the `dark:` modifier in Tailwind classes, except for logo icon handling.**
**Reason:** The `colors.css` file already, VERY CAREFULLY, defines what the exact opposite colour of each light-mode colour is. Overriding this behaviour is VERY bad and will lead to horrible UI breakages.
**Exception:** The `createLogoIcon` helper in `web/src/components/icons/icons.tsx` uses `dark:` modifiers (`dark:invert`, `dark:hidden`, `dark:block`) to handle third-party logo icons that cannot automatically adapt through `colors.css`. This is the ONLY acceptable use of dark mode modifiers.
```typescript
// ✅ Good - Standard components use `web/tailwind-themes/tailwind.config.js` / `web/src/app/css/colors.css`
<div className="bg-background-neutral-03 text-text-02">
Content
</div>
// ✅ Good - Logo icons with dark mode handling via createLogoIcon
export const GithubIcon = createLogoIcon(githubLightIcon, {
monochromatic: true, // Will apply dark:invert internally
});
export const GitbookIcon = createLogoIcon(gitbookLightIcon, {
darkSrc: gitbookDarkIcon, // Will use dark:hidden/dark:block internally
});
// ❌ Bad - Manual dark mode overrides
<div className="bg-white dark:bg-black text-black dark:text-white">
Content
</div>
```
### 6. Class Name Utilities
**Use the `cn` utility instead of raw string formatting for classNames.**
**Reason:** `cn`s are easier to read. They also allow for more complex types (i.e., string-arrays) to get formatted properly (it flattens each element in that string array down). As a result, it can allow things such as conditionals (i.e., `myCondition && "some-tailwind-class"`, which evaluates to `false` when `myCondition` is `false`) to get filtered out.
```typescript
import { cn } from '@/lib/utils'
// ✅ Good
<div className={cn(
'base-class',
isActive && 'active-class',
className
)}>
Content
</div>
// ❌ Bad
<div className={`base-class ${isActive ? 'active-class' : ''} ${className}`}>
Content
</div>
```
### 7. Custom Hooks Organization
**Follow a "hook-per-file" layout. Each hook should live in its own file within `web/src/hooks`.**
**Reason:** This is just a layout preference. Keeps code clean.
```typescript
// web/src/hooks/useUserData.ts
export function useUserData(userId: string) {
// hook implementation
}
// web/src/hooks/useLocalStorage.ts
export function useLocalStorage<T>(key: string, initialValue: T) {
// hook implementation
}
```
### 8. Icon Usage
**ONLY use icons from the `web/src/icons` directory. Do NOT use icons from `react-icons`, `lucide`, or other external libraries.**
**Reason:** We have a very carefully curated selection of icons that match our Onyx guidelines. We do NOT want to muddy those up with different aesthetic stylings.
```typescript
// ✅ Good
import SvgX from "@/icons/x";
import SvgMoreHorizontal from "@/icons/more-horizontal";
// ❌ Bad
import { User } from "lucide-react";
import { FiSearch } from "react-icons/fi";
```
**Missing Icons**: If an icon is needed but doesn't exist in the `web/src/icons` directory, import it from Figma using the Figma MCP tool and add it to the icons directory.
If you need help with this step, reach out to `raunak@onyx.app`.
### 9. Text Rendering
**Prefer using the `refresh-components/texts/Text` component for all text rendering. Avoid "naked" text nodes.**
**Reason:** The `Text` component is fully compliant with the stylings provided in Figma. It provides easy utilities to specify the text-colour and font-size in the form of flags. Super duper easy.
```typescript
// ✅ Good
import { Text } from '@/refresh-components/texts/Text'
function UserCard({ name }: { name: string }) {
return (
// The `text03` flag renders the text in the 3rd-scale grey colour.
// The `mainAction` flag applies the "main-action" font, line-height, and weight described in the Figma.
<Text text03 mainAction>
{name}
</Text>
)
}
// ❌ Bad
function UserCard({ name }: { name: string }) {
return (
<div>
<h2>{name}</h2>
<p>User details</p>
</div>
)
}
```
### 10. Component Usage
**Heavily avoid raw HTML input components. Always use components from the `web/src/refresh-components` or `web/lib/opal/src` directory.**
**Reason:** We've put in a lot of effort to unify the components that are rendered in the Onyx app. Using raw components breaks the entire UI of the application, and leaves it in a muddier state than before.
```typescript
// ✅ Good
import Button from '@/refresh-components/buttons/Button'
import InputTypeIn from '@/refresh-components/inputs/InputTypeIn'
import SvgPlusCircle from '@/icons/plus-circle'
function ContactForm() {
return (
<form>
<InputTypeIn placeholder="Search..." />
<Button type="submit" leftIcon={SvgPlusCircle}>Submit</Button>
</form>
)
}
// ❌ Bad
function ContactForm() {
return (
<form>
<input placeholder="Name" />
<textarea placeholder="Message" />
<button type="submit">Submit</button>
</form>
)
}
```
### 11. Colors
**Always use custom overrides for colors and borders rather than built in Tailwind CSS colors. These overrides live in `web/tailwind-themes/tailwind.config.js`.**
**Reason:** Our custom color system uses CSS variables that automatically handle dark mode and maintain design consistency across the app. Standard Tailwind colors bypass this system.
**Available color categories:**
- **Text:** `text-01` through `text-05`, `text-inverted-XX`
- **Backgrounds:** `background-neutral-XX`, `background-tint-XX` (and inverted variants)
- **Borders:** `border-01` through `border-05`, `border-inverted-XX`
- **Actions:** `action-link-XX`, `action-danger-XX`
- **Status:** `status-info-XX`, `status-success-XX`, `status-warning-XX`, `status-error-XX`
- **Theme:** `theme-primary-XX`, `theme-red-XX`, `theme-blue-XX`, etc.
```typescript
// ✅ Good - Use custom Onyx color classes
<div className="bg-background-neutral-01 border border-border-02" />
<div className="bg-background-tint-02 border border-border-01" />
<div className="bg-status-success-01" />
<div className="bg-action-link-01" />
<div className="bg-theme-primary-05" />
// ❌ Bad - Do NOT use standard Tailwind colors
<div className="bg-gray-100 border border-gray-300 text-gray-600" />
<div className="bg-white border border-slate-200" />
<div className="bg-green-100 text-green-700" />
<div className="bg-blue-100 text-blue-600" />
<div className="bg-indigo-500" />
```
### 12. Data Fetching
**Prefer using `useSWR` for data fetching. Data should generally be fetched on the client side. Components that need data should display a loader / placeholder while waiting for that data. Prefer loading data within the component that needs it rather than at the top level and passing it down.**
**Reason:** Client side fetching allows us to load the skeleton of the page without waiting for data to load, leading to a snappier UX. Loading data where needed reduces dependencies between a component and its parent component(s).
## Database & Migrations
### Running Migrations
```bash
# Standard migrations
alembic upgrade head
# Multi-tenant (Enterprise)
alembic -n schema_private upgrade head
```
### Creating Migrations
```bash
# Create migration
alembic revision -m "description"
# Multi-tenant migration
alembic -n schema_private revision -m "description"
```
Write the migration manually and place it in the file that alembic creates when running the above command.
## Testing Strategy
There are 4 main types of tests within Onyx:
### Unit Tests
These should not assume any Onyx/external services are available to be called.
Interactions with the outside world should be mocked using `unittest.mock`. Generally, only
write these for complex, isolated modules e.g. `citation_processing.py`.
To run them:
```bash
python -m dotenv -f .vscode/.env run -- pytest -xv backend/tests/unit
```
### External Dependency Unit Tests
These tests assume that all external dependencies of Onyx are available and callable (e.g. Postgres, Redis,
MinIO/S3, Vespa are running + OpenAI can be called + any request to the internet is fine + etc.).
However, the actual Onyx containers are not running; with these tests we call the function under test directly.
We can also mock components/calls at will.
The goal with these tests is to minimize mocking while giving some flexibility to mock things that are flaky,
need strictly controlled behavior, or need to have their internal behavior validated (e.g. verify a function is called
with certain args, something that would be impossible with proper integration tests).
A great example of this type of test is `backend/tests/external_dependency_unit/connectors/confluence/test_confluence_group_sync.py`.
To run them:
```bash
python -m dotenv -f .vscode/.env run -- pytest backend/tests/external_dependency_unit
```
### Integration Tests
Standard integration tests. Every test in `backend/tests/integration` runs against a real Onyx deployment. We cannot
mock anything in these tests. Prefer writing integration tests (or External Dependency Unit Tests if mocking/internal
verification is necessary) over any other type of test.
Tests are parallelized at a directory level.
When writing integration tests, make sure to check the root `conftest.py` for useful fixtures + the `backend/tests/integration/common_utils` directory for utilities. Prefer calling the appropriate Manager
class in the utils (if one exists) over directly calling the APIs with a library like `requests`. Prefer using fixtures rather than
calling the utilities directly (e.g. do NOT create admin users with
`admin_user = UserManager.create(name="admin_user")`, instead use the `admin_user` fixture).
A great example of this type of test is `backend/tests/integration/dev_apis/test_simple_chat_api.py`.
To run them:
```bash
python -m dotenv -f .vscode/.env run -- pytest backend/tests/integration
```
### Playwright (E2E) Tests
These tests are an even more complete version of the Integration Tests mentioned above. They run all services of Onyx,
*including* the Web Server.
Use these tests for anything that requires significant frontend <-> backend coordination.
Tests are located at `web/tests/e2e`. Tests are written in TypeScript.
To run them:
```bash
npx playwright test <TEST_NAME>
```
## Logs
When (1) writing integration tests or (2) doing live tests (e.g. curl / playwright) you can get access
to logs via the `backend/log/<service_name>_debug.log` file. All Onyx services (api_server, web_server, celery_X)
will be tailing their logs to this file.
## Security Considerations
- Never commit API keys or secrets to repository
- Use encrypted credential storage for connector credentials
- Follow RBAC patterns for new features
- Implement proper input validation with Pydantic models
- Use parameterized queries to prevent SQL injection
## AI/LLM Integration
- Multiple LLM providers supported via LiteLLM
- Configurable models per feature (chat, search, embeddings)
- Streaming support for real-time responses
- Token management and rate limiting
- Custom prompts and agent actions
## Creating a Plan
When creating a plan in the `plans` directory, make sure to include at least these elements:
**Issues to Address**
What the change is meant to do.
**Important Notes**
Things you come across in your research that are important to the implementation.
**Implementation strategy**
How you are going to make the changes happen. High level approach.
**Tests**
What unit (use rarely), external dependency unit, integration, and playwright tests you plan to write to
verify the correct behavior. Don't overtest. Usually, a given change only needs one type of test.
Do NOT include these: *Timeline*, *Rollback plan*
This is a minimal list - feel free to include more. Do NOT write code as part of your plan.
Keep it high level. You can reference certain files or functions though.
Before writing your plan, make sure to do research. Explore the relevant sections in the codebase.
CLAUDE.md Symbolic link
@@ -0,0 +1 @@
AGENTS.md
@@ -1,262 +1,31 @@
<!-- ONYX_METADATA={"link": "https://github.com/onyx-dot-app/onyx/blob/main/CONTRIBUTING.md"} -->
# Contributing to Onyx
Hey there! We are so excited that you're interested in Onyx.
As an open source project in a rapidly changing space, we welcome all contributions.
## Contribution Opportunities
The [GitHub Issues](https://github.com/onyx-dot-app/onyx/issues) page is a great place to look for and share contribution ideas.
If you have your own feature that you would like to build, please create an issue and community members can provide feedback and
thumb it up if they feel a common need.
To ensure that your contribution is aligned with the project's direction, please reach out to any maintainer on the Onyx team
via [Discord](https://discord.gg/4NA5SbzrWb) or [email](mailto:hello@onyx.app).
## Contributing Code
Please reference the documents in the contributing_guides folder to ensure that the code base is kept to a high standard.
1. dev_setup.md (start here): gives you a guide to setting up a local development environment.
2. contribution_process.md: how to ensure you are building valuable features that will get reviewed and merged.
3. best_practices.md: before asking for reviews, ensure your changes meet the repo code quality standards.
Issues that have been explicitly approved by the maintainers (aligned with the direction of the project)
will be marked with the `approved by maintainers` label.
Issues marked `good first issue` are an especially great place to start.
**Connectors** to other tools are another great place to contribute. For details on how, refer to this
[README.md](https://github.com/onyx-dot-app/onyx/blob/main/backend/onyx/connectors/README.md).
If you have a new/different contribution in mind, we'd love to hear about it!
Your input is vital to making sure that Onyx moves in the right direction.
Before starting on implementation, please raise a GitHub issue.
Also, always feel free to message the founders (Chris Weaver / Yuhong Sun) on
[Discord](https://discord.gg/4NA5SbzrWb) directly about anything at all.
### Contributing Code
To contribute, please follow the
["fork and pull request"](https://docs.github.com/en/get-started/quickstart/contributing-to-projects) workflow.
When opening a pull request, mention related issues and feel free to tag relevant maintainers.
Before creating a pull request please make sure that the new changes conform to the formatting and linting requirements.
See the [Formatting and Linting](#formatting-and-linting) section for how to run these checks locally.
## Getting Help 🙋
We have support channels and generally interesting discussions on our [Discord](https://discord.gg/4NA5SbzrWb).
Our goal is to make contributing as easy as possible. If you run into any issues, please don't hesitate to reach out.
That way we can help future contributors and users avoid the same issue.
See you there!
## Get Started 🚀
Onyx, being a fully functional app, relies on some external software, specifically:
- [Postgres](https://www.postgresql.org/) (Relational DB)
- [Vespa](https://vespa.ai/) (Vector DB/Search Engine)
- [Redis](https://redis.io/) (Cache)
- [MinIO](https://min.io/) (File Store)
- [Nginx](https://nginx.org/) (Not needed for development flows generally)
> **Note:**
> This guide provides instructions to build and run Onyx locally from source with Docker containers providing the above external software. We believe this combination is easier for
> development purposes. If you prefer to use pre-built container images, we provide instructions on running the full Onyx stack within Docker below.
### Local Set Up
Be sure to use Python version 3.11. For instructions on installing Python 3.11 on macOS, refer to the [CONTRIBUTING_MACOS.md](./CONTRIBUTING_MACOS.md) readme.
If using a lower version, modifications will have to be made to the code.
If using a higher version, sometimes some libraries will not be available (e.g. we had problems with TensorFlow in the past with higher versions of Python).
#### Backend: Python requirements
Currently, we use [uv](https://docs.astral.sh/uv/) and recommend creating a [virtual environment](https://docs.astral.sh/uv/pip/environments/#using-a-virtual-environment).
For convenience here's a command for it:
```bash
uv venv .venv --python 3.11
source .venv/bin/activate
```
_For Windows, activate the virtual environment using Command Prompt:_
```bash
.venv\Scripts\activate
```
If using PowerShell, the command slightly differs:
```powershell
.venv\Scripts\Activate.ps1
```
Install the required python dependencies:
```bash
uv sync --all-extras
```
Install Playwright for Python (headless browser required by the Web Connector):
```bash
uv run playwright install
```
#### Frontend: Node dependencies
Onyx uses Node v22.20.0. We highly recommend you use [Node Version Manager (nvm)](https://github.com/nvm-sh/nvm)
to manage your Node installations. Once installed, you can run
```bash
nvm install 22 && nvm use 22
node -v # verify your active version
```
Navigate to `onyx/web` and run:
```bash
npm i
```
## Formatting and Linting
### Backend
For the backend, you'll need to setup pre-commit hooks (black / reorder-python-imports).
Then run:
```bash
uv run pre-commit install
```
Additionally, we use `mypy` for static type checking.
Onyx is fully type-annotated, and we want to keep it that way!
To run the mypy checks manually, run `uv run mypy .` from the `onyx/backend` directory.
### Web
We use `prettier` for formatting. The desired version will be installed via a `npm i` from the `onyx/web` directory.
To run the formatter, use `npx prettier --write .` from the `onyx/web` directory.
Pre-commit will also run prettier automatically on files you've recently touched. If re-formatted, your commit will fail.
Re-stage your changes and commit again.
# Running the application for development
## Developing using VSCode Debugger (recommended)
**We highly recommend using VSCode debugger for development.**
See [CONTRIBUTING_VSCODE.md](./CONTRIBUTING_VSCODE.md) for more details.
Otherwise, you can follow the instructions below to run the application for development.
## Manually running the application for development
### Docker containers for external software
You will need Docker installed to run these containers.
First navigate to `onyx/deployment/docker_compose`, then start up Postgres/Vespa/Redis/MinIO with:
```bash
docker compose -f docker-compose.yml -f docker-compose.dev.yml up -d index relational_db cache minio
```
(index refers to Vespa, relational_db refers to Postgres, and cache refers to Redis)
### Running Onyx locally
To start the frontend, navigate to `onyx/web` and run:
```bash
npm run dev
```
Next, start the model server which runs the local NLP models.
Navigate to `onyx/backend` and run:
```bash
uvicorn model_server.main:app --reload --port 9000
```
_For Windows (for compatibility with both PowerShell and Command Prompt):_
```bash
powershell -Command "uvicorn model_server.main:app --reload --port 9000"
```
The first time running Onyx, you will need to run the DB migrations for Postgres.
After the first time, this is no longer required unless the DB models change.
Navigate to `onyx/backend` and with the venv active, run:
```bash
alembic upgrade head
```
Next, start the task queue which orchestrates the background jobs.
Jobs that take more time are run async from the API server.
Still in `onyx/backend`, run:
```bash
python ./scripts/dev_run_background_jobs.py
```
To run the backend API server, navigate back to `onyx/backend` and run:
```bash
AUTH_TYPE=disabled uvicorn onyx.main:app --reload --port 8080
```
_For Windows (for compatibility with both PowerShell and Command Prompt):_
```bash
powershell -Command "
$env:AUTH_TYPE='disabled'
uvicorn onyx.main:app --reload --port 8080
"
```
> **Note:**
> If you need finer logging, add the additional environment variable `LOG_LEVEL=DEBUG` to the relevant services.
#### Wrapping up
You should now have 4 servers running:
- Web server
- Backend API
- Model server
- Background jobs
Now, visit `http://localhost:3000` in your browser. You should see the Onyx onboarding wizard where you can connect your external LLM provider to Onyx.
You've successfully set up a local Onyx instance! 🏁
#### Running the Onyx application in a container
You can run the full Onyx application stack from pre-built images including all external software dependencies.
Navigate to `onyx/deployment/docker_compose` and run:
```bash
docker compose up -d
```
After Docker pulls and starts these containers, navigate to `http://localhost:3000` to use Onyx.
If you want to make changes to Onyx and run those changes in Docker, you can also build a local version of the Onyx container images that incorporates your changes like so:
```bash
docker compose up -d --build
```
## Release Process
Onyx loosely follows the SemVer versioning standard.
Major changes are released with a "minor" version bump. Currently we use patch release versions to indicate small feature changes.
A set of Docker containers will be pushed automatically to DockerHub with every tag.
@@ -16,3 +16,8 @@ dist/
.coverage
htmlcov/
model_server/legacy/
# Craft: demo_data directory should be unzipped at container startup, not copied
**/demo_data/
# Craft: templates/outputs/venv is created at container startup
**/templates/outputs/venv
@@ -37,10 +37,6 @@ CVE-2023-50868
CVE-2023-52425
CVE-2024-28757
# sqlite, only used by NLTK library to grab word lemmatizer and stopwords
# No impact in our settings
CVE-2023-7104
# libharfbuzz0b, O(n^2) growth, worst case is denial of service
# Accept the risk
CVE-2023-25193
@@ -7,6 +7,10 @@ have a contract or agreement with DanswerAI, you are not permitted to use the En
Edition features outside of personal development or testing purposes. Please reach out to \
founders@onyx.app for more information. Please visit https://github.com/onyx-dot-app/onyx"
# Build argument for Craft support (disabled by default)
# Use --build-arg ENABLE_CRAFT=true to include Node.js and opencode CLI
ARG ENABLE_CRAFT=false
# DO_NOT_TRACK is used to disable telemetry for Unstructured
ENV DANSWER_RUNNING_IN_DOCKER="true" \
DO_NOT_TRACK="true" \
@@ -46,7 +50,23 @@ RUN apt-get update && \
rm -rf /var/lib/apt/lists/* && \
apt-get clean
# Conditionally install Node.js 20 for Craft (required for Next.js)
# Only installed when ENABLE_CRAFT=true
RUN if [ "$ENABLE_CRAFT" = "true" ]; then \
echo "Installing Node.js 20 for Craft support..." && \
curl -fsSL https://deb.nodesource.com/setup_20.x | bash - && \
apt-get install -y nodejs && \
rm -rf /var/lib/apt/lists/*; \
fi
# Conditionally install opencode CLI for Craft agent functionality
# Only installed when ENABLE_CRAFT=true
# TODO: download a specific, versioned release of the opencode CLI
RUN if [ "$ENABLE_CRAFT" = "true" ]; then \
echo "Installing opencode CLI for Craft support..." && \
curl -fsSL https://opencode.ai/install | bash; \
fi
ENV PATH="/root/.opencode/bin:${PATH}"
# Install Python dependencies
# Remove py which is pulled in by retry, py is not needed and is a CVE
@@ -91,8 +111,8 @@ Tokenizer.from_pretrained('nomic-ai/nomic-embed-text-v1')"
# Pre-downloading NLTK for setups with limited egress
RUN python -c "import nltk; \
nltk.download('stopwords', quiet=True); \
nltk.download('punkt_tab', quiet=True);"
# nltk.download('wordnet', quiet=True); introduce this back if lemmatization is needed
# Pre-downloading tiktoken for setups with limited egress
@@ -114,12 +134,26 @@ COPY --chown=onyx:onyx ./alembic_tenants /app/alembic_tenants
COPY --chown=onyx:onyx ./alembic.ini /app/alembic.ini
COPY supervisord.conf /usr/etc/supervisord.conf
COPY --chown=onyx:onyx ./static /app/static
COPY --chown=onyx:onyx ./keys /app/keys
# Escape hatch scripts
COPY --chown=onyx:onyx ./scripts/debugging /app/scripts/debugging
COPY --chown=onyx:onyx ./scripts/force_delete_connector_by_id.py /app/scripts/force_delete_connector_by_id.py
COPY --chown=onyx:onyx ./scripts/supervisord_entrypoint.sh /app/scripts/supervisord_entrypoint.sh
COPY --chown=onyx:onyx ./scripts/setup_craft_templates.sh /app/scripts/setup_craft_templates.sh
RUN chmod +x /app/scripts/supervisord_entrypoint.sh /app/scripts/setup_craft_templates.sh
# Run Craft template setup at build time when ENABLE_CRAFT=true
# This pre-bakes demo data, Python venv, and npm dependencies into the image
RUN if [ "$ENABLE_CRAFT" = "true" ]; then \
echo "Running Craft template setup at build time..." && \
ENABLE_CRAFT=true /app/scripts/setup_craft_templates.sh; \
fi
# Set Craft template paths to the in-image locations
# These match the paths where setup_craft_templates.sh creates the templates
ENV OUTPUTS_TEMPLATE_PATH=/app/onyx/server/features/build/sandbox/kubernetes/docker/templates/outputs
ENV VENV_TEMPLATE_PATH=/app/onyx/server/features/build/sandbox/kubernetes/docker/templates/venv
# Put logo in assets
COPY --chown=onyx:onyx ./assets /app/assets
@@ -48,6 +48,7 @@ WORKDIR /app
# Utils used by model server
COPY ./onyx/utils/logger.py /app/onyx/utils/logger.py
COPY ./onyx/utils/middleware.py /app/onyx/utils/middleware.py
COPY ./onyx/utils/tenant.py /app/onyx/utils/tenant.py
# Place to fetch version information
COPY ./onyx/__init__.py /app/onyx/__init__.py
@@ -57,7 +57,7 @@ if USE_IAM_AUTH:
def include_object(
object: SchemaItem, # noqa: ARG001
name: str | None,
type_: Literal[
"schema",
@@ -67,8 +67,8 @@ def include_object(
"unique_constraint",
"foreign_key_constraint",
],
reflected: bool, # noqa: ARG001
compare_to: SchemaItem | None, # noqa: ARG001
) -> bool:
if type_ == "table" and name in EXCLUDE_TABLES:
return False
@@ -225,7 +225,6 @@ def do_run_migrations(
) -> None:
if create_schema:
connection.execute(text(f'CREATE SCHEMA IF NOT EXISTS "{schema_name}"'))
connection.execute(text("COMMIT"))
connection.execute(text(f'SET search_path TO "{schema_name}"'))
@@ -245,7 +244,7 @@ def do_run_migrations(
def provide_iam_token_for_alembic(
dialect: Any, conn_rec: Any, cargs: Any, cparams: Any # noqa: ARG001
) -> None:
if USE_IAM_AUTH:
# Database connection settings
@@ -309,6 +308,7 @@ async def run_async_migrations() -> None:
schema_name=schema,
create_schema=create_schema,
)
await connection.commit()
except Exception as e:
logger.error(f"Error migrating schema {schema}: {e}")
if not continue_on_error:
@@ -346,6 +346,7 @@ async def run_async_migrations() -> None:
schema_name=schema,
create_schema=create_schema,
)
await connection.commit()
except Exception as e:
logger.error(f"Error migrating schema {schema}: {e}")
if not continue_on_error:
@@ -473,7 +474,7 @@ def run_migrations_online() -> None:
if connectable is not None:
# pytest-alembic is providing an engine - use it directly
logger.debug("run_migrations_online starting (pytest-alembic mode).")
# For pytest-alembic, we use the default schema (public)
schema_name = context.config.attributes.get(
@@ -0,0 +1,343 @@
#!/usr/bin/env python3
"""Parallel Alembic Migration Runner
Upgrades tenant schemas to head in batched, parallel alembic subprocesses.
Each subprocess handles a batch of schemas (via ``-x schemas=a,b,c``),
reducing per-process overhead compared to one-schema-per-process.
Usage examples::
# defaults: 6 workers, 50 schemas/batch
python alembic/run_multitenant_migrations.py
# custom settings
python alembic/run_multitenant_migrations.py -j 8 -b 100
"""
from __future__ import annotations
import argparse
import subprocess
import sys
import threading
import time
from concurrent.futures import ThreadPoolExecutor, as_completed
from typing import List, NamedTuple
from alembic.config import Config
from alembic.script import ScriptDirectory
from sqlalchemy import text
from onyx.db.engine.sql_engine import is_valid_schema_name
from onyx.db.engine.sql_engine import SqlEngine
from onyx.db.engine.tenant_utils import get_all_tenant_ids
from shared_configs.configs import TENANT_ID_PREFIX
# ---------------------------------------------------------------------------
# Data types
# ---------------------------------------------------------------------------
class Args(NamedTuple):
jobs: int
batch_size: int
class BatchResult(NamedTuple):
schemas: list[str]
success: bool
output: str
elapsed_sec: float
# ---------------------------------------------------------------------------
# Core functions
# ---------------------------------------------------------------------------
def run_alembic_for_batch(schemas: list[str]) -> BatchResult:
"""Run ``alembic upgrade head`` for a batch of schemas in one subprocess.
If the batch fails, it is automatically retried with ``-x continue=true``
so that the remaining schemas in the batch still get migrated. The retry
output (which contains alembic's per-schema error messages) is returned
for diagnosis.
"""
csv = ",".join(schemas)
base_cmd = ["alembic", "-x", f"schemas={csv}"]
start = time.monotonic()
result = subprocess.run(
[*base_cmd, "upgrade", "head"],
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
text=True,
)
if result.returncode == 0:
elapsed = time.monotonic() - start
return BatchResult(schemas, True, result.stdout or "", elapsed)
# At least one schema failed. Print the initial error output, then
# re-run with continue=true so the remaining schemas still get migrated.
if result.stdout:
print(f"Initial error output:\n{result.stdout}", file=sys.stderr, flush=True)
print(
f"Batch failed (exit {result.returncode}), retrying with 'continue=true'...",
file=sys.stderr,
flush=True,
)
retry = subprocess.run(
[*base_cmd, "-x", "continue=true", "upgrade", "head"],
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
text=True,
)
elapsed = time.monotonic() - start
return BatchResult(schemas, False, retry.stdout or "", elapsed)
def get_head_revision() -> str | None:
"""Get the head revision from the alembic script directory."""
alembic_cfg = Config("alembic.ini")
script = ScriptDirectory.from_config(alembic_cfg)
return script.get_current_head()
def get_schemas_needing_migration(
tenant_schemas: List[str], head_rev: str
) -> List[str]:
"""Return only schemas whose current alembic version is not at head."""
if not tenant_schemas:
return []
engine = SqlEngine.get_engine()
with engine.connect() as conn:
# Find which schemas actually have an alembic_version table
rows = conn.execute(
text(
"SELECT table_schema FROM information_schema.tables "
"WHERE table_name = 'alembic_version' "
"AND table_schema = ANY(:schemas)"
),
{"schemas": tenant_schemas},
)
schemas_with_table = set(row[0] for row in rows)
# Schemas without the table definitely need migration
needs_migration = [s for s in tenant_schemas if s not in schemas_with_table]
if not schemas_with_table:
return needs_migration
# Validate schema names before interpolating into SQL
for schema in schemas_with_table:
if not is_valid_schema_name(schema):
raise ValueError(f"Invalid schema name: {schema}")
# Single query to get every schema's current revision at once.
# Use integer tags instead of interpolating schema names into
# string literals to avoid quoting issues.
schema_list = list(schemas_with_table)
union_parts = [
f'SELECT {i} AS idx, version_num FROM "{schema}".alembic_version'
for i, schema in enumerate(schema_list)
]
rows = conn.execute(text(" UNION ALL ".join(union_parts)))
version_by_schema = {schema_list[row[0]]: row[1] for row in rows}
needs_migration.extend(
s for s in schemas_with_table if version_by_schema.get(s) != head_rev
)
return needs_migration
def run_migrations_parallel(
schemas: list[str],
max_workers: int,
batch_size: int,
) -> bool:
"""Chunk *schemas* into batches and run them in parallel.
A background monitor thread prints a status line every 60 s listing
which batches are still in-flight, making it easy to spot hung tenants.
"""
batches = [schemas[i : i + batch_size] for i in range(0, len(schemas), batch_size)]
total_batches = len(batches)
print(
f"{len(schemas)} schemas in {total_batches} batch(es) "
f"with {max_workers} workers (batch size: {batch_size})...",
flush=True,
)
all_success = True
# Thread-safe tracking of in-flight batches for the monitor thread.
in_flight: dict[int, list[str]] = {}
prev_in_flight: set[int] = set()
lock = threading.Lock()
stop_event = threading.Event()
def _monitor() -> None:
"""Print a status line every 60 s listing batches still in-flight.
Only prints batches that were also present in the previous tick,
making it easy to spot batches that are stuck.
"""
nonlocal prev_in_flight
while not stop_event.wait(60):
with lock:
if not in_flight:
prev_in_flight = set()
continue
current = set(in_flight)
stuck = current & prev_in_flight
prev_in_flight = current
if not stuck:
continue
schemas = [s for idx in sorted(stuck) for s in in_flight[idx]]
print(
f"⏳ batch(es) still running since last check "
f"({', '.join(str(i + 1) for i in sorted(stuck))}): "
+ ", ".join(schemas),
flush=True,
)
monitor_thread = threading.Thread(target=_monitor, daemon=True)
monitor_thread.start()
try:
with ThreadPoolExecutor(max_workers=max_workers) as executor:
def _run(batch_idx: int, batch: list[str]) -> BatchResult:
with lock:
in_flight[batch_idx] = batch
print(
f"Batch {batch_idx + 1}/{total_batches} started "
f"({len(batch)} schemas): {', '.join(batch)}",
flush=True,
)
result = run_alembic_for_batch(batch)
with lock:
in_flight.pop(batch_idx, None)
return result
future_to_idx = {
executor.submit(_run, i, b): i for i, b in enumerate(batches)
}
for future in as_completed(future_to_idx):
batch_idx = future_to_idx[future]
try:
result = future.result()
status = "" if result.success else ""
print(
f"Batch {batch_idx + 1}/{total_batches} "
f"{status} {len(result.schemas)} schemas "
f"in {result.elapsed_sec:.1f}s",
flush=True,
)
if not result.success:
# Print last 20 lines of retry output for diagnosis
tail = result.output.strip().splitlines()[-20:]
for line in tail:
print(f" {line}", flush=True)
all_success = False
except Exception as e:
print(
f"Batch {batch_idx + 1}/{total_batches} " f"✗ exception: {e}",
flush=True,
)
all_success = False
finally:
stop_event.set()
monitor_thread.join(timeout=2)
return all_success
# ---------------------------------------------------------------------------
# CLI
# ---------------------------------------------------------------------------
def parse_args() -> Args:
parser = argparse.ArgumentParser(
description="Run alembic migrations for all tenant schemas in parallel"
)
parser.add_argument(
"-j",
"--jobs",
type=int,
default=6,
metavar="N",
help="Number of parallel alembic processes (default: 6)",
)
parser.add_argument(
"-b",
"--batch-size",
type=int,
default=50,
metavar="N",
help="Schemas per alembic process (default: 50)",
)
args = parser.parse_args()
if args.jobs < 1:
parser.error("--jobs must be >= 1")
if args.batch_size < 1:
parser.error("--batch-size must be >= 1")
return Args(jobs=args.jobs, batch_size=args.batch_size)
def main() -> int:
args = parse_args()
head_rev = get_head_revision()
if head_rev is None:
print("Could not determine head revision.", file=sys.stderr)
return 1
with SqlEngine.scoped_engine(pool_size=5, max_overflow=2):
tenant_ids = get_all_tenant_ids()
tenant_schemas = [tid for tid in tenant_ids if tid.startswith(TENANT_ID_PREFIX)]
if not tenant_schemas:
print(
"No tenant schemas found. Is MULTI_TENANT=true set?",
file=sys.stderr,
)
return 1
schemas_to_migrate = get_schemas_needing_migration(tenant_schemas, head_rev)
if not schemas_to_migrate:
print(
f"All {len(tenant_schemas)} tenants are already at head "
f"revision ({head_rev})."
)
return 0
print(
f"{len(schemas_to_migrate)}/{len(tenant_schemas)} tenants need "
f"migration (head: {head_rev})."
)
success = run_migrations_parallel(
schemas_to_migrate,
max_workers=args.jobs,
batch_size=args.batch_size,
)
print(f"\n{'All migrations successful' if success else 'Some migrations failed'}")
return 0 if success else 1
if __name__ == "__main__":
raise SystemExit(main())
@@ -0,0 +1,112 @@
"""Populate flow mapping data
Revision ID: 01f8e6d95a33
Revises: d5c86e2c6dc6
Create Date: 2026-01-31 17:37:10.485558
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "01f8e6d95a33"
down_revision = "d5c86e2c6dc6"
branch_labels = None
depends_on = None
def upgrade() -> None:
# Add each model config to the conversation flow, setting the global default if it exists
# Exclude models that are part of ImageGenerationConfig
op.execute(
"""
INSERT INTO llm_model_flow (llm_model_flow_type, is_default, model_configuration_id)
SELECT
'CHAT' AS llm_model_flow_type,
COALESCE(
(lp.is_default_provider IS TRUE AND lp.default_model_name = mc.name),
FALSE
) AS is_default,
mc.id AS model_configuration_id
FROM model_configuration mc
LEFT JOIN llm_provider lp
ON lp.id = mc.llm_provider_id
WHERE NOT EXISTS (
SELECT 1 FROM image_generation_config igc
WHERE igc.model_configuration_id = mc.id
);
"""
)
# Add models with supports_image_input to the vision flow
op.execute(
"""
INSERT INTO llm_model_flow (llm_model_flow_type, is_default, model_configuration_id)
SELECT
'VISION' AS llm_model_flow_type,
COALESCE(
(lp.is_default_vision_provider IS TRUE AND lp.default_vision_model = mc.name),
FALSE
) AS is_default,
mc.id AS model_configuration_id
FROM model_configuration mc
LEFT JOIN llm_provider lp
ON lp.id = mc.llm_provider_id
WHERE mc.supports_image_input IS TRUE;
"""
)
def downgrade() -> None:
# Populate vision defaults from model_flow
op.execute(
"""
UPDATE llm_provider AS lp
SET
is_default_vision_provider = TRUE,
default_vision_model = mc.name
FROM llm_model_flow mf
JOIN model_configuration mc ON mc.id = mf.model_configuration_id
WHERE mf.llm_model_flow_type = 'VISION'
AND mf.is_default = TRUE
AND mc.llm_provider_id = lp.id;
"""
)
# Populate conversation defaults from model_flow
op.execute(
"""
UPDATE llm_provider AS lp
SET
is_default_provider = TRUE,
default_model_name = mc.name
FROM llm_model_flow mf
JOIN model_configuration mc ON mc.id = mf.model_configuration_id
WHERE mf.llm_model_flow_type = 'CHAT'
AND mf.is_default = TRUE
AND mc.llm_provider_id = lp.id;
"""
)
# For providers that have conversation flow mappings but aren't the default,
# we still need a default_model_name (it was NOT NULL originally)
# Pick the first visible model or any model for that provider
op.execute(
"""
UPDATE llm_provider AS lp
SET default_model_name = (
SELECT mc.name
FROM model_configuration mc
JOIN llm_model_flow mf ON mf.model_configuration_id = mc.id
WHERE mc.llm_provider_id = lp.id
AND mf.llm_model_flow_type = 'CHAT'
ORDER BY mc.is_visible DESC, mc.id ASC
LIMIT 1
)
WHERE lp.default_model_name IS NULL;
"""
)
# Delete all model_flow entries (reverse the inserts from upgrade)
op.execute("DELETE FROM llm_model_flow;")
@@ -0,0 +1,33 @@
"""add default_app_mode to user
Revision ID: 114a638452db
Revises: feead2911109
Create Date: 2026-02-09 18:57:08.274640
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "114a638452db"
down_revision = "feead2911109"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"user",
sa.Column(
"default_app_mode",
sa.String(),
nullable=False,
server_default="CHAT",
),
)
def downgrade() -> None:
op.drop_column("user", "default_app_mode")
@@ -11,7 +11,6 @@ import sqlalchemy as sa
from urllib.parse import urlparse, urlunparse
from httpx import HTTPStatusError
import httpx
from onyx.document_index.factory import get_default_document_index
from onyx.db.search_settings import SearchSettings
from onyx.document_index.vespa.shared_utils.utils import get_vespa_http_client
from onyx.document_index.vespa.shared_utils.utils import (
@@ -519,15 +518,11 @@ def delete_document_from_db(current_doc_id: str, index_name: str) -> None:
def upgrade() -> None:
if SKIP_CANON_DRIVE_IDS:
return
current_search_settings, _ = active_search_settings()
# Get the index name
if hasattr(current_search_settings, "index_name"):
index_name = current_search_settings.index_name
else:
# Default index name if we can't get it from the document_index
index_name = "danswer_index"
@@ -0,0 +1,27 @@
"""add_user_preferences
Revision ID: 175ea04c7087
Revises: d56ffa94ca32
Create Date: 2026-02-04 18:16:24.830873
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "175ea04c7087"
down_revision = "d56ffa94ca32"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"user",
sa.Column("user_preferences", sa.Text(), nullable=True),
)
def downgrade() -> None:
op.drop_column("user", "user_preferences")
@@ -10,8 +10,6 @@ from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision = "1f60f60c3401"
down_revision = "f17bf3b0d9f1"
@@ -66,7 +64,7 @@ def upgrade() -> None:
"num_rerank",
sa.Integer(),
nullable=False,
-        server_default=str(NUM_POSTPROCESSED_RESULTS),
+        server_default=str(20),
),
)


@@ -0,0 +1,351 @@
"""single onyx craft migration
Consolidates all buildmode/onyx craft tables into a single migration.
Tables created:
- build_session: User build sessions with status tracking
- sandbox: User-owned containerized environments (one per user)
- artifact: Build output files (web apps, documents, images)
- snapshot: Sandbox filesystem snapshots
- build_message: Conversation messages for build sessions
Existing table modified:
- connector_credential_pair: Added processing_mode column
Revision ID: 2020d417ec84
Revises: 41fa44bef321
Create Date: 2026-01-26 14:43:54.641405
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision = "2020d417ec84"
down_revision = "41fa44bef321"
branch_labels = None
depends_on = None
def upgrade() -> None:
# ==========================================================================
# ENUMS
# ==========================================================================
# Build session status enum
build_session_status_enum = sa.Enum(
"active",
"idle",
name="buildsessionstatus",
native_enum=False,
)
# Sandbox status enum
sandbox_status_enum = sa.Enum(
"provisioning",
"running",
"idle",
"sleeping",
"terminated",
"failed",
name="sandboxstatus",
native_enum=False,
)
# Artifact type enum
artifact_type_enum = sa.Enum(
"web_app",
"pptx",
"docx",
"markdown",
"excel",
"image",
name="artifacttype",
native_enum=False,
)
# ==========================================================================
# BUILD_SESSION TABLE
# ==========================================================================
op.create_table(
"build_session",
sa.Column("id", postgresql.UUID(as_uuid=True), primary_key=True),
sa.Column(
"user_id",
postgresql.UUID(as_uuid=True),
sa.ForeignKey("user.id", ondelete="CASCADE"),
nullable=True,
),
sa.Column("name", sa.String(), nullable=True),
sa.Column(
"status",
build_session_status_enum,
nullable=False,
server_default="active",
),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.text("now()"),
nullable=False,
),
sa.Column(
"last_activity_at",
sa.DateTime(timezone=True),
server_default=sa.text("now()"),
nullable=False,
),
sa.Column("nextjs_port", sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
"ix_build_session_user_created",
"build_session",
["user_id", sa.text("created_at DESC")],
unique=False,
)
op.create_index(
"ix_build_session_status",
"build_session",
["status"],
unique=False,
)
# ==========================================================================
# SANDBOX TABLE (user-owned, one per user)
# ==========================================================================
op.create_table(
"sandbox",
sa.Column("id", postgresql.UUID(as_uuid=True), primary_key=True),
sa.Column(
"user_id",
postgresql.UUID(as_uuid=True),
sa.ForeignKey("user.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("container_id", sa.String(), nullable=True),
sa.Column(
"status",
sandbox_status_enum,
nullable=False,
server_default="provisioning",
),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.text("now()"),
nullable=False,
),
sa.Column("last_heartbeat", sa.DateTime(timezone=True), nullable=True),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("user_id", name="sandbox_user_id_key"),
)
op.create_index(
"ix_sandbox_status",
"sandbox",
["status"],
unique=False,
)
op.create_index(
"ix_sandbox_container_id",
"sandbox",
["container_id"],
unique=False,
)
# ==========================================================================
# ARTIFACT TABLE
# ==========================================================================
op.create_table(
"artifact",
sa.Column("id", postgresql.UUID(as_uuid=True), primary_key=True),
sa.Column(
"session_id",
postgresql.UUID(as_uuid=True),
sa.ForeignKey("build_session.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("type", artifact_type_enum, nullable=False),
sa.Column("path", sa.String(), nullable=False),
sa.Column("name", sa.String(), nullable=False),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.text("now()"),
nullable=False,
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
server_default=sa.text("now()"),
nullable=False,
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
"ix_artifact_session_created",
"artifact",
["session_id", sa.text("created_at DESC")],
unique=False,
)
op.create_index(
"ix_artifact_type",
"artifact",
["type"],
unique=False,
)
# ==========================================================================
# SNAPSHOT TABLE
# ==========================================================================
op.create_table(
"snapshot",
sa.Column("id", postgresql.UUID(as_uuid=True), primary_key=True),
sa.Column(
"session_id",
postgresql.UUID(as_uuid=True),
sa.ForeignKey("build_session.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("storage_path", sa.String(), nullable=False),
sa.Column("size_bytes", sa.BigInteger(), nullable=False, server_default="0"),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.text("now()"),
nullable=False,
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
"ix_snapshot_session_created",
"snapshot",
["session_id", sa.text("created_at DESC")],
unique=False,
)
# ==========================================================================
# BUILD_MESSAGE TABLE
# ==========================================================================
op.create_table(
"build_message",
sa.Column("id", postgresql.UUID(as_uuid=True), primary_key=True),
sa.Column(
"session_id",
postgresql.UUID(as_uuid=True),
sa.ForeignKey("build_session.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column(
"turn_index",
sa.Integer(),
nullable=False,
),
sa.Column(
"type",
sa.Enum(
"SYSTEM",
"USER",
"ASSISTANT",
"DANSWER",
name="messagetype",
create_type=False,
native_enum=False,
),
nullable=False,
),
sa.Column(
"message_metadata",
postgresql.JSONB(),
nullable=False,
),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.text("now()"),
nullable=False,
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
"ix_build_message_session_turn",
"build_message",
["session_id", "turn_index", sa.text("created_at ASC")],
unique=False,
)
# ==========================================================================
# CONNECTOR_CREDENTIAL_PAIR MODIFICATION
# ==========================================================================
op.add_column(
"connector_credential_pair",
sa.Column(
"processing_mode",
sa.String(),
nullable=False,
server_default="regular",
),
)
def downgrade() -> None:
# ==========================================================================
# CONNECTOR_CREDENTIAL_PAIR MODIFICATION
# ==========================================================================
op.drop_column("connector_credential_pair", "processing_mode")
# ==========================================================================
# BUILD_MESSAGE TABLE
# ==========================================================================
op.drop_index("ix_build_message_session_turn", table_name="build_message")
op.drop_table("build_message")
# ==========================================================================
# SNAPSHOT TABLE
# ==========================================================================
op.drop_index("ix_snapshot_session_created", table_name="snapshot")
op.drop_table("snapshot")
# ==========================================================================
# ARTIFACT TABLE
# ==========================================================================
op.drop_index("ix_artifact_type", table_name="artifact")
op.drop_index("ix_artifact_session_created", table_name="artifact")
op.drop_table("artifact")
sa.Enum(name="artifacttype").drop(op.get_bind(), checkfirst=True)
# ==========================================================================
# SANDBOX TABLE
# ==========================================================================
op.drop_index("ix_sandbox_container_id", table_name="sandbox")
op.drop_index("ix_sandbox_status", table_name="sandbox")
op.drop_table("sandbox")
sa.Enum(name="sandboxstatus").drop(op.get_bind(), checkfirst=True)
# ==========================================================================
# BUILD_SESSION TABLE
# ==========================================================================
op.drop_index("ix_build_session_status", table_name="build_session")
op.drop_index("ix_build_session_user_created", table_name="build_session")
op.drop_table("build_session")
sa.Enum(name="buildsessionstatus").drop(op.get_bind(), checkfirst=True)


@@ -0,0 +1,42 @@
"""add_unique_constraint_to_inputprompt_prompt_user_id
Revision ID: 2c2430828bdf
Revises: fb80bdd256de
Create Date: 2026-01-20 16:01:54.314805
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "2c2430828bdf"
down_revision = "fb80bdd256de"
branch_labels = None
depends_on = None
def upgrade() -> None:
# Create unique constraint on (prompt, user_id) for user-owned prompts
# This ensures each user can only have one shortcut with a given name
op.create_unique_constraint(
"uq_inputprompt_prompt_user_id",
"inputprompt",
["prompt", "user_id"],
)
# Create partial unique index for public prompts (where user_id IS NULL)
# PostgreSQL unique constraints don't enforce uniqueness for NULL values,
# so we need a partial index to ensure public prompt names are also unique
op.execute(
"""
CREATE UNIQUE INDEX uq_inputprompt_prompt_public
ON inputprompt (prompt)
WHERE user_id IS NULL
"""
)
def downgrade() -> None:
op.execute("DROP INDEX IF EXISTS uq_inputprompt_prompt_public")
op.drop_constraint("uq_inputprompt_prompt_user_id", "inputprompt", type_="unique")


@@ -0,0 +1,29 @@
"""remove default prompt shortcuts
Revision ID: 41fa44bef321
Revises: 2c2430828bdf
Create Date: 2026-01-21
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "41fa44bef321"
down_revision = "2c2430828bdf"
branch_labels = None
depends_on = None
def upgrade() -> None:
# Delete any user associations for the default prompts first (foreign key constraint)
op.execute(
"DELETE FROM inputprompt__user WHERE input_prompt_id IN (SELECT id FROM inputprompt WHERE id < 0)"
)
# Delete the pre-seeded default prompt shortcuts (they have negative IDs)
op.execute("DELETE FROM inputprompt WHERE id < 0")
def downgrade() -> None:
# We don't restore the default prompts on downgrade
pass


@@ -85,103 +85,122 @@ class UserRow(NamedTuple):
def upgrade() -> None:
conn = op.get_bind()
# Start transaction
conn.execute(sa.text("BEGIN"))
# Step 1: Create or update the unified assistant (ID 0)
search_assistant = conn.execute(
sa.text("SELECT * FROM persona WHERE id = 0")
).fetchone()
try:
# Step 1: Create or update the unified assistant (ID 0)
search_assistant = conn.execute(
sa.text("SELECT * FROM persona WHERE id = 0")
).fetchone()
if search_assistant:
# Update existing Search assistant to be the unified assistant
conn.execute(
sa.text(
"""
UPDATE persona
SET name = :name,
description = :description,
system_prompt = :system_prompt,
num_chunks = :num_chunks,
is_default_persona = true,
is_visible = true,
deleted = false,
display_priority = :display_priority,
llm_filter_extraction = :llm_filter_extraction,
llm_relevance_filter = :llm_relevance_filter,
recency_bias = :recency_bias,
chunks_above = :chunks_above,
chunks_below = :chunks_below,
datetime_aware = :datetime_aware,
starter_messages = null
WHERE id = 0
"""
),
INSERT_DICT,
)
else:
# Create new unified assistant with ID 0
conn.execute(
sa.text(
"""
INSERT INTO persona (
id, name, description, system_prompt, num_chunks,
is_default_persona, is_visible, deleted, display_priority,
llm_filter_extraction, llm_relevance_filter, recency_bias,
chunks_above, chunks_below, datetime_aware, starter_messages,
builtin_persona
) VALUES (
0, :name, :description, :system_prompt, :num_chunks,
true, true, false, :display_priority, :llm_filter_extraction,
:llm_relevance_filter, :recency_bias, :chunks_above, :chunks_below,
:datetime_aware, null, true
)
"""
),
INSERT_DICT,
)
# Step 2: Mark ALL builtin assistants as deleted (except the unified assistant ID 0)
if search_assistant:
# Update existing Search assistant to be the unified assistant
conn.execute(
sa.text(
"""
UPDATE persona
SET deleted = true, is_visible = false, is_default_persona = false
WHERE builtin_persona = true AND id != 0
SET name = :name,
description = :description,
system_prompt = :system_prompt,
num_chunks = :num_chunks,
is_default_persona = true,
is_visible = true,
deleted = false,
display_priority = :display_priority,
llm_filter_extraction = :llm_filter_extraction,
llm_relevance_filter = :llm_relevance_filter,
recency_bias = :recency_bias,
chunks_above = :chunks_above,
chunks_below = :chunks_below,
datetime_aware = :datetime_aware,
starter_messages = null
WHERE id = 0
"""
)
),
INSERT_DICT,
)
else:
# Create new unified assistant with ID 0
conn.execute(
sa.text(
"""
INSERT INTO persona (
id, name, description, system_prompt, num_chunks,
is_default_persona, is_visible, deleted, display_priority,
llm_filter_extraction, llm_relevance_filter, recency_bias,
chunks_above, chunks_below, datetime_aware, starter_messages,
builtin_persona
) VALUES (
0, :name, :description, :system_prompt, :num_chunks,
true, true, false, :display_priority, :llm_filter_extraction,
:llm_relevance_filter, :recency_bias, :chunks_above, :chunks_below,
:datetime_aware, null, true
)
"""
),
INSERT_DICT,
)
# Step 3: Add all built-in tools to the unified assistant
# First, get the tool IDs for SearchTool, ImageGenerationTool, and WebSearchTool
search_tool = conn.execute(
sa.text("SELECT id FROM tool WHERE in_code_tool_id = 'SearchTool'")
).fetchone()
# Step 2: Mark ALL builtin assistants as deleted (except the unified assistant ID 0)
conn.execute(
sa.text(
"""
UPDATE persona
SET deleted = true, is_visible = false, is_default_persona = false
WHERE builtin_persona = true AND id != 0
"""
)
)
if not search_tool:
raise ValueError(
"SearchTool not found in database. Ensure tools migration has run first."
)
# Step 3: Add all built-in tools to the unified assistant
# First, get the tool IDs for SearchTool, ImageGenerationTool, and WebSearchTool
search_tool = conn.execute(
sa.text("SELECT id FROM tool WHERE in_code_tool_id = 'SearchTool'")
).fetchone()
image_gen_tool = conn.execute(
sa.text("SELECT id FROM tool WHERE in_code_tool_id = 'ImageGenerationTool'")
).fetchone()
if not search_tool:
raise ValueError(
"SearchTool not found in database. Ensure tools migration has run first."
)
if not image_gen_tool:
raise ValueError(
"ImageGenerationTool not found in database. Ensure tools migration has run first."
)
image_gen_tool = conn.execute(
sa.text("SELECT id FROM tool WHERE in_code_tool_id = 'ImageGenerationTool'")
).fetchone()
# WebSearchTool is optional - may not be configured
web_search_tool = conn.execute(
sa.text("SELECT id FROM tool WHERE in_code_tool_id = 'WebSearchTool'")
).fetchone()
if not image_gen_tool:
raise ValueError(
"ImageGenerationTool not found in database. Ensure tools migration has run first."
)
# Clear existing tool associations for persona 0
conn.execute(sa.text("DELETE FROM persona__tool WHERE persona_id = 0"))
# WebSearchTool is optional - may not be configured
web_search_tool = conn.execute(
sa.text("SELECT id FROM tool WHERE in_code_tool_id = 'WebSearchTool'")
).fetchone()
# Add tools to the unified assistant
# Clear existing tool associations for persona 0
conn.execute(sa.text("DELETE FROM persona__tool WHERE persona_id = 0"))
# Add tools to the unified assistant
conn.execute(
sa.text(
"""
INSERT INTO persona__tool (persona_id, tool_id)
VALUES (0, :tool_id)
ON CONFLICT DO NOTHING
"""
),
{"tool_id": search_tool[0]},
)
conn.execute(
sa.text(
"""
INSERT INTO persona__tool (persona_id, tool_id)
VALUES (0, :tool_id)
ON CONFLICT DO NOTHING
"""
),
{"tool_id": image_gen_tool[0]},
)
if web_search_tool:
conn.execute(
sa.text(
"""
@@ -190,191 +209,148 @@ def upgrade() -> None:
ON CONFLICT DO NOTHING
"""
),
{"tool_id": search_tool[0]},
{"tool_id": web_search_tool[0]},
)
conn.execute(
sa.text(
"""
INSERT INTO persona__tool (persona_id, tool_id)
VALUES (0, :tool_id)
ON CONFLICT DO NOTHING
# Step 4: Migrate existing chat sessions from all builtin assistants to unified assistant
conn.execute(
sa.text(
"""
),
{"tool_id": image_gen_tool[0]},
UPDATE chat_session
SET persona_id = 0
WHERE persona_id IN (
SELECT id FROM persona WHERE builtin_persona = true AND id != 0
)
"""
)
)
if web_search_tool:
# Step 5: Migrate user preferences - remove references to all builtin assistants
# First, get all builtin assistant IDs (except 0)
builtin_assistants_result = conn.execute(
sa.text(
"""
SELECT id FROM persona
WHERE builtin_persona = true AND id != 0
"""
)
).fetchall()
builtin_assistant_ids = [row[0] for row in builtin_assistants_result]
# Get all users with preferences
users_result = conn.execute(
sa.text(
"""
SELECT id, chosen_assistants, visible_assistants,
hidden_assistants, pinned_assistants
FROM "user"
"""
)
).fetchall()
for user_row in users_result:
user = UserRow(*user_row)
user_id: UUID = user.id
updates: dict[str, Any] = {}
# Remove all builtin assistants from chosen_assistants
if user.chosen_assistants:
new_chosen: list[int] = [
assistant_id
for assistant_id in user.chosen_assistants
if assistant_id not in builtin_assistant_ids
]
if new_chosen != user.chosen_assistants:
updates["chosen_assistants"] = json.dumps(new_chosen)
# Remove all builtin assistants from visible_assistants
if user.visible_assistants:
new_visible: list[int] = [
assistant_id
for assistant_id in user.visible_assistants
if assistant_id not in builtin_assistant_ids
]
if new_visible != user.visible_assistants:
updates["visible_assistants"] = json.dumps(new_visible)
# Add all builtin assistants to hidden_assistants
if user.hidden_assistants:
new_hidden: list[int] = list(user.hidden_assistants)
for old_id in builtin_assistant_ids:
if old_id not in new_hidden:
new_hidden.append(old_id)
if new_hidden != user.hidden_assistants:
updates["hidden_assistants"] = json.dumps(new_hidden)
else:
updates["hidden_assistants"] = json.dumps(builtin_assistant_ids)
# Remove all builtin assistants from pinned_assistants
if user.pinned_assistants:
new_pinned: list[int] = [
assistant_id
for assistant_id in user.pinned_assistants
if assistant_id not in builtin_assistant_ids
]
if new_pinned != user.pinned_assistants:
updates["pinned_assistants"] = json.dumps(new_pinned)
# Apply updates if any
if updates:
set_clause = ", ".join([f"{k} = :{k}" for k in updates.keys()])
updates["user_id"] = str(user_id) # Convert UUID to string for SQL
conn.execute(
sa.text(
"""
INSERT INTO persona__tool (persona_id, tool_id)
VALUES (0, :tool_id)
ON CONFLICT DO NOTHING
"""
),
{"tool_id": web_search_tool[0]},
sa.text(f'UPDATE "user" SET {set_clause} WHERE id = :user_id'),
updates,
)
# Step 4: Migrate existing chat sessions from all builtin assistants to unified assistant
conn.execute(
sa.text(
"""
UPDATE chat_session
SET persona_id = 0
WHERE persona_id IN (
SELECT id FROM persona WHERE builtin_persona = true AND id != 0
)
"""
)
)
# Step 5: Migrate user preferences - remove references to all builtin assistants
# First, get all builtin assistant IDs (except 0)
builtin_assistants_result = conn.execute(
sa.text(
"""
SELECT id FROM persona
WHERE builtin_persona = true AND id != 0
"""
)
).fetchall()
builtin_assistant_ids = [row[0] for row in builtin_assistants_result]
# Get all users with preferences
users_result = conn.execute(
sa.text(
"""
SELECT id, chosen_assistants, visible_assistants,
hidden_assistants, pinned_assistants
FROM "user"
"""
)
).fetchall()
for user_row in users_result:
user = UserRow(*user_row)
user_id: UUID = user.id
updates: dict[str, Any] = {}
# Remove all builtin assistants from chosen_assistants
if user.chosen_assistants:
new_chosen: list[int] = [
assistant_id
for assistant_id in user.chosen_assistants
if assistant_id not in builtin_assistant_ids
]
if new_chosen != user.chosen_assistants:
updates["chosen_assistants"] = json.dumps(new_chosen)
# Remove all builtin assistants from visible_assistants
if user.visible_assistants:
new_visible: list[int] = [
assistant_id
for assistant_id in user.visible_assistants
if assistant_id not in builtin_assistant_ids
]
if new_visible != user.visible_assistants:
updates["visible_assistants"] = json.dumps(new_visible)
# Add all builtin assistants to hidden_assistants
if user.hidden_assistants:
new_hidden: list[int] = list(user.hidden_assistants)
for old_id in builtin_assistant_ids:
if old_id not in new_hidden:
new_hidden.append(old_id)
if new_hidden != user.hidden_assistants:
updates["hidden_assistants"] = json.dumps(new_hidden)
else:
updates["hidden_assistants"] = json.dumps(builtin_assistant_ids)
# Remove all builtin assistants from pinned_assistants
if user.pinned_assistants:
new_pinned: list[int] = [
assistant_id
for assistant_id in user.pinned_assistants
if assistant_id not in builtin_assistant_ids
]
if new_pinned != user.pinned_assistants:
updates["pinned_assistants"] = json.dumps(new_pinned)
# Apply updates if any
if updates:
set_clause = ", ".join([f"{k} = :{k}" for k in updates.keys()])
updates["user_id"] = str(user_id) # Convert UUID to string for SQL
conn.execute(
sa.text(f'UPDATE "user" SET {set_clause} WHERE id = :user_id'),
updates,
)
# Commit transaction
conn.execute(sa.text("COMMIT"))
except Exception as e:
# Rollback on error
conn.execute(sa.text("ROLLBACK"))
raise e
def downgrade() -> None:
conn = op.get_bind()
# Start transaction
conn.execute(sa.text("BEGIN"))
try:
# Only restore General (ID -1) and Art (ID -3) assistants
# Step 1: Keep Search assistant (ID 0) as default but restore original state
conn.execute(
sa.text(
"""
UPDATE persona
SET is_default_persona = true,
is_visible = true,
deleted = false
WHERE id = 0
# Only restore General (ID -1) and Art (ID -3) assistants
# Step 1: Keep Search assistant (ID 0) as default but restore original state
conn.execute(
sa.text(
"""
)
UPDATE persona
SET is_default_persona = true,
is_visible = true,
deleted = false
WHERE id = 0
"""
)
)
# Step 2: Restore General assistant (ID -1)
conn.execute(
sa.text(
"""
UPDATE persona
SET deleted = false,
is_visible = true,
is_default_persona = true
WHERE id = :general_assistant_id
# Step 2: Restore General assistant (ID -1)
conn.execute(
sa.text(
"""
),
{"general_assistant_id": GENERAL_ASSISTANT_ID},
)
UPDATE persona
SET deleted = false,
is_visible = true,
is_default_persona = true
WHERE id = :general_assistant_id
"""
),
{"general_assistant_id": GENERAL_ASSISTANT_ID},
)
# Step 3: Restore Art assistant (ID -3)
conn.execute(
sa.text(
"""
UPDATE persona
SET deleted = false,
is_visible = true,
is_default_persona = true
WHERE id = :art_assistant_id
# Step 3: Restore Art assistant (ID -3)
conn.execute(
sa.text(
"""
),
{"art_assistant_id": ART_ASSISTANT_ID},
)
UPDATE persona
SET deleted = false,
is_visible = true,
is_default_persona = true
WHERE id = :art_assistant_id
"""
),
{"art_assistant_id": ART_ASSISTANT_ID},
)
# Note: We don't restore the original tool associations, names, or descriptions
# as those would require more complex logic to determine original state.
# We also cannot restore original chat session persona_ids as we don't
# have the original mappings.
# Other builtin assistants remain deleted as per the requirement.
# Commit transaction
conn.execute(sa.text("COMMIT"))
except Exception as e:
# Rollback on error
conn.execute(sa.text("ROLLBACK"))
raise e
# Note: We don't restore the original tool associations, names, or descriptions
# as those would require more complex logic to determine original state.
# We also cannot restore original chat session persona_ids as we don't
# have the original mappings.
# Other builtin assistants remain deleted as per the requirement.


@@ -0,0 +1,45 @@
"""make processing mode default all caps
Revision ID: 72aa7de2e5cf
Revises: 2020d417ec84
Create Date: 2026-01-26 18:58:47.705253
This migration fixes the ProcessingMode enum value mismatch:
- SQLAlchemy's Enum with native_enum=False uses enum member NAMES as valid values
- The original migration stored lowercase VALUES ('regular', 'file_system')
- This converts existing data to uppercase NAMES ('REGULAR', 'FILE_SYSTEM')
- Also drops any spurious native PostgreSQL enum type that may have been auto-created
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "72aa7de2e5cf"
down_revision = "2020d417ec84"
branch_labels = None
depends_on = None
def upgrade() -> None:
# Convert existing lowercase values to uppercase to match enum member names
op.execute(
"UPDATE connector_credential_pair SET processing_mode = 'REGULAR' "
"WHERE processing_mode = 'regular'"
)
op.execute(
"UPDATE connector_credential_pair SET processing_mode = 'FILE_SYSTEM' "
"WHERE processing_mode = 'file_system'"
)
# Update the server default to use uppercase
op.alter_column(
"connector_credential_pair",
"processing_mode",
server_default="REGULAR",
)
def downgrade() -> None:
# State prior to this was broken, so we don't want to revert back to it
pass
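The mismatch described in the docstring is easy to reproduce: when sa.Enum wraps a Python enum.Enum with native_enum=False, the generated VARCHAR plus CHECK constraint validates member names, not values. A small illustration (this ProcessingMode class is a stand-in for the real model definition):
import enum

import sqlalchemy as sa


class ProcessingMode(enum.Enum):
    REGULAR = "regular"
    FILE_SYSTEM = "file_system"


col_type = sa.Enum(ProcessingMode, native_enum=False)
# SQLAlchemy derives the allowed strings from the member NAMES, so rows
# stored as 'regular' can never be read back into ProcessingMode.
print(col_type.enums)  # ['REGULAR', 'FILE_SYSTEM']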


@@ -0,0 +1,47 @@
"""add_search_query_table
Revision ID: 73e9983e5091
Revises: d1b637d7050a
Create Date: 2026-01-14 14:16:52.837489
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision = "73e9983e5091"
down_revision = "d1b637d7050a"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"search_query",
sa.Column("id", postgresql.UUID(as_uuid=True), primary_key=True),
sa.Column(
"user_id",
postgresql.UUID(as_uuid=True),
sa.ForeignKey("user.id"),
nullable=False,
),
sa.Column("query", sa.String(), nullable=False),
sa.Column("query_expansions", postgresql.ARRAY(sa.String()), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
)
op.create_index("ix_search_query_user_id", "search_query", ["user_id"])
op.create_index("ix_search_query_created_at", "search_query", ["created_at"])
def downgrade() -> None:
op.drop_index("ix_search_query_created_at", table_name="search_query")
op.drop_index("ix_search_query_user_id", table_name="search_query")
op.drop_table("search_query")


@@ -10,8 +10,7 @@ from alembic import op
import sqlalchemy as sa
from onyx.db.models import IndexModelStatus
-from onyx.context.search.enums import RecencyBiasSetting
-from onyx.context.search.enums import SearchType
+from onyx.context.search.enums import RecencyBiasSetting, SearchType
# revision identifiers, used by Alembic.
revision = "776b3bbe9092"


@@ -0,0 +1,58 @@
"""remove reranking from search_settings
Revision ID: 78ebc66946a0
Revises: 849b21c732f8
Create Date: 2026-01-28
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "78ebc66946a0"
down_revision = "849b21c732f8"
branch_labels: None = None
depends_on: None = None
def upgrade() -> None:
op.drop_column("search_settings", "disable_rerank_for_streaming")
op.drop_column("search_settings", "rerank_model_name")
op.drop_column("search_settings", "rerank_provider_type")
op.drop_column("search_settings", "rerank_api_key")
op.drop_column("search_settings", "rerank_api_url")
op.drop_column("search_settings", "num_rerank")
def downgrade() -> None:
op.add_column(
"search_settings",
sa.Column(
"disable_rerank_for_streaming",
sa.Boolean(),
nullable=False,
server_default="false",
),
)
op.add_column(
"search_settings", sa.Column("rerank_model_name", sa.String(), nullable=True)
)
op.add_column(
"search_settings", sa.Column("rerank_provider_type", sa.String(), nullable=True)
)
op.add_column(
"search_settings", sa.Column("rerank_api_key", sa.String(), nullable=True)
)
op.add_column(
"search_settings", sa.Column("rerank_api_url", sa.String(), nullable=True)
)
op.add_column(
"search_settings",
sa.Column(
"num_rerank",
sa.Integer(),
nullable=False,
server_default=str(20),
),
)


@@ -0,0 +1,349 @@
"""hierarchy_nodes_v1
Revision ID: 81c22b1e2e78
Revises: 72aa7de2e5cf
Create Date: 2026-01-13 18:10:01.021451
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
from onyx.configs.constants import DocumentSource
# revision identifiers, used by Alembic.
revision = "81c22b1e2e78"
down_revision = "72aa7de2e5cf"
branch_labels = None
depends_on = None
# Human-readable display names for each source
SOURCE_DISPLAY_NAMES: dict[str, str] = {
"ingestion_api": "Ingestion API",
"slack": "Slack",
"web": "Web",
"google_drive": "Google Drive",
"gmail": "Gmail",
"requesttracker": "Request Tracker",
"github": "GitHub",
"gitbook": "GitBook",
"gitlab": "GitLab",
"guru": "Guru",
"bookstack": "BookStack",
"outline": "Outline",
"confluence": "Confluence",
"jira": "Jira",
"slab": "Slab",
"productboard": "Productboard",
"file": "File",
"coda": "Coda",
"notion": "Notion",
"zulip": "Zulip",
"linear": "Linear",
"hubspot": "HubSpot",
"document360": "Document360",
"gong": "Gong",
"google_sites": "Google Sites",
"zendesk": "Zendesk",
"loopio": "Loopio",
"dropbox": "Dropbox",
"sharepoint": "SharePoint",
"teams": "Teams",
"salesforce": "Salesforce",
"discourse": "Discourse",
"axero": "Axero",
"clickup": "ClickUp",
"mediawiki": "MediaWiki",
"wikipedia": "Wikipedia",
"asana": "Asana",
"s3": "S3",
"r2": "R2",
"google_cloud_storage": "Google Cloud Storage",
"oci_storage": "OCI Storage",
"xenforo": "XenForo",
"not_applicable": "Not Applicable",
"discord": "Discord",
"freshdesk": "Freshdesk",
"fireflies": "Fireflies",
"egnyte": "Egnyte",
"airtable": "Airtable",
"highspot": "Highspot",
"drupal_wiki": "Drupal Wiki",
"imap": "IMAP",
"bitbucket": "Bitbucket",
"testrail": "TestRail",
"mock_connector": "Mock Connector",
"user_file": "User File",
}
def upgrade() -> None:
# 1. Create hierarchy_node table
op.create_table(
"hierarchy_node",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("raw_node_id", sa.String(), nullable=False),
sa.Column("display_name", sa.String(), nullable=False),
sa.Column("link", sa.String(), nullable=True),
sa.Column("source", sa.String(), nullable=False),
sa.Column("node_type", sa.String(), nullable=False),
sa.Column("document_id", sa.String(), nullable=True),
sa.Column("parent_id", sa.Integer(), nullable=True),
# Permission fields - same pattern as Document table
sa.Column(
"external_user_emails",
postgresql.ARRAY(sa.String()),
nullable=True,
),
sa.Column(
"external_user_group_ids",
postgresql.ARRAY(sa.String()),
nullable=True,
),
sa.Column("is_public", sa.Boolean(), nullable=False, server_default="false"),
sa.PrimaryKeyConstraint("id"),
# When document is deleted, just unlink (node can exist without document)
sa.ForeignKeyConstraint(["document_id"], ["document.id"], ondelete="SET NULL"),
# When parent node is deleted, orphan children (cleanup via pruning)
sa.ForeignKeyConstraint(
["parent_id"], ["hierarchy_node.id"], ondelete="SET NULL"
),
sa.UniqueConstraint(
"raw_node_id", "source", name="uq_hierarchy_node_raw_id_source"
),
)
op.create_index("ix_hierarchy_node_parent_id", "hierarchy_node", ["parent_id"])
op.create_index(
"ix_hierarchy_node_source_type", "hierarchy_node", ["source", "node_type"]
)
# Add partial unique index to ensure only one SOURCE-type node per source
# This prevents duplicate source root nodes from being created
# NOTE: node_type stores enum NAME ('SOURCE'), not value ('source')
op.execute(
sa.text(
"""
CREATE UNIQUE INDEX uq_hierarchy_node_one_source_per_type
ON hierarchy_node (source)
WHERE node_type = 'SOURCE'
"""
)
)
# 2. Create hierarchy_fetch_attempt table
op.create_table(
"hierarchy_fetch_attempt",
sa.Column("id", postgresql.UUID(as_uuid=True), nullable=False),
sa.Column("connector_credential_pair_id", sa.Integer(), nullable=False),
sa.Column("status", sa.String(), nullable=False),
sa.Column("nodes_fetched", sa.Integer(), nullable=True, server_default="0"),
sa.Column("nodes_updated", sa.Integer(), nullable=True, server_default="0"),
sa.Column("error_msg", sa.Text(), nullable=True),
sa.Column("full_exception_trace", sa.Text(), nullable=True),
sa.Column(
"time_created",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.Column("time_started", sa.DateTime(timezone=True), nullable=True),
sa.Column(
"time_updated",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.PrimaryKeyConstraint("id"),
sa.ForeignKeyConstraint(
["connector_credential_pair_id"],
["connector_credential_pair.id"],
ondelete="CASCADE",
),
)
op.create_index(
"ix_hierarchy_fetch_attempt_status", "hierarchy_fetch_attempt", ["status"]
)
op.create_index(
"ix_hierarchy_fetch_attempt_time_created",
"hierarchy_fetch_attempt",
["time_created"],
)
op.create_index(
"ix_hierarchy_fetch_attempt_cc_pair",
"hierarchy_fetch_attempt",
["connector_credential_pair_id"],
)
# 3. Insert SOURCE-type hierarchy nodes for each DocumentSource
# We insert these so every existing document can have a parent hierarchy node
# NOTE: SQLAlchemy's Enum with native_enum=False stores the enum NAME (e.g., 'GOOGLE_DRIVE'),
# not the VALUE (e.g., 'google_drive'). We must use .name for source and node_type columns.
# SOURCE nodes are always public since they're just categorical roots.
for source in DocumentSource:
source_name = (
source.name
) # e.g., 'GOOGLE_DRIVE' - what SQLAlchemy stores/expects
source_value = source.value # e.g., 'google_drive' - the raw_node_id
display_name = SOURCE_DISPLAY_NAMES.get(
source_value, source_value.replace("_", " ").title()
)
op.execute(
sa.text(
"""
INSERT INTO hierarchy_node (raw_node_id, display_name, source, node_type, parent_id, is_public)
VALUES (:raw_node_id, :display_name, :source, 'SOURCE', NULL, true)
ON CONFLICT (raw_node_id, source) DO NOTHING
"""
).bindparams(
raw_node_id=source_value, # Use .value for raw_node_id (human-readable identifier)
display_name=display_name,
source=source_name, # Use .name for source column (SQLAlchemy enum storage)
)
)
# 4. Add parent_hierarchy_node_id column to document table
op.add_column(
"document",
sa.Column("parent_hierarchy_node_id", sa.Integer(), nullable=True),
)
# When hierarchy node is deleted, just unlink the document (SET NULL)
op.create_foreign_key(
"fk_document_parent_hierarchy_node",
"document",
"hierarchy_node",
["parent_hierarchy_node_id"],
["id"],
ondelete="SET NULL",
)
op.create_index(
"ix_document_parent_hierarchy_node_id",
"document",
["parent_hierarchy_node_id"],
)
# 5. Set all existing documents' parent_hierarchy_node_id to their source's SOURCE node
# For documents with multiple connectors, we pick one source deterministically (MIN connector_id)
# NOTE: Both connector.source and hierarchy_node.source store enum NAMEs (e.g., 'GOOGLE_DRIVE')
# because SQLAlchemy Enum(native_enum=False) uses the enum name for storage.
op.execute(
sa.text(
"""
UPDATE document d
SET parent_hierarchy_node_id = hn.id
FROM (
-- Get the source for each document (pick MIN connector_id for determinism)
SELECT DISTINCT ON (dbcc.id)
dbcc.id as doc_id,
c.source as source
FROM document_by_connector_credential_pair dbcc
JOIN connector c ON dbcc.connector_id = c.id
ORDER BY dbcc.id, dbcc.connector_id
) doc_source
JOIN hierarchy_node hn ON hn.source = doc_source.source AND hn.node_type = 'SOURCE'
WHERE d.id = doc_source.doc_id
"""
)
)
# Create the persona__hierarchy_node association table
op.create_table(
"persona__hierarchy_node",
sa.Column("persona_id", sa.Integer(), nullable=False),
sa.Column("hierarchy_node_id", sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(
["persona_id"],
["persona.id"],
ondelete="CASCADE",
),
sa.ForeignKeyConstraint(
["hierarchy_node_id"],
["hierarchy_node.id"],
ondelete="CASCADE",
),
sa.PrimaryKeyConstraint("persona_id", "hierarchy_node_id"),
)
# Add index for efficient lookups
op.create_index(
"ix_persona__hierarchy_node_hierarchy_node_id",
"persona__hierarchy_node",
["hierarchy_node_id"],
)
# Create the persona__document association table for attaching individual
# documents directly to assistants
op.create_table(
"persona__document",
sa.Column("persona_id", sa.Integer(), nullable=False),
sa.Column("document_id", sa.String(), nullable=False),
sa.ForeignKeyConstraint(
["persona_id"],
["persona.id"],
ondelete="CASCADE",
),
sa.ForeignKeyConstraint(
["document_id"],
["document.id"],
ondelete="CASCADE",
),
sa.PrimaryKeyConstraint("persona_id", "document_id"),
)
# Add index for efficient lookups by document_id
op.create_index(
"ix_persona__document_document_id",
"persona__document",
["document_id"],
)
# 6. Add last_time_hierarchy_fetch column to connector_credential_pair table
op.add_column(
"connector_credential_pair",
sa.Column(
"last_time_hierarchy_fetch", sa.DateTime(timezone=True), nullable=True
),
)
def downgrade() -> None:
# Remove last_time_hierarchy_fetch from connector_credential_pair
op.drop_column("connector_credential_pair", "last_time_hierarchy_fetch")
# Drop persona__document table
op.drop_index("ix_persona__document_document_id", table_name="persona__document")
op.drop_table("persona__document")
# Drop persona__hierarchy_node table
op.drop_index(
"ix_persona__hierarchy_node_hierarchy_node_id",
table_name="persona__hierarchy_node",
)
op.drop_table("persona__hierarchy_node")
# Remove parent_hierarchy_node_id from document
op.drop_index("ix_document_parent_hierarchy_node_id", table_name="document")
op.drop_constraint(
"fk_document_parent_hierarchy_node", "document", type_="foreignkey"
)
op.drop_column("document", "parent_hierarchy_node_id")
# Drop hierarchy_fetch_attempt table
op.drop_index(
"ix_hierarchy_fetch_attempt_cc_pair", table_name="hierarchy_fetch_attempt"
)
op.drop_index(
"ix_hierarchy_fetch_attempt_time_created", table_name="hierarchy_fetch_attempt"
)
op.drop_index(
"ix_hierarchy_fetch_attempt_status", table_name="hierarchy_fetch_attempt"
)
op.drop_table("hierarchy_fetch_attempt")
# Drop hierarchy_node table
op.drop_index("uq_hierarchy_node_one_source_per_type", table_name="hierarchy_node")
op.drop_index("ix_hierarchy_node_source_type", table_name="hierarchy_node")
op.drop_index("ix_hierarchy_node_parent_id", table_name="hierarchy_node")
op.drop_table("hierarchy_node")


@@ -0,0 +1,49 @@
"""notifications constraint, sort index, and cleanup old notifications
Revision ID: 8405ca81cc83
Revises: a3c1a7904cd0
Create Date: 2026-01-07 16:43:44.855156
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "8405ca81cc83"
down_revision = "a3c1a7904cd0"
branch_labels = None
depends_on = None
def upgrade() -> None:
# Create unique index for notification deduplication.
# This enables atomic ON CONFLICT DO NOTHING inserts in batch_create_notifications.
#
# Uses COALESCE to handle NULL additional_data (NULLs are normally distinct
# in unique constraints, but we want NULL == NULL for deduplication).
# The '{}' represents an empty JSONB object as the NULL replacement.
# Clean up legacy notifications first
op.execute("DELETE FROM notification WHERE title = 'New Notification'")
op.execute(
"""
CREATE UNIQUE INDEX IF NOT EXISTS ix_notification_user_type_data
ON notification (user_id, notif_type, COALESCE(additional_data, '{}'::jsonb))
"""
)
# Create index for efficient notification sorting by user
# Covers: WHERE user_id = ? ORDER BY dismissed, first_shown DESC
op.execute(
"""
CREATE INDEX IF NOT EXISTS ix_notification_user_sort
ON notification (user_id, dismissed, first_shown DESC)
"""
)
def downgrade() -> None:
op.execute("DROP INDEX IF EXISTS ix_notification_user_type_data")
op.execute("DROP INDEX IF EXISTS ix_notification_user_sort")


@@ -0,0 +1,32 @@
"""add demo_data_enabled to build_session
Revision ID: 849b21c732f8
Revises: 81c22b1e2e78
Create Date: 2026-01-28 10:00:00.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "849b21c732f8"
down_revision = "81c22b1e2e78"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"build_session",
sa.Column(
"demo_data_enabled",
sa.Boolean(),
nullable=False,
server_default=sa.text("true"),
),
)
def downgrade() -> None:
op.drop_column("build_session", "demo_data_enabled")


@@ -0,0 +1,116 @@
"""Add Discord bot tables
Revision ID: 8b5ce697290e
Revises: a1b2c3d4e5f7
Create Date: 2026-01-14
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "8b5ce697290e"
down_revision = "a1b2c3d4e5f7"
branch_labels: None = None
depends_on: None = None
def upgrade() -> None:
# DiscordBotConfig (singleton table - one per tenant)
op.create_table(
"discord_bot_config",
sa.Column(
"id",
sa.String(),
primary_key=True,
server_default=sa.text("'SINGLETON'"),
),
sa.Column("bot_token", sa.LargeBinary(), nullable=False), # EncryptedString
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.CheckConstraint("id = 'SINGLETON'", name="ck_discord_bot_config_singleton"),
)
# DiscordGuildConfig
op.create_table(
"discord_guild_config",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column("guild_id", sa.BigInteger(), nullable=True, unique=True),
sa.Column("guild_name", sa.String(), nullable=True),
sa.Column("registration_key", sa.String(), nullable=False, unique=True),
sa.Column("registered_at", sa.DateTime(timezone=True), nullable=True),
sa.Column(
"default_persona_id",
sa.Integer(),
sa.ForeignKey("persona.id", ondelete="SET NULL"),
nullable=True,
),
sa.Column(
"enabled", sa.Boolean(), server_default=sa.text("true"), nullable=False
),
)
# DiscordChannelConfig
op.create_table(
"discord_channel_config",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column(
"guild_config_id",
sa.Integer(),
sa.ForeignKey("discord_guild_config.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column("channel_id", sa.BigInteger(), nullable=False),
sa.Column("channel_name", sa.String(), nullable=False),
sa.Column(
"channel_type",
sa.String(20),
server_default=sa.text("'text'"),
nullable=False,
),
sa.Column(
"is_private",
sa.Boolean(),
server_default=sa.text("false"),
nullable=False,
),
sa.Column(
"thread_only_mode",
sa.Boolean(),
server_default=sa.text("false"),
nullable=False,
),
sa.Column(
"require_bot_invocation",
sa.Boolean(),
server_default=sa.text("true"),
nullable=False,
),
sa.Column(
"persona_override_id",
sa.Integer(),
sa.ForeignKey("persona.id", ondelete="SET NULL"),
nullable=True,
),
sa.Column(
"enabled", sa.Boolean(), server_default=sa.text("false"), nullable=False
),
)
# Unique constraint: one config per channel per guild
op.create_unique_constraint(
"uq_discord_channel_guild_channel",
"discord_channel_config",
["guild_config_id", "channel_id"],
)
def downgrade() -> None:
op.drop_table("discord_channel_config")
op.drop_table("discord_guild_config")
op.drop_table("discord_bot_config")


@@ -0,0 +1,36 @@
"""add_chat_compression_fields
Revision ID: 90b409d06e50
Revises: f220515df7b4
Create Date: 2026-01-26 09:13:09.635427
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "90b409d06e50"
down_revision = "f220515df7b4"
branch_labels = None
depends_on = None
def upgrade() -> None:
# Add last_summarized_message_id to chat_message
# This field marks a message as a summary and indicates the last message it covers.
# Summaries are branch-aware via their parent_message_id pointing to the branch.
op.add_column(
"chat_message",
sa.Column(
"last_summarized_message_id",
sa.Integer(),
sa.ForeignKey("chat_message.id", ondelete="SET NULL"),
nullable=True,
),
)
def downgrade() -> None:
op.drop_column("chat_message", "last_summarized_message_id")


@@ -16,7 +16,6 @@ from typing import Generator
from alembic import op
import sqlalchemy as sa
-from onyx.document_index.factory import get_default_document_index
from onyx.document_index.vespa_constants import DOCUMENT_ID_ENDPOINT
from onyx.db.search_settings import SearchSettings
from onyx.configs.app_configs import AUTH_TYPE
@@ -126,14 +125,11 @@ def remove_old_tags() -> None:
the document got reindexed, the old tag would not be removed.
    This function removes those old tags by comparing them against the tags in Vespa.
"""
-    current_search_settings, future_search_settings = active_search_settings()
-    document_index = get_default_document_index(
-        current_search_settings, future_search_settings
-    )
+    current_search_settings, _ = active_search_settings()
    # Get the index name
-    if hasattr(document_index, "index_name"):
-        index_name = document_index.index_name
+    if hasattr(current_search_settings, "index_name"):
+        index_name = current_search_settings.index_name
else:
# Default index name if we can't get it from the document_index
index_name = "danswer_index"


@@ -0,0 +1,43 @@
"""add chunk error and vespa count columns to opensearch tenant migration
Revision ID: 93c15d6a6fbb
Revises: d3fd499c829c
Create Date: 2026-02-11 23:07:34.576725
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "93c15d6a6fbb"
down_revision = "d3fd499c829c"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"opensearch_tenant_migration_record",
sa.Column(
"total_chunks_errored",
sa.Integer(),
nullable=False,
server_default="0",
),
)
op.add_column(
"opensearch_tenant_migration_record",
sa.Column(
"total_chunks_in_vespa",
sa.Integer(),
nullable=False,
server_default="0",
),
)
def downgrade() -> None:
op.drop_column("opensearch_tenant_migration_record", "total_chunks_in_vespa")
op.drop_column("opensearch_tenant_migration_record", "total_chunks_errored")


@@ -0,0 +1,27 @@
"""add processing_duration_seconds to chat_message
Revision ID: 9d1543a37106
Revises: cbc03e08d0f3
Create Date: 2026-01-21 11:42:18.546188
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "9d1543a37106"
down_revision = "cbc03e08d0f3"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"chat_message",
sa.Column("processing_duration_seconds", sa.Float(), nullable=True),
)
def downgrade() -> None:
op.drop_column("chat_message", "processing_duration_seconds")


@@ -42,20 +42,13 @@ TOOL_DESCRIPTIONS = {
def upgrade() -> None:
conn = op.get_bind()
conn.execute(sa.text("BEGIN"))
try:
for tool_id, description in TOOL_DESCRIPTIONS.items():
conn.execute(
sa.text(
"UPDATE tool SET description = :description WHERE in_code_tool_id = :tool_id"
),
{"description": description, "tool_id": tool_id},
)
conn.execute(sa.text("COMMIT"))
except Exception as e:
conn.execute(sa.text("ROLLBACK"))
raise e
for tool_id, description in TOOL_DESCRIPTIONS.items():
conn.execute(
sa.text(
"UPDATE tool SET description = :description WHERE in_code_tool_id = :tool_id"
),
{"description": description, "tool_id": tool_id},
)
def downgrade() -> None:


@@ -0,0 +1,47 @@
"""drop agent_search_metrics table
Revision ID: a1b2c3d4e5f7
Revises: 73e9983e5091
Create Date: 2026-01-17
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
# revision identifiers, used by Alembic.
revision = "a1b2c3d4e5f7"
down_revision = "73e9983e5091"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.drop_table("agent__search_metrics")
def downgrade() -> None:
op.create_table(
"agent__search_metrics",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("user_id", sa.UUID(), nullable=True),
sa.Column("persona_id", sa.Integer(), nullable=True),
sa.Column("agent_type", sa.String(), nullable=False),
sa.Column("start_time", sa.DateTime(timezone=True), nullable=False),
sa.Column("base_duration_s", sa.Float(), nullable=False),
sa.Column("full_duration_s", sa.Float(), nullable=False),
sa.Column("base_metrics", postgresql.JSONB(), nullable=True),
sa.Column("refined_metrics", postgresql.JSONB(), nullable=True),
sa.Column("all_metrics", postgresql.JSONB(), nullable=True),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
ondelete="CASCADE",
),
sa.ForeignKeyConstraint(
["persona_id"],
["persona.id"],
),
sa.PrimaryKeyConstraint("id"),
)


@@ -0,0 +1,81 @@
"""seed_memory_tool and add enable_memory_tool to user
Revision ID: b51c6844d1df
Revises: 93c15d6a6fbb
Create Date: 2026-02-11 00:00:00.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "b51c6844d1df"
down_revision = "93c15d6a6fbb"
branch_labels = None
depends_on = None
MEMORY_TOOL = {
"name": "MemoryTool",
"display_name": "Add Memory",
"description": "Save memories about the user for future conversations.",
"in_code_tool_id": "MemoryTool",
"enabled": True,
}
def upgrade() -> None:
conn = op.get_bind()
existing = conn.execute(
sa.text(
"SELECT in_code_tool_id FROM tool WHERE in_code_tool_id = :in_code_tool_id"
),
{"in_code_tool_id": MEMORY_TOOL["in_code_tool_id"]},
).fetchone()
if existing:
conn.execute(
sa.text(
"""
UPDATE tool
SET name = :name,
display_name = :display_name,
description = :description
WHERE in_code_tool_id = :in_code_tool_id
"""
),
MEMORY_TOOL,
)
else:
conn.execute(
sa.text(
"""
INSERT INTO tool (name, display_name, description, in_code_tool_id, enabled)
VALUES (:name, :display_name, :description, :in_code_tool_id, :enabled)
"""
),
MEMORY_TOOL,
)
op.add_column(
"user",
sa.Column(
"enable_memory_tool",
sa.Boolean(),
nullable=False,
server_default=sa.true(),
),
)
def downgrade() -> None:
op.drop_column("user", "enable_memory_tool")
conn = op.get_bind()
conn.execute(
sa.text("DELETE FROM tool WHERE in_code_tool_id = :in_code_tool_id"),
{"in_code_tool_id": MEMORY_TOOL["in_code_tool_id"]},
)


@@ -0,0 +1,40 @@
"""Persona new default model configuration id column
Revision ID: be87a654d5af
Revises: e7f8a9b0c1d2
Create Date: 2026-01-30 11:14:17.306275
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "be87a654d5af"
down_revision = "e7f8a9b0c1d2"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"persona",
sa.Column("default_model_configuration_id", sa.Integer(), nullable=True),
)
op.create_foreign_key(
"fk_persona_default_model_configuration_id",
"persona",
"model_configuration",
["default_model_configuration_id"],
["id"],
ondelete="SET NULL",
)
def downgrade() -> None:
op.drop_constraint(
"fk_persona_default_model_configuration_id", "persona", type_="foreignkey"
)
op.drop_column("persona", "default_model_configuration_id")


@@ -7,7 +7,6 @@ Create Date: 2025-12-18 16:00:00.000000
"""
from alembic import op
-from onyx.deep_research.dr_mock_tools import RESEARCH_AGENT_DB_NAME
import sqlalchemy as sa
@@ -19,7 +18,7 @@ depends_on = None
DEEP_RESEARCH_TOOL = {
"name": RESEARCH_AGENT_DB_NAME,
"name": "ResearchAgent",
"display_name": "Research Agent",
"description": "The Research Agent is a sub-agent that conducts research on a specific topic.",
"in_code_tool_id": "ResearchAgent",


@@ -0,0 +1,128 @@
"""add_opensearch_migration_tables
Revision ID: cbc03e08d0f3
Revises: be87a654d5af
Create Date: 2026-01-31 17:00:45.176604
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "cbc03e08d0f3"
down_revision = "be87a654d5af"
branch_labels = None
depends_on = None
def upgrade() -> None:
# 1. Create opensearch_document_migration_record table.
op.create_table(
"opensearch_document_migration_record",
sa.Column("document_id", sa.String(), nullable=False),
sa.Column("status", sa.String(), nullable=False, server_default="pending"),
sa.Column("error_message", sa.Text(), nullable=True),
sa.Column("attempts_count", sa.Integer(), nullable=False, server_default="0"),
sa.Column("last_attempt_at", sa.DateTime(timezone=True), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.PrimaryKeyConstraint("document_id"),
sa.ForeignKeyConstraint(
["document_id"],
["document.id"],
ondelete="CASCADE",
),
)
# 2. Create indices.
op.create_index(
"ix_opensearch_document_migration_record_status",
"opensearch_document_migration_record",
["status"],
)
op.create_index(
"ix_opensearch_document_migration_record_attempts_count",
"opensearch_document_migration_record",
["attempts_count"],
)
op.create_index(
"ix_opensearch_document_migration_record_created_at",
"opensearch_document_migration_record",
["created_at"],
)
# 3. Create opensearch_tenant_migration_record table (singleton).
op.create_table(
"opensearch_tenant_migration_record",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column(
"document_migration_record_table_population_status",
sa.String(),
nullable=False,
server_default="pending",
),
sa.Column(
"num_times_observed_no_additional_docs_to_populate_migration_table",
sa.Integer(),
nullable=False,
server_default="0",
),
sa.Column(
"overall_document_migration_status",
sa.String(),
nullable=False,
server_default="pending",
),
sa.Column(
"num_times_observed_no_additional_docs_to_migrate",
sa.Integer(),
nullable=False,
server_default="0",
),
sa.Column(
"last_updated_at",
sa.DateTime(timezone=True),
server_default=sa.func.now(),
nullable=False,
),
sa.PrimaryKeyConstraint("id"),
)
# 4. Create unique index on constant to enforce singleton pattern.
op.execute(
sa.text(
"""
CREATE UNIQUE INDEX idx_opensearch_tenant_migration_singleton
ON opensearch_tenant_migration_record ((true))
"""
)
)
def downgrade() -> None:
# Drop opensearch_tenant_migration_record.
op.drop_index(
"idx_opensearch_tenant_migration_singleton",
table_name="opensearch_tenant_migration_record",
)
op.drop_table("opensearch_tenant_migration_record")
# Drop opensearch_document_migration_record.
op.drop_index(
"ix_opensearch_document_migration_record_created_at",
table_name="opensearch_document_migration_record",
)
op.drop_index(
"ix_opensearch_document_migration_record_attempts_count",
table_name="opensearch_document_migration_record",
)
op.drop_index(
"ix_opensearch_document_migration_record_status",
table_name="opensearch_document_migration_record",
)
op.drop_table("opensearch_document_migration_record")


@@ -70,80 +70,66 @@ BUILT_IN_TOOLS = [
def upgrade() -> None:
conn = op.get_bind()
-    # Start transaction
-    conn.execute(sa.text("BEGIN"))
-    try:
-        # Get existing tools to check what already exists
-        existing_tools = conn.execute(
-            sa.text(
-                "SELECT in_code_tool_id FROM tool WHERE in_code_tool_id IS NOT NULL"
-            )
-        ).fetchall()
-        existing_tool_ids = {row[0] for row in existing_tools}
-        # Insert or update built-in tools
-        for tool in BUILT_IN_TOOLS:
-            in_code_id = tool["in_code_tool_id"]
-            # Handle historical rename: InternetSearchTool -> WebSearchTool
-            if (
-                in_code_id == "WebSearchTool"
-                and "WebSearchTool" not in existing_tool_ids
-                and "InternetSearchTool" in existing_tool_ids
-            ):
-                # Rename the existing InternetSearchTool row in place and update fields
-                conn.execute(
-                    sa.text(
-                        """
-                        UPDATE tool
-                        SET name = :name,
-                            display_name = :display_name,
-                            description = :description,
-                            in_code_tool_id = :in_code_tool_id
-                        WHERE in_code_tool_id = 'InternetSearchTool'
-                        """
-                    ),
-                    tool,
-                )
-                # Keep the local view of existing ids in sync to avoid duplicate insert
-                existing_tool_ids.discard("InternetSearchTool")
-                existing_tool_ids.add("WebSearchTool")
-                continue
-            if in_code_id in existing_tool_ids:
-                # Update existing tool
-                conn.execute(
-                    sa.text(
-                        """
-                        UPDATE tool
-                        SET name = :name,
-                            display_name = :display_name,
-                            description = :description
-                        WHERE in_code_tool_id = :in_code_tool_id
-                        """
-                    ),
-                    tool,
-                )
-            else:
-                # Insert new tool
-                conn.execute(
-                    sa.text(
-                        """
-                        INSERT INTO tool (name, display_name, description, in_code_tool_id)
-                        VALUES (:name, :display_name, :description, :in_code_tool_id)
-                        """
-                    ),
-                    tool,
-                )
-        # Commit transaction
-        conn.execute(sa.text("COMMIT"))
-    except Exception as e:
-        # Rollback on error
-        conn.execute(sa.text("ROLLBACK"))
-        raise e
+    # Get existing tools to check what already exists
+    existing_tools = conn.execute(
+        sa.text("SELECT in_code_tool_id FROM tool WHERE in_code_tool_id IS NOT NULL")
+    ).fetchall()
+    existing_tool_ids = {row[0] for row in existing_tools}
+    # Insert or update built-in tools
+    for tool in BUILT_IN_TOOLS:
+        in_code_id = tool["in_code_tool_id"]
+        # Handle historical rename: InternetSearchTool -> WebSearchTool
+        if (
+            in_code_id == "WebSearchTool"
+            and "WebSearchTool" not in existing_tool_ids
+            and "InternetSearchTool" in existing_tool_ids
+        ):
+            # Rename the existing InternetSearchTool row in place and update fields
+            conn.execute(
+                sa.text(
+                    """
+                    UPDATE tool
+                    SET name = :name,
+                        display_name = :display_name,
+                        description = :description,
+                        in_code_tool_id = :in_code_tool_id
+                    WHERE in_code_tool_id = 'InternetSearchTool'
+                    """
+                ),
+                tool,
+            )
+            # Keep the local view of existing ids in sync to avoid duplicate insert
+            existing_tool_ids.discard("InternetSearchTool")
+            existing_tool_ids.add("WebSearchTool")
+            continue
+        if in_code_id in existing_tool_ids:
+            # Update existing tool
+            conn.execute(
+                sa.text(
+                    """
+                    UPDATE tool
+                    SET name = :name,
+                        display_name = :display_name,
+                        description = :description
+                    WHERE in_code_tool_id = :in_code_tool_id
+                    """
+                ),
+                tool,
+            )
+        else:
+            # Insert new tool
+            conn.execute(
+                sa.text(
+                    """
+                    INSERT INTO tool (name, display_name, description, in_code_tool_id)
+                    VALUES (:name, :display_name, :description, :in_code_tool_id)
+                    """
+                ),
+                tool,
+            )
def downgrade() -> None:
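The removed BEGIN/COMMIT/ROLLBACK bookkeeping was redundant: on PostgreSQL, Alembic runs each migration inside a transaction by default, so any failing statement already rolls the whole migration back. For contrast, a hypothetical single-statement upsert per tool; this is an assumption, not the migration's approach, since it would require a unique constraint on tool.in_code_tool_id, which is why the migration checks existing ids in Python instead:

# Hypothetical alternative (assumes a UNIQUE constraint on tool.in_code_tool_id,
# which the real schema may not have); shown only to contrast with the
# Python-side existence check above.
import sqlalchemy as sa

UPSERT_TOOL = sa.text(
    """
    INSERT INTO tool (name, display_name, description, in_code_tool_id)
    VALUES (:name, :display_name, :description, :in_code_tool_id)
    ON CONFLICT (in_code_tool_id) DO UPDATE
    SET name = EXCLUDED.name,
        display_name = EXCLUDED.display_name,
        description = EXCLUDED.description
    """
)

def seed_tools(conn: sa.engine.Connection, tools: list[dict]) -> None:
    # One round trip per tool; insert-or-update decided by the database.
    for tool in tools:
        conn.execute(UPSERT_TOOL, tool)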

View File

@@ -0,0 +1,64 @@
"""sync_exa_api_key_to_content_provider
Revision ID: d1b637d7050a
Revises: d25168c2beee
Create Date: 2026-01-09 15:54:15.646249
"""
from alembic import op
from sqlalchemy import text
# revision identifiers, used by Alembic.
revision = "d1b637d7050a"
down_revision = "d25168c2beee"
branch_labels = None
depends_on = None
def upgrade() -> None:
# Exa uses a shared API key between search and content providers.
# For existing Exa search providers with API keys, create the corresponding
# content provider if it doesn't exist yet.
connection = op.get_bind()
# Check if Exa search provider exists with an API key
result = connection.execute(
text(
"""
SELECT api_key FROM internet_search_provider
WHERE provider_type = 'exa' AND api_key IS NOT NULL
LIMIT 1
"""
)
)
row = result.fetchone()
if row:
api_key = row[0]
# Create Exa content provider with the shared key
connection.execute(
text(
"""
INSERT INTO internet_content_provider
(name, provider_type, api_key, is_active)
VALUES ('Exa', 'exa', :api_key, false)
ON CONFLICT (name) DO NOTHING
"""
),
{"api_key": api_key},
)
def downgrade() -> None:
# Remove the Exa content provider that was created by this migration
connection = op.get_bind()
connection.execute(
text(
"""
DELETE FROM internet_content_provider
WHERE provider_type = 'exa'
"""
)
)

View File

@@ -0,0 +1,86 @@
"""tool_name_consistency
Revision ID: d25168c2beee
Revises: 8405ca81cc83
Create Date: 2026-01-11 17:54:40.135777
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "d25168c2beee"
down_revision = "8405ca81cc83"
branch_labels = None
depends_on = None
# Currently the seeded tools have the in_code_tool_id == name
CURRENT_TOOL_NAME_MAPPING = [
"SearchTool",
"WebSearchTool",
"ImageGenerationTool",
"PythonTool",
"OpenURLTool",
"KnowledgeGraphTool",
"ResearchAgent",
]
# Mapping of in_code_tool_id -> name
# These are the expected names that we want in the database
EXPECTED_TOOL_NAME_MAPPING = {
"SearchTool": "internal_search",
"WebSearchTool": "web_search",
"ImageGenerationTool": "generate_image",
"PythonTool": "python",
"OpenURLTool": "open_url",
"KnowledgeGraphTool": "run_kg_search",
"ResearchAgent": "research_agent",
}
def upgrade() -> None:
conn = op.get_bind()
# Mapping of in_code_tool_id to the NAME constant from each tool class
# These match the .name property of each tool implementation
tool_name_mapping = EXPECTED_TOOL_NAME_MAPPING
# Update the name column for each tool based on its in_code_tool_id
for in_code_tool_id, expected_name in tool_name_mapping.items():
conn.execute(
sa.text(
"""
UPDATE tool
SET name = :expected_name
WHERE in_code_tool_id = :in_code_tool_id
"""
),
{
"expected_name": expected_name,
"in_code_tool_id": in_code_tool_id,
},
)
def downgrade() -> None:
conn = op.get_bind()
# Reverse the migration by setting name back to in_code_tool_id
# This matches the original pattern where name was the class name
for in_code_tool_id in CURRENT_TOOL_NAME_MAPPING:
conn.execute(
sa.text(
"""
UPDATE tool
SET name = :current_name
WHERE in_code_tool_id = :in_code_tool_id
"""
),
{
"current_name": in_code_tool_id,
"in_code_tool_id": in_code_tool_id,
},
)

View File

@@ -0,0 +1,102 @@
"""add_file_reader_tool
Revision ID: d3fd499c829c
Revises: 114a638452db
Create Date: 2026-02-07 19:28:22.452337
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "d3fd499c829c"
down_revision = "114a638452db"
branch_labels = None
depends_on = None
FILE_READER_TOOL = {
"name": "read_file",
"display_name": "File Reader",
"description": (
"Read sections of user-uploaded files by character offset. "
"Useful for inspecting large files that cannot fit entirely in context."
),
"in_code_tool_id": "FileReaderTool",
"enabled": True,
}
def upgrade() -> None:
conn = op.get_bind()
# Check if tool already exists
existing = conn.execute(
sa.text("SELECT id FROM tool WHERE in_code_tool_id = :in_code_tool_id"),
{"in_code_tool_id": FILE_READER_TOOL["in_code_tool_id"]},
).fetchone()
if existing:
# Update existing tool
conn.execute(
sa.text(
"""
UPDATE tool
SET name = :name,
display_name = :display_name,
description = :description
WHERE in_code_tool_id = :in_code_tool_id
"""
),
FILE_READER_TOOL,
)
tool_id = existing[0]
else:
# Insert new tool
result = conn.execute(
sa.text(
"""
INSERT INTO tool (name, display_name, description, in_code_tool_id, enabled)
VALUES (:name, :display_name, :description, :in_code_tool_id, :enabled)
RETURNING id
"""
),
FILE_READER_TOOL,
)
tool_id = result.scalar_one()
# Attach to the default persona (id=0) if not already attached
conn.execute(
sa.text(
"""
INSERT INTO persona__tool (persona_id, tool_id)
VALUES (0, :tool_id)
ON CONFLICT DO NOTHING
"""
),
{"tool_id": tool_id},
)
def downgrade() -> None:
conn = op.get_bind()
in_code_tool_id = FILE_READER_TOOL["in_code_tool_id"]
# Remove persona associations first (FK constraint)
conn.execute(
sa.text(
"""
DELETE FROM persona__tool
WHERE tool_id IN (
SELECT id FROM tool WHERE in_code_tool_id = :in_code_tool_id
)
"""
),
{"in_code_tool_id": in_code_tool_id},
)
conn.execute(
sa.text("DELETE FROM tool WHERE in_code_tool_id = :in_code_tool_id"),
{"in_code_tool_id": in_code_tool_id},
)

View File

@@ -0,0 +1,35 @@
"""add_file_content
Revision ID: d56ffa94ca32
Revises: 01f8e6d95a33
Create Date: 2026-02-06 15:29:34.192960
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "d56ffa94ca32"
down_revision = "01f8e6d95a33"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"file_content",
sa.Column(
"file_id",
sa.String(),
sa.ForeignKey("file_record.file_id", ondelete="CASCADE"),
primary_key=True,
),
sa.Column("lobj_oid", sa.BigInteger(), nullable=False),
sa.Column("file_size", sa.BigInteger(), nullable=False, server_default="0"),
)
def downgrade() -> None:
op.drop_table("file_content")

View File

@@ -0,0 +1,35 @@
"""add_cascade_delete_to_search_query_user_id
Revision ID: d5c86e2c6dc6
Revises: 90b409d06e50
Create Date: 2026-02-04 16:05:04.749804
"""
from alembic import op
# revision identifiers, used by Alembic.
revision = "d5c86e2c6dc6"
down_revision = "90b409d06e50"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.drop_constraint("search_query_user_id_fkey", "search_query", type_="foreignkey")
op.create_foreign_key(
"search_query_user_id_fkey",
"search_query",
"user",
["user_id"],
["id"],
ondelete="CASCADE",
)
def downgrade() -> None:
op.drop_constraint("search_query_user_id_fkey", "search_query", type_="foreignkey")
op.create_foreign_key(
"search_query_user_id_fkey", "search_query", "user", ["user_id"], ["id"]
)

View File

@@ -0,0 +1,125 @@
"""create_anonymous_user
This migration creates a permanent anonymous user in the database.
When anonymous access is enabled, unauthenticated requests will use this user
instead of returning user_id=NULL.
Revision ID: e7f8a9b0c1d2
Revises: f7ca3e2f45d9
Create Date: 2026-01-15 14:00:00.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "e7f8a9b0c1d2"
down_revision = "f7ca3e2f45d9"
branch_labels = None
depends_on = None
# Must match constants in onyx/configs/constants.py file
ANONYMOUS_USER_UUID = "00000000-0000-0000-0000-000000000002"
ANONYMOUS_USER_EMAIL = "anonymous@onyx.app"
# Tables with user_id foreign key that may need migration
TABLES_WITH_USER_ID = [
"chat_session",
"credential",
"document_set",
"persona",
"tool",
"notification",
"inputprompt",
]
def upgrade() -> None:
"""
Create the anonymous user for anonymous access feature.
Also migrates any remaining user_id=NULL records to the anonymous user.
"""
connection = op.get_bind()
# Create the anonymous user (using ON CONFLICT to be idempotent)
connection.execute(
sa.text(
"""
INSERT INTO "user" (id, email, hashed_password, is_active, is_superuser, is_verified, role)
VALUES (:id, :email, :hashed_password, :is_active, :is_superuser, :is_verified, :role)
ON CONFLICT (id) DO NOTHING
"""
),
{
"id": ANONYMOUS_USER_UUID,
"email": ANONYMOUS_USER_EMAIL,
"hashed_password": "", # Empty password - user cannot log in directly
"is_active": True, # Active so it can be used for anonymous access
"is_superuser": False,
"is_verified": True, # Verified since no email verification needed
"role": "LIMITED", # Anonymous users have limited role to restrict access
},
)
# Migrate any remaining user_id=NULL records to anonymous user
for table in TABLES_WITH_USER_ID:
try:
# Exclude public credential (id=0) which must remain user_id=NULL
# Exclude builtin tools (in_code_tool_id IS NOT NULL) which must remain user_id=NULL
# Exclude builtin personas (builtin_persona=True) which must remain user_id=NULL
# Exclude system input prompts (is_public=True with user_id=NULL) which must remain user_id=NULL
if table == "credential":
condition = "user_id IS NULL AND id != 0"
elif table == "tool":
condition = "user_id IS NULL AND in_code_tool_id IS NULL"
elif table == "persona":
condition = "user_id IS NULL AND builtin_persona = false"
elif table == "inputprompt":
condition = "user_id IS NULL AND is_public = false"
else:
condition = "user_id IS NULL"
result = connection.execute(
sa.text(
f"""
UPDATE "{table}"
SET user_id = :user_id
WHERE {condition}
"""
),
{"user_id": ANONYMOUS_USER_UUID},
)
if result.rowcount > 0:
print(f"Updated {result.rowcount} rows in {table} to anonymous user")
except Exception as e:
print(f"Skipping {table}: {e}")
def downgrade() -> None:
"""
Set anonymous user's records back to NULL and delete the anonymous user.
"""
connection = op.get_bind()
# Set records back to NULL
for table in TABLES_WITH_USER_ID:
try:
connection.execute(
sa.text(
f"""
UPDATE "{table}"
SET user_id = NULL
WHERE user_id = :user_id
"""
),
{"user_id": ANONYMOUS_USER_UUID},
)
except Exception:
pass
# Delete the anonymous user
connection.execute(
sa.text('DELETE FROM "user" WHERE id = :user_id'),
{"user_id": ANONYMOUS_USER_UUID},
)
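For illustration only, since the application's actual auth layer is not part of this diff: once the migration guarantees the anonymous row exists, unauthenticated requests can be attributed to a stable user id instead of user_id=NULL. The helper below is a sketch under that assumption, not the codebase's function:

# Hypothetical request-time resolution; names are illustrative.
import uuid

ANONYMOUS_USER_UUID = uuid.UUID("00000000-0000-0000-0000-000000000002")

def effective_user_id(
    authenticated_user_id: uuid.UUID | None,
    anonymous_access_enabled: bool,
) -> uuid.UUID | None:
    if authenticated_user_id is not None:
        return authenticated_user_id
    # With anonymous access on, fall back to the permanent anonymous user
    # created above; otherwise the request remains unattributed.
    return ANONYMOUS_USER_UUID if anonymous_access_enabled else None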

View File

@@ -0,0 +1,57 @@
"""Add flow mapping table
Revision ID: f220515df7b4
Revises: 9d1543a37106
Create Date: 2026-01-30 12:21:24.955922
"""
from onyx.db.enums import LLMModelFlowType
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "f220515df7b4"
down_revision = "9d1543a37106"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"llm_model_flow",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column(
"llm_model_flow_type",
sa.Enum(LLMModelFlowType, name="llmmodelflowtype", native_enum=False),
nullable=False,
),
sa.Column(
"is_default", sa.Boolean(), nullable=False, server_default=sa.text("false")
),
sa.Column("model_configuration_id", sa.Integer(), nullable=False),
sa.PrimaryKeyConstraint("id"),
sa.ForeignKeyConstraint(
["model_configuration_id"], ["model_configuration.id"], ondelete="CASCADE"
),
sa.UniqueConstraint(
"llm_model_flow_type",
"model_configuration_id",
name="uq_model_config_per_llm_model_flow_type",
),
)
# Partial unique index so that there is at most one default for each flow type
op.create_index(
"ix_one_default_per_llm_model_flow",
"llm_model_flow",
["llm_model_flow_type"],
unique=True,
postgresql_where=sa.text("is_default IS TRUE"),
)
def downgrade() -> None:
# Drop the llm_model_flow table (index is dropped automatically with table)
op.drop_table("llm_model_flow")

View File

@@ -0,0 +1,281 @@
"""migrate_no_auth_data_to_placeholder
This migration handles the transition from AUTH_TYPE=disabled to requiring
authentication. It creates a placeholder user and assigns all data that was
created without a user (user_id=NULL) to this placeholder.
A database trigger is installed that automatically transfers all data from
the placeholder user to the first real user who registers, then drops itself.
Revision ID: f7ca3e2f45d9
Revises: 78ebc66946a0
Create Date: 2026-01-15 12:49:53.802741
"""
import os
from alembic import op
import sqlalchemy as sa
from shared_configs.configs import MULTI_TENANT
# revision identifiers, used by Alembic.
revision = "f7ca3e2f45d9"
down_revision = "78ebc66946a0"
branch_labels = None
depends_on = None
# Must match constants in onyx/configs/constants.py file
NO_AUTH_PLACEHOLDER_USER_UUID = "00000000-0000-0000-0000-000000000001"
NO_AUTH_PLACEHOLDER_USER_EMAIL = "no-auth-placeholder@onyx.app"
# Trigger and function names
TRIGGER_NAME = "trg_migrate_no_auth_data"
FUNCTION_NAME = "migrate_no_auth_data_to_user"
# Trigger function that migrates data from placeholder to first real user
MIGRATE_NO_AUTH_TRIGGER_FUNCTION = f"""
CREATE OR REPLACE FUNCTION {FUNCTION_NAME}()
RETURNS TRIGGER AS $$
DECLARE
placeholder_uuid UUID := '00000000-0000-0000-0000-000000000001'::uuid;
anonymous_uuid UUID := '00000000-0000-0000-0000-000000000002'::uuid;
placeholder_row RECORD;
schema_name TEXT;
BEGIN
-- Skip if this is the placeholder user being inserted
IF NEW.id = placeholder_uuid THEN
RETURN NULL;
END IF;
-- Skip if this is the anonymous user being inserted (not a real user)
IF NEW.id = anonymous_uuid THEN
RETURN NULL;
END IF;
-- Skip if the new user is not active
IF NEW.is_active = FALSE THEN
RETURN NULL;
END IF;
-- Get current schema for self-cleanup
schema_name := current_schema();
-- Try to lock the placeholder user row with FOR UPDATE SKIP LOCKED
-- This ensures only one concurrent transaction can proceed with migration
-- SKIP LOCKED means if another transaction has the lock, we skip (don't wait)
SELECT id INTO placeholder_row
FROM "user"
WHERE id = placeholder_uuid
FOR UPDATE SKIP LOCKED;
IF NOT FOUND THEN
-- Either placeholder doesn't exist or another transaction has it locked
-- Either way, drop the trigger and return without making admin
EXECUTE format('DROP TRIGGER IF EXISTS {TRIGGER_NAME} ON %I."user"', schema_name);
EXECUTE format('DROP FUNCTION IF EXISTS %I.{FUNCTION_NAME}()', schema_name);
RETURN NULL;
END IF;
-- We have exclusive lock on placeholder - proceed with migration
-- The INSERT has already completed (AFTER INSERT), so NEW.id exists in the table
-- Migrate chat_session
UPDATE "chat_session" SET user_id = NEW.id WHERE user_id = placeholder_uuid;
-- Migrate credential (exclude public credential id=0)
UPDATE "credential" SET user_id = NEW.id WHERE user_id = placeholder_uuid AND id != 0;
-- Migrate document_set
UPDATE "document_set" SET user_id = NEW.id WHERE user_id = placeholder_uuid;
-- Migrate persona (exclude builtin personas)
UPDATE "persona" SET user_id = NEW.id WHERE user_id = placeholder_uuid AND builtin_persona = FALSE;
-- Migrate tool (exclude builtin tools)
UPDATE "tool" SET user_id = NEW.id WHERE user_id = placeholder_uuid AND in_code_tool_id IS NULL;
-- Migrate notification
UPDATE "notification" SET user_id = NEW.id WHERE user_id = placeholder_uuid;
-- Migrate inputprompt (exclude system/public prompts)
UPDATE "inputprompt" SET user_id = NEW.id WHERE user_id = placeholder_uuid AND is_public = FALSE;
-- Make the new user an admin (they had admin access in no-auth mode)
-- In AFTER INSERT trigger, we must UPDATE the row since it already exists
UPDATE "user" SET role = 'ADMIN' WHERE id = NEW.id;
-- Delete the placeholder user (we hold the lock so this is safe)
DELETE FROM "user" WHERE id = placeholder_uuid;
-- Drop the trigger and function (self-cleanup)
EXECUTE format('DROP TRIGGER IF EXISTS {TRIGGER_NAME} ON %I."user"', schema_name);
EXECUTE format('DROP FUNCTION IF EXISTS %I.{FUNCTION_NAME}()', schema_name);
RETURN NULL;
END;
$$ LANGUAGE plpgsql;
"""
MIGRATE_NO_AUTH_TRIGGER = f"""
CREATE TRIGGER {TRIGGER_NAME}
AFTER INSERT ON "user"
FOR EACH ROW
EXECUTE FUNCTION {FUNCTION_NAME}();
"""
def upgrade() -> None:
"""
Create a placeholder user and assign all NULL user_id records to it.
Install a trigger that migrates data to the first real user and self-destructs.
Only runs if AUTH_TYPE is currently disabled/none.
Skipped in multi-tenant mode - each tenant starts fresh with no legacy data.
"""
# Skip in multi-tenant mode - this migration handles single-tenant
# AUTH_TYPE=disabled -> auth transitions only
if MULTI_TENANT:
return
# Only run if AUTH_TYPE is currently disabled/none
# If they've already switched to auth-enabled, NULL data is stale anyway
auth_type = (os.environ.get("AUTH_TYPE") or "").lower()
if auth_type not in ("disabled", "none", ""):
print(f"AUTH_TYPE is '{auth_type}', not disabled. Skipping migration.")
return
connection = op.get_bind()
# Check if there are any NULL user_id records that need migration
tables_to_check = [
"chat_session",
"credential",
"document_set",
"persona",
"tool",
"notification",
"inputprompt",
]
has_null_records = False
for table in tables_to_check:
try:
result = connection.execute(
sa.text(f'SELECT 1 FROM "{table}" WHERE user_id IS NULL LIMIT 1')
)
if result.fetchone():
has_null_records = True
break
except Exception:
# Table might not exist
pass
if not has_null_records:
return
# Create the placeholder user
connection.execute(
sa.text(
"""
INSERT INTO "user" (id, email, hashed_password, is_active, is_superuser, is_verified, role)
VALUES (:id, :email, :hashed_password, :is_active, :is_superuser, :is_verified, :role)
"""
),
{
"id": NO_AUTH_PLACEHOLDER_USER_UUID,
"email": NO_AUTH_PLACEHOLDER_USER_EMAIL,
"hashed_password": "", # Empty password - user cannot log in
"is_active": False, # Inactive - user cannot log in
"is_superuser": False,
"is_verified": False,
"role": "BASIC",
},
)
# Assign NULL user_id records to the placeholder user
for table in tables_to_check:
try:
# Base condition for all tables
condition = "user_id IS NULL"
# Exclude public credential (id=0) which must remain user_id=NULL
if table == "credential":
condition += " AND id != 0"
# Exclude builtin tools (in_code_tool_id IS NOT NULL) which must remain user_id=NULL
elif table == "tool":
condition += " AND in_code_tool_id IS NULL"
# Exclude builtin personas which must remain user_id=NULL
elif table == "persona":
condition += " AND builtin_persona = FALSE"
# Exclude system/public input prompts which must remain user_id=NULL
elif table == "inputprompt":
condition += " AND is_public = FALSE"
result = connection.execute(
sa.text(
f"""
UPDATE "{table}"
SET user_id = :user_id
WHERE {condition}
"""
),
{"user_id": NO_AUTH_PLACEHOLDER_USER_UUID},
)
if result.rowcount > 0:
print(f"Updated {result.rowcount} rows in {table}")
except Exception as e:
print(f"Skipping {table}: {e}")
# Install the trigger function and trigger for automatic migration on first user registration
connection.execute(sa.text(MIGRATE_NO_AUTH_TRIGGER_FUNCTION))
connection.execute(sa.text(MIGRATE_NO_AUTH_TRIGGER))
print("Installed trigger for automatic data migration on first user registration")
def downgrade() -> None:
"""
Drop trigger and function, set placeholder user's records back to NULL,
and delete the placeholder user.
"""
# Skip in multi-tenant mode for consistency with upgrade
if MULTI_TENANT:
return
connection = op.get_bind()
# Drop trigger and function if they exist (they may have already self-destructed)
connection.execute(sa.text(f'DROP TRIGGER IF EXISTS {TRIGGER_NAME} ON "user"'))
connection.execute(sa.text(f"DROP FUNCTION IF EXISTS {FUNCTION_NAME}()"))
tables_to_update = [
"chat_session",
"credential",
"document_set",
"persona",
"tool",
"notification",
"inputprompt",
]
# Set records back to NULL
for table in tables_to_update:
try:
connection.execute(
sa.text(
f"""
UPDATE "{table}"
SET user_id = NULL
WHERE user_id = :user_id
"""
),
{"user_id": NO_AUTH_PLACEHOLDER_USER_UUID},
)
except Exception:
pass
# Delete the placeholder user
connection.execute(
sa.text('DELETE FROM "user" WHERE id = :user_id'),
{"user_id": NO_AUTH_PLACEHOLDER_USER_UUID},
)
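The FOR UPDATE SKIP LOCKED in the trigger makes concurrent first registrations race safely: exactly one transaction wins the placeholder lock and performs the hand-off, while the others skip the lock and simply drop the trigger. A hedged end-state check, assuming this migration ran and the first real user has since registered (the helper name is illustrative):

# Verifies the trigger's hand-off after the first real signup; illustration only.
import sqlalchemy as sa

PLACEHOLDER = "00000000-0000-0000-0000-000000000001"

def verify_handoff(conn: sa.engine.Connection, first_user_email: str) -> None:
    row = conn.execute(
        sa.text('SELECT 1 FROM "user" WHERE id = :id'), {"id": PLACEHOLDER}
    ).fetchone()
    assert row is None, "trigger should have deleted the placeholder"
    role = conn.execute(
        sa.text('SELECT role FROM "user" WHERE email = :email'),
        {"email": first_user_email},
    ).scalar_one()
    assert role == "ADMIN"  # first registrant inherits the no-auth admin role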

View File

@@ -0,0 +1,31 @@
"""add chat_background to user
Revision ID: fb80bdd256de
Revises: 8b5ce697290e
Create Date: 2026-01-16 16:15:59.222617
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "fb80bdd256de"
down_revision = "8b5ce697290e"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"user",
sa.Column(
"chat_background",
sa.String(),
nullable=True,
),
)
def downgrade() -> None:
op.drop_column("user", "chat_background")

View File

@@ -0,0 +1,69 @@
"""add_opensearch_tenant_migration_columns
Revision ID: feead2911109
Revises: 175ea04c7087
Create Date: 2026-02-10 17:46:34.029937
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "feead2911109"
down_revision = "175ea04c7087"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"opensearch_tenant_migration_record",
sa.Column("vespa_visit_continuation_token", sa.Text(), nullable=True),
)
op.add_column(
"opensearch_tenant_migration_record",
sa.Column(
"total_chunks_migrated",
sa.Integer(),
nullable=False,
server_default="0",
),
)
op.add_column(
"opensearch_tenant_migration_record",
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
)
op.add_column(
"opensearch_tenant_migration_record",
sa.Column(
"migration_completed_at",
sa.DateTime(timezone=True),
nullable=True,
),
)
op.add_column(
"opensearch_tenant_migration_record",
sa.Column(
"enable_opensearch_retrieval",
sa.Boolean(),
nullable=False,
server_default="false",
),
)
def downgrade() -> None:
op.drop_column("opensearch_tenant_migration_record", "enable_opensearch_retrieval")
op.drop_column("opensearch_tenant_migration_record", "migration_completed_at")
op.drop_column("opensearch_tenant_migration_record", "created_at")
op.drop_column("opensearch_tenant_migration_record", "total_chunks_migrated")
op.drop_column(
"opensearch_tenant_migration_record", "vespa_visit_continuation_token"
)

View File

@@ -39,7 +39,7 @@ EXCLUDE_TABLES = {"kombu_queue", "kombu_message"}
def include_object(
object: SchemaItem,
object: SchemaItem, # noqa: ARG001
name: str | None,
type_: Literal[
"schema",
@@ -49,8 +49,8 @@ def include_object(
"unique_constraint",
"foreign_key_constraint",
],
reflected: bool,
compare_to: SchemaItem | None,
reflected: bool, # noqa: ARG001
compare_to: SchemaItem | None, # noqa: ARG001
) -> bool:
if type_ == "table" and name in EXCLUDE_TABLES:
return False

View File

@@ -116,7 +116,7 @@ def _get_access_for_documents(
return access_map
def _get_acl_for_user(user: User | None, db_session: Session) -> set[str]:
def _get_acl_for_user(user: User, db_session: Session) -> set[str]:
"""Returns a list of ACL entries that the user has access to. This is meant to be
used downstream to filter out documents that the user does not have access to. The
user should have access to a document if at least one entry in the document's ACL
@@ -124,13 +124,16 @@ def _get_acl_for_user(user: User | None, db_session: Session) -> set[str]:
NOTE: is imported in onyx.access.access by `fetch_versioned_implementation`
DO NOT REMOVE."""
db_user_groups = fetch_user_groups_for_user(db_session, user.id) if user else []
is_anonymous = user.is_anonymous
db_user_groups = (
[] if is_anonymous else fetch_user_groups_for_user(db_session, user.id)
)
prefixed_user_groups = [
prefix_user_group(db_user_group.name) for db_user_group in db_user_groups
]
db_external_groups = (
fetch_external_groups_for_user(db_session, user.id) if user else []
[] if is_anonymous else fetch_external_groups_for_user(db_session, user.id)
)
prefixed_external_groups = [
prefix_external_group(db_external_group.external_user_group_id)

View File

@@ -0,0 +1,11 @@
from sqlalchemy.orm import Session
from ee.onyx.db.external_perm import fetch_external_groups_for_user
from onyx.db.models import User
def _get_user_external_group_ids(db_session: Session, user: User) -> list[str]:
if not user:
return []
external_groups = fetch_external_groups_for_user(db_session, user.id)
return [external_group.external_user_group_id for external_group in external_groups]

View File

@@ -33,8 +33,8 @@ def get_default_admin_user_emails_() -> list[str]:
async def current_cloud_superuser(
request: Request,
user: User | None = Depends(current_admin_user),
) -> User | None:
user: User = Depends(current_admin_user),
) -> User:
api_key = request.headers.get("Authorization", "").replace("Bearer ", "")
if api_key != SUPER_CLOUD_API_KEY:
raise HTTPException(status_code=401, detail="Invalid API key")

View File

@@ -1,12 +1,15 @@
from onyx.background.celery.apps import app_base
from onyx.background.celery.apps.background import celery_app
celery_app.autodiscover_tasks(
[
"ee.onyx.background.celery.tasks.doc_permission_syncing",
"ee.onyx.background.celery.tasks.external_group_syncing",
"ee.onyx.background.celery.tasks.cleanup",
"ee.onyx.background.celery.tasks.tenant_provisioning",
"ee.onyx.background.celery.tasks.query_history",
]
app_base.filter_task_modules(
[
"ee.onyx.background.celery.tasks.doc_permission_syncing",
"ee.onyx.background.celery.tasks.external_group_syncing",
"ee.onyx.background.celery.tasks.cleanup",
"ee.onyx.background.celery.tasks.tenant_provisioning",
"ee.onyx.background.celery.tasks.query_history",
]
)
)
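These EE Celery apps now pass their task-module lists through app_base.filter_task_modules before autodiscovery. The helper's implementation is not shown in this diff, so the sketch below is only one plausible shape for such a filter, offered purely as an assumption:

# Hypothetical shape of the helper -- the real app_base.filter_task_modules is
# not shown in this diff; behavior here is an assumption.
import importlib.util

def filter_task_modules(module_paths: list[str]) -> list[str]:
    # Keep only modules that can actually be imported in this build
    # (e.g. EE task modules absent from a community image).
    return [
        path for path in module_paths
        if importlib.util.find_spec(path) is not None
    ]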

View File

@@ -1,11 +1,14 @@
from onyx.background.celery.apps import app_base
from onyx.background.celery.apps.heavy import celery_app
celery_app.autodiscover_tasks(
[
"ee.onyx.background.celery.tasks.doc_permission_syncing",
"ee.onyx.background.celery.tasks.external_group_syncing",
"ee.onyx.background.celery.tasks.cleanup",
"ee.onyx.background.celery.tasks.query_history",
]
app_base.filter_task_modules(
[
"ee.onyx.background.celery.tasks.doc_permission_syncing",
"ee.onyx.background.celery.tasks.external_group_syncing",
"ee.onyx.background.celery.tasks.cleanup",
"ee.onyx.background.celery.tasks.query_history",
]
)
)

View File

@@ -1,8 +1,11 @@
from onyx.background.celery.apps import app_base
from onyx.background.celery.apps.light import celery_app
celery_app.autodiscover_tasks(
[
"ee.onyx.background.celery.tasks.doc_permission_syncing",
"ee.onyx.background.celery.tasks.external_group_syncing",
]
app_base.filter_task_modules(
[
"ee.onyx.background.celery.tasks.doc_permission_syncing",
"ee.onyx.background.celery.tasks.external_group_syncing",
]
)
)

View File

@@ -1,7 +1,10 @@
from onyx.background.celery.apps import app_base
from onyx.background.celery.apps.monitoring import celery_app
celery_app.autodiscover_tasks(
[
"ee.onyx.background.celery.tasks.tenant_provisioning",
]
app_base.filter_task_modules(
[
"ee.onyx.background.celery.tasks.tenant_provisioning",
]
)
)

View File

@@ -1,12 +1,15 @@
from onyx.background.celery.apps import app_base
from onyx.background.celery.apps.primary import celery_app
celery_app.autodiscover_tasks(
[
"ee.onyx.background.celery.tasks.doc_permission_syncing",
"ee.onyx.background.celery.tasks.external_group_syncing",
"ee.onyx.background.celery.tasks.cloud",
"ee.onyx.background.celery.tasks.ttl_management",
"ee.onyx.background.celery.tasks.usage_reporting",
]
app_base.filter_task_modules(
[
"ee.onyx.background.celery.tasks.doc_permission_syncing",
"ee.onyx.background.celery.tasks.external_group_syncing",
"ee.onyx.background.celery.tasks.cloud",
"ee.onyx.background.celery.tasks.ttl_management",
"ee.onyx.background.celery.tasks.usage_reporting",
]
)
)

View File

@@ -25,6 +25,7 @@ from ee.onyx.db.connector_credential_pair import get_all_auto_sync_cc_pairs
from ee.onyx.db.document import upsert_document_external_perms
from ee.onyx.external_permissions.sync_params import get_source_perm_sync_config
from onyx.access.models import DocExternalAccess
from onyx.access.models import ElementExternalAccess
from onyx.background.celery.apps.app_base import task_logger
from onyx.background.celery.celery_redis import celery_find_task
from onyx.background.celery.celery_redis import celery_get_queue_length
@@ -55,6 +56,9 @@ from onyx.db.enums import AccessType
from onyx.db.enums import ConnectorCredentialPairStatus
from onyx.db.enums import SyncStatus
from onyx.db.enums import SyncType
from onyx.db.hierarchy import (
update_hierarchy_node_permissions as db_update_hierarchy_node_permissions,
)
from onyx.db.models import ConnectorCredentialPair
from onyx.db.permission_sync_attempt import complete_doc_permission_sync_attempt
from onyx.db.permission_sync_attempt import create_doc_permission_sync_attempt
@@ -637,18 +641,25 @@ def connector_permission_sync_generator_task(
),
stop=stop_after_delay(DOCUMENT_PERMISSIONS_UPDATE_STOP_AFTER),
)
def document_update_permissions(
def element_update_permissions(
tenant_id: str,
permissions: DocExternalAccess,
permissions: ElementExternalAccess,
source_type_str: str,
connector_id: int,
credential_id: int,
) -> bool:
"""Update permissions for a document or hierarchy node."""
start = time.monotonic()
doc_id = permissions.doc_id
external_access = permissions.external_access
# Determine element type and identifier for logging
if isinstance(permissions, DocExternalAccess):
element_id = permissions.doc_id
element_type = "doc"
else:
element_id = permissions.raw_node_id
element_type = "node"
try:
with get_session_with_tenant(tenant_id=tenant_id) as db_session:
# Add the users to the DB if they don't exist
@@ -657,39 +668,57 @@ def document_update_permissions(
emails=list(external_access.external_user_emails),
continue_on_error=True,
)
# Then upsert the document's external permissions
created_new_doc = upsert_document_external_perms(
db_session=db_session,
doc_id=doc_id,
external_access=external_access,
source_type=DocumentSource(source_type_str),
)
if created_new_doc:
# If a new document was created, we associate it with the cc_pair
upsert_document_by_connector_credential_pair(
if isinstance(permissions, DocExternalAccess):
# Document permission update
created_new_doc = upsert_document_external_perms(
db_session=db_session,
connector_id=connector_id,
credential_id=credential_id,
document_ids=[doc_id],
doc_id=permissions.doc_id,
external_access=external_access,
source_type=DocumentSource(source_type_str),
)
if created_new_doc:
# If a new document was created, we associate it with the cc_pair
upsert_document_by_connector_credential_pair(
db_session=db_session,
connector_id=connector_id,
credential_id=credential_id,
document_ids=[permissions.doc_id],
)
else:
# Hierarchy node permission update
db_update_hierarchy_node_permissions(
db_session=db_session,
raw_node_id=permissions.raw_node_id,
source=DocumentSource(permissions.source),
is_public=external_access.is_public,
external_user_emails=(
list(external_access.external_user_emails)
if external_access.external_user_emails
else None
),
external_user_group_ids=(
list(external_access.external_user_group_ids)
if external_access.external_user_group_ids
else None
),
)
elapsed = time.monotonic() - start
task_logger.info(
f"connector_id={connector_id} "
f"doc={doc_id} "
f"{element_type}={element_id} "
f"action=update_permissions "
f"elapsed={elapsed:.2f}"
)
except Exception as e:
task_logger.exception(
f"document_update_permissions exceptioned: "
f"connector_id={connector_id} doc_id={doc_id}"
f"element_update_permissions exceptioned: {element_type}={element_id}, {connector_id=} {credential_id=}"
)
raise e
finally:
task_logger.info(
f"document_update_permissions completed: connector_id={connector_id} doc={doc_id}"
f"element_update_permissions completed: {element_type}={element_id}, {connector_id=} {credential_id=}"
)
return True
@@ -922,7 +951,7 @@ class PermissionSyncCallback(IndexingHeartbeatInterface):
return False
def progress(self, tag: str, amount: int) -> None:
def progress(self, tag: str, amount: int) -> None: # noqa: ARG002
try:
self.redis_connector.permissions.set_active()
@@ -953,7 +982,7 @@ class PermissionSyncCallback(IndexingHeartbeatInterface):
def monitor_ccpair_permissions_taskset(
tenant_id: str, key_bytes: bytes, r: Redis, db_session: Session
tenant_id: str, key_bytes: bytes, r: Redis, db_session: Session # noqa: ARG001
) -> None:
fence_key = key_bytes.decode("utf-8")
cc_pair_id_str = RedisConnector.get_id_from_fence_key(fence_key)

View File

@@ -259,7 +259,7 @@ def check_for_external_group_sync(self: Task, *, tenant_id: str) -> bool | None:
def try_creating_external_group_sync_task(
app: Celery,
cc_pair_id: int,
r: Redis,
r: Redis, # noqa: ARG001
tenant_id: str,
) -> str | None:
"""Returns an int if syncing is needed. The int represents the number of sync tasks generated.
@@ -344,7 +344,7 @@ def try_creating_external_group_sync_task(
bind=True,
)
def connector_external_group_sync_generator_task(
self: Task,
self: Task, # noqa: ARG001
cc_pair_id: int,
tenant_id: str,
) -> None:
@@ -590,8 +590,8 @@ def _perform_external_group_sync(
def validate_external_group_sync_fences(
tenant_id: str,
celery_app: Celery,
r: Redis,
celery_app: Celery, # noqa: ARG001
r: Redis, # noqa: ARG001
r_replica: Redis,
r_celery: Redis,
lock_beat: RedisLock,

View File

@@ -40,7 +40,7 @@ def export_query_history_task(
end: datetime,
start_time: datetime,
# Need to include the tenant_id since the TenantAwareTask needs this
tenant_id: str,
tenant_id: str, # noqa: ARG001
) -> None:
if not self.request.id:
raise RuntimeError("No task id defined for this task; cannot identify it")

View File

@@ -43,7 +43,7 @@ _TENANT_PROVISIONING_TIME_LIMIT = 60 * 10 # 10 minutes
trail=False,
bind=True,
)
def check_available_tenants(self: Task) -> None:
def check_available_tenants(self: Task) -> None: # noqa: ARG001
"""
Check if we have enough pre-provisioned tenants available.
If not, trigger the pre-provisioning of new tenants.

View File

@@ -21,9 +21,9 @@ logger = setup_logger()
trail=False,
)
def generate_usage_report_task(
self: Task,
self: Task, # noqa: ARG001
*,
tenant_id: str,
tenant_id: str, # noqa: ARG001
user_id: str | None = None,
period_from: str | None = None,
period_to: str | None = None,

View File

@@ -7,7 +7,7 @@ QUERY_HISTORY_TASK_NAME_PREFIX = OnyxCeleryTask.EXPORT_QUERY_HISTORY_TASK
def name_chat_ttl_task(
retention_limit_days: float, tenant_id: str | None = None
retention_limit_days: float, tenant_id: str | None = None # noqa: ARG001
) -> str:
return f"chat_ttl_{retention_limit_days}_days"

View File

@@ -109,7 +109,6 @@ CHECK_TTL_MANAGEMENT_TASK_FREQUENCY_IN_HOURS = float(
STRIPE_SECRET_KEY = os.environ.get("STRIPE_SECRET_KEY")
STRIPE_PRICE_ID = os.environ.get("STRIPE_PRICE")
# JWT Public Key URL
JWT_PUBLIC_KEY_URL: str | None = os.getenv("JWT_PUBLIC_KEY_URL", None)
@@ -123,9 +122,23 @@ SUPER_CLOUD_API_KEY = os.environ.get("SUPER_CLOUD_API_KEY", "api_key")
# when the capture is called. These defaults prevent Posthog issues from breaking the Onyx app
POSTHOG_API_KEY = os.environ.get("POSTHOG_API_KEY") or "FooBar"
POSTHOG_HOST = os.environ.get("POSTHOG_HOST") or "https://us.i.posthog.com"
POSTHOG_DEBUG_LOGS_ENABLED = (
os.environ.get("POSTHOG_DEBUG_LOGS_ENABLED", "").lower() == "true"
)
MARKETING_POSTHOG_API_KEY = os.environ.get("MARKETING_POSTHOG_API_KEY")
HUBSPOT_TRACKING_URL = os.environ.get("HUBSPOT_TRACKING_URL")
GATED_TENANTS_KEY = "gated_tenants"
# License enforcement - when True, blocks API access for gated/expired licenses
LICENSE_ENFORCEMENT_ENABLED = (
os.environ.get("LICENSE_ENFORCEMENT_ENABLED", "true").lower() == "true"
)
# Cloud data plane URL - self-hosted instances call this to reach cloud proxy endpoints
# Used when MULTI_TENANT=false (self-hosted mode)
CLOUD_DATA_PLANE_URL = os.environ.get(
"CLOUD_DATA_PLANE_URL", "https://cloud.onyx.app/api"
)

View File

@@ -0,0 +1,73 @@
"""Constants for license enforcement.
This file is the single source of truth for:
1. Paths that bypass license enforcement (always accessible)
2. Paths that require an EE license (EE-only features)
Import these constants in both production code and tests to ensure consistency.
"""
# Paths that are ALWAYS accessible, even when license is expired/gated.
# These enable users to:
# /auth - Log in/out (users can't fix billing if locked out of auth)
# /license - Fetch, upload, or check license status
# /health - Health checks for load balancers/orchestrators
# /me - Basic user info needed for UI rendering
# /settings, /enterprise-settings - View app status and branding
# /billing - Unified billing API
# /proxy - Self-hosted proxy endpoints (have own license-based auth)
# /tenants/billing-* - Legacy billing endpoints (backwards compatibility)
# /manage/users, /users - User management (needed for seat limit resolution)
# /notifications - Needed for UI to load properly
LICENSE_ENFORCEMENT_ALLOWED_PREFIXES: frozenset[str] = frozenset(
{
"/auth",
"/license",
"/health",
"/me",
"/settings",
"/enterprise-settings",
# Billing endpoints (unified API for both MT and self-hosted)
"/billing",
"/admin/billing",
# Proxy endpoints for self-hosted billing (no tenant context)
"/proxy",
# Legacy tenant billing endpoints (kept for backwards compatibility)
"/tenants/billing-information",
"/tenants/create-customer-portal-session",
"/tenants/create-subscription-session",
# User management - needed to remove users when seat limit exceeded
"/manage/users",
"/manage/admin/users",
"/manage/admin/valid-domains",
"/manage/admin/deactivate-user",
"/manage/admin/delete-user",
"/users",
# Notifications - needed for UI to load properly
"/notifications",
}
)
# EE-only paths that require a valid license.
# Users without a license (community edition) cannot access these.
# These are blocked even when user has never subscribed (no license).
EE_ONLY_PATH_PREFIXES: frozenset[str] = frozenset(
{
# User groups and access control
"/manage/admin/user-group",
# Analytics and reporting
"/analytics",
# Query history (admin chat session endpoints)
"/admin/chat-sessions",
"/admin/chat-session-history",
"/admin/query-history",
# Usage reporting/export
"/admin/usage-report",
# Standard answers (canned responses)
"/manage/admin/standard-answer",
# Token rate limits
"/admin/token-rate-limits",
# Evals
"/evals",
}
)
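A hedged sketch of how these prefix sets might be consulted by an enforcement hook; the actual middleware is not shown in this diff, and the function names below are illustrative:

# Assumes the two frozensets above are in scope (e.g. imported from this module).
def is_path_allowed_when_gated(path: str) -> bool:
    """Paths a gated/expired deployment may still reach (login, billing, ...)."""
    return any(path.startswith(prefix) for prefix in LICENSE_ENFORCEMENT_ALLOWED_PREFIXES)

def requires_ee_license(path: str) -> bool:
    """EE-only paths blocked for community-edition deployments."""
    return any(path.startswith(prefix) for prefix in EE_ONLY_PATH_PREFIXES)

# Example: gated tenants can still hit /auth/login but not /analytics/usage.
assert is_path_allowed_when_gated("/auth/login")
assert requires_ee_license("/analytics/usage")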

View File

@@ -334,11 +334,9 @@ def fetch_assistant_unique_users_total(
# Users can view assistant stats if they created the persona,
# or if they are an admin
def user_can_view_assistant_stats(
db_session: Session, user: User | None, assistant_id: int
db_session: Session, user: User, assistant_id: int
) -> bool:
# If user is None and auth is disabled, assume the user is an admin
if user is None or user.role == UserRole.ADMIN:
if user.role == UserRole.ADMIN:
return True
# Check if the user created the persona

Some files were not shown because too many files have changed in this diff.