Compare commits


1 Commit

Author SHA1 Message Date
Jamison Lahman
7535c3593f chore(fe): rm redundant alignBubble 2026-03-04 20:45:29 -08:00
134 changed files with 1168 additions and 8663 deletions

View File

@@ -1,161 +0,0 @@
---
name: onyx-cli
description: Query the Onyx knowledge base using the onyx-cli command. Use when the user wants to search company documents, ask questions about internal knowledge, query connected data sources, or look up information stored in Onyx.
---
# Onyx CLI — Agent Tool
Onyx is an enterprise search and Gen-AI platform that connects to company documents, apps, and people. The `onyx-cli` tool provides non-interactive commands to query the Onyx knowledge base and list available agents.
## Prerequisites
### 1. Check if installed
```bash
which onyx-cli
```
### 2. Install (if needed)
**Primary — pip:**
```bash
pip install onyx-cli
```
**From source (Go):**
```bash
cd cli && go build -o onyx-cli . && sudo mv onyx-cli /usr/local/bin/
```
### 3. Check if configured
```bash
onyx-cli validate-config
```
This checks that the config file exists and an API key is present, then tests the server connection via `/api/me`. It exits 0 on success and non-zero with a descriptive error on failure.
If unconfigured, you have two options:
**Option A — Interactive setup (requires user input):**
```bash
onyx-cli configure
```
This prompts for the Onyx server URL and API key, tests the connection, and saves the config.
**Option B — Environment variables (non-interactive, preferred for agents):**
```bash
export ONYX_SERVER_URL="https://your-onyx-server.com" # default: https://cloud.onyx.app
export ONYX_API_KEY="your-api-key"
```
Environment variables override the config file. If these are set, no config file is needed.
| Variable | Required | Description |
|----------|----------|-------------|
| `ONYX_SERVER_URL` | No | Onyx server base URL (default: `https://cloud.onyx.app`) |
| `ONYX_API_KEY` | Yes | API key for authentication |
| `ONYX_PERSONA_ID` | No | Default agent/persona ID |
If neither the config file nor environment variables are set, tell the user that `onyx-cli` needs to be configured and ask them to either:
- Run `onyx-cli configure` interactively, or
- Set `ONYX_SERVER_URL` and `ONYX_API_KEY` environment variables
## Commands
### Validate configuration
```bash
onyx-cli validate-config
```
Checks that the config file exists and an API key is present, then tests the server connection. Run this before `ask` or `agents` to confirm the CLI is properly set up.
### List available agents
```bash
onyx-cli agents
```
Prints a table of agent IDs, names, and descriptions. Use `--json` for structured output:
```bash
onyx-cli agents --json
```
Use agent IDs with `ask --agent-id` to query a specific agent.
### Basic query (plain text output)
```bash
onyx-cli ask "What is our company's PTO policy?"
```
Streams the answer as plain text to stdout. Exit code 0 on success, non-zero on error.
### JSON output (structured events)
```bash
onyx-cli ask --json "What authentication methods do we support?"
```
Outputs parsed stream events as newline-delimited JSON (one object per line). Key event types include message deltas, stop, errors, search start, and citation payloads.
| Event Type | Description |
|------------|-------------|
| `message_delta` | Content token — concatenate all `content` fields for the full answer |
| `stop` | Stream complete |
| `error` | Error with `error` message field |
| `search_tool_start` | Onyx started searching documents |
| `citation_info` | Source citation with `citation_number` and `document_id` |
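Consuming the `--json` stream amounts to reading one JSON object per line and dispatching on its event type. A minimal sketch follows; the exact field names (`type`, `content`, `citation_number`, `document_id`) are assumptions based on the table above, and the sample stream is invented for illustration.

```python
import json

def assemble_answer(ndjson_lines):
    """Concatenate message_delta content and collect citations.
    Field names here are assumed, not verified against the CLI's
    actual event schema."""
    answer_parts, citations = [], []
    for line in ndjson_lines:
        event = json.loads(line)
        kind = event.get("type")
        if kind == "message_delta":
            answer_parts.append(event.get("content", ""))
        elif kind == "citation_info":
            citations.append(
                (event.get("citation_number"), event.get("document_id"))
            )
        elif kind == "error":
            raise RuntimeError(event.get("error"))
        elif kind == "stop":
            break
    return "".join(answer_parts), citations

# Hypothetical sample stream:
sample = [
    '{"type": "search_tool_start"}',
    '{"type": "message_delta", "content": "PTO is "}',
    '{"type": "message_delta", "content": "20 days."}',
    '{"type": "citation_info", "citation_number": 1, "document_id": "doc-42"}',
    '{"type": "stop"}',
]
answer, cites = assemble_answer(sample)
# answer == "PTO is 20 days."
```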
### Specify an agent
```bash
onyx-cli ask --agent-id 5 "Summarize our Q4 roadmap"
```
Uses a specific Onyx agent/persona instead of the default.
### All flags
| Flag | Type | Description |
|------|------|-------------|
| `--agent-id` | int | Agent ID to use (overrides default) |
| `--json` | bool | Output raw NDJSON events instead of plain text |
## When to Use
Use `onyx-cli ask` when:
- The user asks about company-specific information (policies, docs, processes)
- You need to search internal knowledge bases or connected data sources
- The user references Onyx, asks you to "search Onyx", or wants to query their documents
- You need context from company wikis, Confluence, Google Drive, Slack, or other connected sources
Do NOT use when:
- The question is about general programming knowledge (use your own knowledge)
- The user is asking about code in the current repository (use grep/read tools)
- The user hasn't mentioned Onyx and the question doesn't require internal company data
## Examples
```bash
# Simple question
onyx-cli ask "What are the steps to deploy to production?"
# Get structured output for parsing
onyx-cli ask --json "List all active API integrations"
# Use a specialized agent
onyx-cli ask --agent-id 3 "What were the action items from last week's standup?"
# Pipe the answer into another command
onyx-cli ask "What is the database schema for users?" | head -20
```

View File

@@ -182,52 +182,8 @@ jobs:
title: "🚨 Version Tag Check Failed"
ref-name: ${{ github.ref_name }}
# Create GitHub release first, before desktop builds start.
# This ensures all desktop matrix jobs upload to the same release instead of
# racing to create duplicate releases.
create-release:
needs: determine-builds
if: needs.determine-builds.outputs.build-desktop == 'true'
runs-on: ubuntu-slim
timeout-minutes: 10
permissions:
contents: write
outputs:
release-id: ${{ steps.create-release.outputs.id }}
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
with:
persist-credentials: false
- name: Determine release tag
id: release-tag
env:
IS_TEST_RUN: ${{ needs.determine-builds.outputs.is-test-run }}
SHORT_SHA: ${{ needs.determine-builds.outputs.short-sha }}
run: |
if [ "${IS_TEST_RUN}" == "true" ]; then
echo "tag=v0.0.0-dev+${SHORT_SHA}" >> "$GITHUB_OUTPUT"
else
echo "tag=${GITHUB_REF_NAME}" >> "$GITHUB_OUTPUT"
fi
- name: Create GitHub Release
id: create-release
uses: softprops/action-gh-release@da05d552573ad5aba039eaac05058a918a7bf631 # ratchet:softprops/action-gh-release@v2
with:
tag_name: ${{ steps.release-tag.outputs.tag }}
name: ${{ steps.release-tag.outputs.tag }}
body: "See the assets to download this version and install."
draft: true
prerelease: false
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
build-desktop:
needs:
- determine-builds
- create-release
needs: determine-builds
if: needs.determine-builds.outputs.build-desktop == 'true'
permissions:
id-token: write
@@ -252,12 +208,12 @@ jobs:
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6.0.2
with:
# NOTE: persist-credentials is needed for tauri-action to upload assets to GitHub releases.
# NOTE: persist-credentials is needed for tauri-action to create GitHub releases.
persist-credentials: true # zizmor: ignore[artipacked]
- name: Configure AWS credentials
if: startsWith(matrix.platform, 'macos-')
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -397,9 +353,11 @@ jobs:
APPLE_SIGNING_IDENTITY: ${{ env.CERT_ID }}
APPLE_TEAM_ID: ${{ env.APPLE_TEAM_ID }}
with:
# Use the release created by the create-release job to avoid race conditions
# when multiple matrix jobs try to create/update the same release simultaneously
releaseId: ${{ needs.create-release.outputs.release-id }}
tagName: ${{ needs.determine-builds.outputs.is-test-run != 'true' && 'v__VERSION__' || format('v0.0.0-dev+{0}', needs.determine-builds.outputs.short-sha) }}
releaseName: ${{ needs.determine-builds.outputs.is-test-run != 'true' && 'v__VERSION__' || format('v0.0.0-dev+{0}', needs.determine-builds.outputs.short-sha) }}
releaseBody: "See the assets to download this version and install."
releaseDraft: true
prerelease: false
assetNamePattern: "[name]_[arch][ext]"
args: ${{ matrix.args }}
@@ -426,7 +384,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -500,7 +458,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -569,7 +527,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -639,7 +597,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -721,7 +679,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -798,7 +756,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -865,7 +823,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -938,7 +896,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1006,7 +964,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1076,7 +1034,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1149,7 +1107,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1218,7 +1176,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1288,7 +1246,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1368,7 +1326,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1442,7 +1400,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1507,7 +1465,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1562,7 +1520,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1622,7 +1580,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -1679,7 +1637,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2

View File

@@ -1,56 +0,0 @@
name: Golang Tests
concurrency:
group: Golang-Tests-${{ github.workflow }}-${{ github.head_ref || github.event.workflow_run.head_branch || github.run_id }}
cancel-in-progress: true
on:
merge_group:
pull_request:
branches:
- main
- "release/**"
push:
tags:
- "v*.*.*"
permissions: {}
env:
GO_VERSION: "1.26"
jobs:
detect-modules:
runs-on: ubuntu-latest
timeout-minutes: 10
outputs:
modules: ${{ steps.set-modules.outputs.modules }}
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8
with:
persist-credentials: false
- id: set-modules
run: echo "modules=$(find . -name 'go.mod' -exec dirname {} \; | jq -Rc '[.,inputs]')" >> "$GITHUB_OUTPUT"
golang:
needs: detect-modules
runs-on: ubuntu-latest
timeout-minutes: 10
strategy:
matrix:
modules: ${{ fromJSON(needs.detect-modules.outputs.modules) }}
steps:
- uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # ratchet:actions/checkout@v6
with:
persist-credentials: false
- uses: actions/setup-go@4dc6199c7b1a012772edbd06daecab0f50c9053c # zizmor: ignore[cache-poisoning]
with:
go-version: ${{ env.GO_VERSION }}
cache-dependency-path: "**/go.sum"
- run: go mod tidy
working-directory: ${{ matrix.modules }}
- run: git diff --exit-code go.mod go.sum
working-directory: ${{ matrix.modules }}
- run: go test ./...
working-directory: ${{ matrix.modules }}

View File

@@ -71,7 +71,7 @@ jobs:
- name: Create kind cluster
if: steps.list-changed.outputs.changed == 'true'
uses: helm/kind-action@ef37e7f390d99f746eb8b610417061a60e82a6cc # ratchet:helm/kind-action@v1.14.0
uses: helm/kind-action@92086f6be054225fa813e0a4b13787fc9088faab # ratchet:helm/kind-action@v1.13.0
- name: Pre-install cluster status check
if: steps.list-changed.outputs.changed == 'true'

View File

@@ -461,7 +461,7 @@ jobs:
# --- Visual Regression Diff ---
- name: Configure AWS credentials
if: always()
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2

View File

@@ -38,9 +38,9 @@ jobs:
- name: Install node dependencies
working-directory: ./web
run: npm ci
- uses: j178/prek-action@0bb87d7f00b0c99306c8bcb8b8beba1eb581c037 # ratchet:j178/prek-action@v1
- uses: j178/prek-action@9d6a3097e0c1865ecce00cfb89fe80f2ee91b547 # ratchet:j178/prek-action@v1
with:
prek-version: '0.3.4'
prek-version: '0.2.21'
extra-args: ${{ github.event_name == 'pull_request' && format('--from-ref {0} --to-ref {1}', github.event.pull_request.base.sha, github.event.pull_request.head.sha) || github.event_name == 'merge_group' && format('--from-ref {0} --to-ref {1}', github.event.merge_group.base_sha, github.event.merge_group.head_sha) || github.ref_name == 'main' && '--all-files' || '' }}
- name: Check Actions
uses: giner/check-actions@28d366c7cbbe235f9624a88aa31a628167eee28c # ratchet:giner/check-actions@v1.0.1

View File

@@ -73,7 +73,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -116,7 +116,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -158,7 +158,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -264,7 +264,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2

View File

@@ -110,7 +110,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -180,7 +180,7 @@ jobs:
persist-credentials: false
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2
@@ -244,7 +244,7 @@ jobs:
- uses: runs-on/action@cd2b598b0515d39d78c38a02d529db87d2196d1e # ratchet:runs-on/action@v2
- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7
uses: aws-actions/configure-aws-credentials@61815dcd50bd041e203e49132bacad1fd04d2708
with:
role-to-assume: ${{ secrets.AWS_OIDC_ROLE_ARN }}
aws-region: us-east-2

View File

@@ -119,11 +119,10 @@ repos:
]
- repo: https://github.com/golangci/golangci-lint
rev: 5d1e709b7be35cb2025444e19de266b056b7b7ee # frozen: v2.10.1
rev: 9f61b0f53f80672872fced07b6874397c3ed197b # frozen: v2.7.2
hooks:
- id: golangci-lint
language_version: "1.26.0"
entry: bash -c "find . -name go.mod -not -path './.venv/*' -print0 | xargs -0 -I{} bash -c 'cd \"$(dirname {})\" && golangci-lint run ./...'"
entry: bash -c "find tools/ -name go.mod -print0 | xargs -0 -I{} bash -c 'cd \"$(dirname {})\" && golangci-lint run ./...'"
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.

View File

@@ -104,10 +104,6 @@ Onyx uses Celery for asynchronous task processing with multiple specialized work
- Always use `@shared_task` rather than `@celery_app`
- Put tasks under `background/celery/tasks/` or `ee/background/celery/tasks`
- Never enqueue a task without an expiration. Always supply `expires=` when
sending tasks, either from the beat schedule or directly from another task. It
should never be acceptable to submit code which enqueues tasks without an
expiration, as doing so can lead to unbounded task queue growth.
**Defining APIs**:
When creating new FastAPI APIs, do NOT use the `response_model` field. Instead, just type the

View File

@@ -246,11 +246,7 @@ async def get_billing_information(
)
except OnyxError as e:
# Open circuit breaker on connection failures (self-hosted only)
if e.status_code in (
OnyxErrorCode.BAD_GATEWAY.status_code,
OnyxErrorCode.SERVICE_UNAVAILABLE.status_code,
OnyxErrorCode.GATEWAY_TIMEOUT.status_code,
):
if e.status_code in (502, 503, 504):
_open_billing_circuit()
raise

View File

@@ -58,6 +58,8 @@ from onyx.file_store.document_batch_storage import DocumentBatchStorage
from onyx.file_store.document_batch_storage import get_document_batch_storage
from onyx.indexing.indexing_heartbeat import IndexingHeartbeatInterface
from onyx.indexing.indexing_pipeline import index_doc_batch_prepare
from onyx.indexing.postgres_sanitization import sanitize_document_for_postgres
from onyx.indexing.postgres_sanitization import sanitize_hierarchy_nodes_for_postgres
from onyx.redis.redis_hierarchy import cache_hierarchy_nodes_batch
from onyx.redis.redis_hierarchy import ensure_source_node_exists
from onyx.redis.redis_hierarchy import get_node_id_from_raw_id
@@ -69,8 +71,6 @@ from onyx.server.features.build.indexing.persistent_document_writer import (
)
from onyx.utils.logger import setup_logger
from onyx.utils.middleware import make_randomized_onyx_request_id
from onyx.utils.postgres_sanitization import sanitize_document_for_postgres
from onyx.utils.postgres_sanitization import sanitize_hierarchy_nodes_for_postgres
from onyx.utils.variable_functionality import global_version
from shared_configs.configs import MULTI_TENANT
from shared_configs.contextvars import INDEX_ATTEMPT_INFO_CONTEXTVAR

View File

@@ -36,6 +36,7 @@ from onyx.db.memory import add_memory
from onyx.db.memory import update_memory_at_index
from onyx.db.memory import UserMemoryContext
from onyx.db.models import Persona
from onyx.llm.constants import LlmProviderNames
from onyx.llm.interfaces import LLM
from onyx.llm.interfaces import LLMUserIdentity
from onyx.llm.interfaces import ToolChoiceOptions
@@ -83,6 +84,28 @@ def _looks_like_xml_tool_call_payload(text: str | None) -> bool:
)
def _should_keep_bedrock_tool_definitions(
llm: object, simple_chat_history: list[ChatMessageSimple]
) -> bool:
"""Bedrock requires tool config when history includes toolUse/toolResult blocks."""
model_provider = getattr(getattr(llm, "config", None), "model_provider", None)
if model_provider not in {
LlmProviderNames.BEDROCK,
LlmProviderNames.BEDROCK_CONVERSE,
}:
return False
return any(
(
msg.message_type == MessageType.ASSISTANT
and msg.tool_calls
and len(msg.tool_calls) > 0
)
or msg.message_type == MessageType.TOOL_CALL_RESPONSE
for msg in simple_chat_history
)
def _try_fallback_tool_extraction(
llm_step_result: LlmStepResult,
tool_choice: ToolChoiceOptions,
@@ -663,7 +686,12 @@ def run_llm_loop(
elif out_of_cycles or ran_image_gen:
# Last cycle, no tools allowed, just answer!
tool_choice = ToolChoiceOptions.NONE
final_tools = []
# Bedrock requires tool config in requests that include toolUse/toolResult history.
final_tools = (
tools
if _should_keep_bedrock_tool_definitions(llm, simple_chat_history)
else []
)
else:
tool_choice = ToolChoiceOptions.AUTO
final_tools = tools
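The Bedrock predicate introduced in this diff can be exercised in isolation with stand-in objects. The sketch below is a simplified re-implementation; the provider strings and the stubs for Onyx's LLM config and chat-message types are assumptions, not the actual Onyx classes.

```python
from types import SimpleNamespace

def should_keep_bedrock_tool_definitions(llm, history) -> bool:
    """Simplified version of the check above: keep tool definitions
    only for Bedrock providers when the history already contains
    tool calls or tool results (provider names are assumed)."""
    provider = getattr(getattr(llm, "config", None), "model_provider", None)
    if provider not in {"bedrock", "bedrock_converse"}:
        return False
    return any(
        (msg.message_type == "assistant" and msg.tool_calls)
        or msg.message_type == "tool_call_response"
        for msg in history
    )

# Stand-in objects for illustration:
bedrock_llm = SimpleNamespace(config=SimpleNamespace(model_provider="bedrock"))
openai_llm = SimpleNamespace(config=SimpleNamespace(model_provider="openai"))
history = [SimpleNamespace(message_type="assistant", tool_calls=[{"name": "search"}])]
```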

View File

@@ -55,7 +55,6 @@ from onyx.tools.models import ToolCallKickoff
from onyx.tracing.framework.create import generation_span
from onyx.utils.b64 import get_image_type_from_bytes
from onyx.utils.logger import setup_logger
from onyx.utils.postgres_sanitization import sanitize_string
from onyx.utils.text_processing import find_all_json_objects
logger = setup_logger()
@@ -167,6 +166,15 @@ def _find_function_calls_open_marker(text_lower: str) -> int:
search_from = idx + 1
def _sanitize_llm_output(value: str) -> str:
"""Remove characters that PostgreSQL's text/JSONB types cannot store.
- NULL bytes (\x00): Not allowed in PostgreSQL text types
- UTF-16 surrogates (\ud800-\udfff): Invalid in UTF-8 encoding
"""
return "".join(c for c in value if c != "\x00" and not ("\ud800" <= c <= "\udfff"))
def _try_parse_json_string(value: Any) -> Any:
"""Attempt to parse a JSON string value into its Python equivalent.
@@ -214,7 +222,9 @@ def _parse_tool_args_to_dict(raw_args: Any) -> dict[str, Any]:
if isinstance(raw_args, dict):
# Parse any string values that look like JSON arrays/objects
return {
k: _try_parse_json_string(sanitize_string(v) if isinstance(v, str) else v)
k: _try_parse_json_string(
_sanitize_llm_output(v) if isinstance(v, str) else v
)
for k, v in raw_args.items()
}
@@ -222,7 +232,7 @@ def _parse_tool_args_to_dict(raw_args: Any) -> dict[str, Any]:
return {}
# Sanitize before parsing to remove NULL bytes and surrogates
raw_args = sanitize_string(raw_args)
raw_args = _sanitize_llm_output(raw_args)
try:
parsed1: Any = json.loads(raw_args)
@@ -535,12 +545,12 @@ def _extract_xml_attribute(attrs: str, attr_name: str) -> str | None:
)
if not attr_match:
return None
return sanitize_string(unescape(attr_match.group(2).strip()))
return _sanitize_llm_output(unescape(attr_match.group(2).strip()))
def _parse_xml_parameter_value(raw_value: str, string_attr: str | None) -> Any:
"""Parse a parameter value from XML-style tool call payloads."""
value = sanitize_string(unescape(raw_value).strip())
value = _sanitize_llm_output(unescape(raw_value).strip())
if string_attr and string_attr.lower() == "true":
return value
@@ -559,7 +569,6 @@ def _resolve_tool_arguments(obj: dict[str, Any]) -> dict[str, Any] | None:
"""
arguments = obj.get("arguments", obj.get("parameters", {}))
if isinstance(arguments, str):
arguments = sanitize_string(arguments)
try:
arguments = json.loads(arguments)
except json.JSONDecodeError:
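The `_sanitize_llm_output` helper added in this diff is self-contained and easy to verify; a standalone copy for illustration:

```python
def sanitize_llm_output(value: str) -> str:
    """Drop characters PostgreSQL text/JSONB cannot store:
    NUL bytes and lone UTF-16 surrogates (U+D800..U+DFFF)."""
    return "".join(
        c for c in value
        if c != "\x00" and not ("\ud800" <= c <= "\udfff")
    )
```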

View File

@@ -19,7 +19,6 @@ from onyx.natural_language_processing.utils import get_tokenizer
from onyx.server.query_and_chat.chat_utils import mime_type_to_chat_file_type
from onyx.tools.models import ToolCallInfo
from onyx.utils.logger import setup_logger
from onyx.utils.postgres_sanitization import sanitize_string
logger = setup_logger()
@@ -202,13 +201,8 @@ def save_chat_turn(
pre_answer_processing_time: Duration of processing before answer starts (in seconds)
"""
# 1. Update ChatMessage with message content, reasoning tokens, and token count
sanitized_message_text = (
sanitize_string(message_text) if message_text else message_text
)
assistant_message.message = sanitized_message_text
assistant_message.reasoning_tokens = (
sanitize_string(reasoning_tokens) if reasoning_tokens else reasoning_tokens
)
assistant_message.message = message_text
assistant_message.reasoning_tokens = reasoning_tokens
assistant_message.is_clarification = is_clarification
# Use pre-answer processing time (captured when MESSAGE_START was emitted)
@@ -218,10 +212,8 @@ def save_chat_turn(
# Calculate token count using default tokenizer, when storing, this should not use the LLM
# specific one so we use a system default tokenizer here.
default_tokenizer = get_tokenizer(None, None)
if sanitized_message_text:
assistant_message.token_count = len(
default_tokenizer.encode(sanitized_message_text)
)
if message_text:
assistant_message.token_count = len(default_tokenizer.encode(message_text))
else:
assistant_message.token_count = 0
@@ -336,10 +328,8 @@ def save_chat_turn(
# 8. Attach code interpreter generated files that the assistant actually
# referenced in its response, so they are available via load_all_chat_files
# on subsequent turns. Files not mentioned are intermediate artifacts.
if sanitized_message_text:
referenced = _extract_referenced_file_descriptors(
tool_calls, sanitized_message_text
)
if message_text:
referenced = _extract_referenced_file_descriptors(tool_calls, message_text)
if referenced:
existing_files = assistant_message.files or []
assistant_message.files = existing_files + referenced

View File

@@ -38,7 +38,6 @@ from onyx.llm.override_models import LLMOverride
from onyx.llm.override_models import PromptOverride
from onyx.server.query_and_chat.models import ChatMessageDetail
from onyx.utils.logger import setup_logger
from onyx.utils.postgres_sanitization import sanitize_string
logger = setup_logger()
@@ -676,43 +675,58 @@ def set_as_latest_chat_message(
db_session.commit()
def _sanitize_for_postgres(value: str) -> str:
"""Remove NUL (0x00) characters from strings as PostgreSQL doesn't allow them."""
sanitized = value.replace("\x00", "")
if value and not sanitized:
logger.warning("Sanitization removed all characters from string")
return sanitized
def _sanitize_list_for_postgres(values: list[str]) -> list[str]:
"""Remove NUL (0x00) characters from all strings in a list."""
return [_sanitize_for_postgres(v) for v in values]
def create_db_search_doc(
server_search_doc: ServerSearchDoc,
db_session: Session,
commit: bool = True,
) -> DBSearchDoc:
# Sanitize string fields to remove NUL characters (PostgreSQL doesn't allow them)
db_search_doc = DBSearchDoc(
document_id=sanitize_string(server_search_doc.document_id),
document_id=_sanitize_for_postgres(server_search_doc.document_id),
chunk_ind=server_search_doc.chunk_ind,
semantic_id=sanitize_string(server_search_doc.semantic_identifier),
semantic_id=_sanitize_for_postgres(server_search_doc.semantic_identifier),
link=(
sanitize_string(server_search_doc.link)
_sanitize_for_postgres(server_search_doc.link)
if server_search_doc.link is not None
else None
),
blurb=sanitize_string(server_search_doc.blurb),
blurb=_sanitize_for_postgres(server_search_doc.blurb),
source_type=server_search_doc.source_type,
boost=server_search_doc.boost,
hidden=server_search_doc.hidden,
doc_metadata=server_search_doc.metadata,
is_relevant=server_search_doc.is_relevant,
relevance_explanation=(
sanitize_string(server_search_doc.relevance_explanation)
_sanitize_for_postgres(server_search_doc.relevance_explanation)
if server_search_doc.relevance_explanation is not None
else None
),
# For docs further down that aren't reranked, we can't use the retrieval score
score=server_search_doc.score or 0.0,
match_highlights=[
sanitize_string(h) for h in server_search_doc.match_highlights
],
match_highlights=_sanitize_list_for_postgres(
server_search_doc.match_highlights
),
updated_at=server_search_doc.updated_at,
primary_owners=(
[sanitize_string(o) for o in server_search_doc.primary_owners]
_sanitize_list_for_postgres(server_search_doc.primary_owners)
if server_search_doc.primary_owners is not None
else None
),
secondary_owners=(
[sanitize_string(o) for o in server_search_doc.secondary_owners]
_sanitize_list_for_postgres(server_search_doc.secondary_owners)
if server_search_doc.secondary_owners is not None
else None
),

View File

@@ -25,11 +25,8 @@ from onyx.server.manage.embedding.models import CloudEmbeddingProvider
from onyx.server.manage.embedding.models import CloudEmbeddingProviderCreationRequest
from onyx.server.manage.llm.models import LLMProviderUpsertRequest
from onyx.server.manage.llm.models import LLMProviderView
from onyx.utils.logger import setup_logger
from shared_configs.enums import EmbeddingProvider
logger = setup_logger()
def update_group_llm_provider_relationships__no_commit(
llm_provider_id: int,
@@ -815,43 +812,6 @@ def sync_auto_mode_models(
changes += 1
db_session.commit()
# Update the default if this provider currently holds the global CHAT default
recommended_default = llm_recommendations.get_default_model(provider.provider)
if recommended_default:
current_default_name = db_session.scalar(
select(ModelConfiguration.name)
.join(
LLMModelFlow,
LLMModelFlow.model_configuration_id == ModelConfiguration.id,
)
.where(
ModelConfiguration.llm_provider_id == provider.id,
LLMModelFlow.llm_model_flow_type == LLMModelFlowType.CHAT,
LLMModelFlow.is_default == True, # noqa: E712
)
)
if (
current_default_name is not None
and current_default_name != recommended_default.name
):
try:
_update_default_model(
db_session=db_session,
provider_id=provider.id,
model=recommended_default.name,
flow_type=LLMModelFlowType.CHAT,
)
changes += 1
except ValueError:
logger.warning(
"Recommended default model '%s' not found "
"for provider_id=%s; skipping default update.",
recommended_default.name,
provider.id,
)
return changes

View File

@@ -13,15 +13,12 @@ from onyx.db.constants import UNSET
from onyx.db.constants import UnsetType
from onyx.db.enums import MCPServerStatus
from onyx.db.models import MCPServer
from onyx.db.models import OAuthConfig
from onyx.db.models import Tool
from onyx.db.models import ToolCall
from onyx.server.features.tool.models import Header
from onyx.tools.built_in_tools import BUILT_IN_TOOL_TYPES
from onyx.utils.headers import HeaderItemDict
from onyx.utils.logger import setup_logger
from onyx.utils.postgres_sanitization import sanitize_json_like
from onyx.utils.postgres_sanitization import sanitize_string
if TYPE_CHECKING:
pass
@@ -162,26 +159,10 @@ def update_tool(
]
if passthrough_auth is not None:
tool.passthrough_auth = passthrough_auth
old_oauth_config_id = tool.oauth_config_id
if not isinstance(oauth_config_id, UnsetType):
tool.oauth_config_id = oauth_config_id
db_session.flush()
# Clean up orphaned OAuthConfig if the oauth_config_id was changed
if (
old_oauth_config_id is not None
and not isinstance(oauth_config_id, UnsetType)
and old_oauth_config_id != oauth_config_id
):
other_tools = db_session.scalars(
select(Tool).where(Tool.oauth_config_id == old_oauth_config_id)
).all()
if not other_tools:
oauth_config = db_session.get(OAuthConfig, old_oauth_config_id)
if oauth_config:
db_session.delete(oauth_config)
db_session.commit()
return tool
@@ -190,21 +171,8 @@ def delete_tool__no_commit(tool_id: int, db_session: Session) -> None:
if tool is None:
raise ValueError(f"Tool with ID {tool_id} does not exist")
oauth_config_id = tool.oauth_config_id
db_session.delete(tool)
db_session.flush()
# Clean up orphaned OAuthConfig if no other tools reference it
if oauth_config_id is not None:
other_tools = db_session.scalars(
select(Tool).where(Tool.oauth_config_id == oauth_config_id)
).all()
if not other_tools:
oauth_config = db_session.get(OAuthConfig, oauth_config_id)
if oauth_config:
db_session.delete(oauth_config)
db_session.flush()
db_session.flush() # Don't commit yet, let caller decide when to commit
def get_builtin_tool(
@@ -288,13 +256,11 @@ def create_tool_call_no_commit(
tab_index=tab_index,
tool_id=tool_id,
tool_call_id=tool_call_id,
-reasoning_tokens=(
-sanitize_string(reasoning_tokens) if reasoning_tokens else reasoning_tokens
-),
-tool_call_arguments=sanitize_json_like(tool_call_arguments),
-tool_call_response=sanitize_json_like(tool_call_response),
+reasoning_tokens=reasoning_tokens,
+tool_call_arguments=tool_call_arguments,
+tool_call_response=tool_call_response,
tool_call_tokens=tool_call_tokens,
-generated_images=sanitize_json_like(generated_images),
+generated_images=generated_images,
)
db_session.add(tool_call)

View File

@@ -61,25 +61,6 @@ class SearchHit(BaseModel, Generic[SchemaDocumentModel]):
explanation: dict[str, Any] | None = None
class IndexInfo(BaseModel):
"""
Represents information about an OpenSearch index.
"""
model_config = {"frozen": True}
name: str
health: str
status: str
num_primary_shards: str
num_replica_shards: str
docs_count: str
docs_deleted: str
created_at: str
total_size: str
primary_shards_size: str
def get_new_body_without_vectors(body: dict[str, Any]) -> dict[str, Any]:
"""Recursively replaces vectors in the body with their length.
@@ -178,8 +159,8 @@ class OpenSearchClient(AbstractContextManager):
Raises:
Exception: There was an error creating the search pipeline.
"""
-response = self._client.search_pipeline.put(id=pipeline_id, body=pipeline_body)
-if not response.get("acknowledged", False):
+result = self._client.search_pipeline.put(id=pipeline_id, body=pipeline_body)
+if not result.get("acknowledged", False):
raise RuntimeError(f"Failed to create search pipeline {pipeline_id}.")
@log_function_time(print_only=True, debug_only=True, include_args=True)
@@ -192,8 +173,8 @@ class OpenSearchClient(AbstractContextManager):
Raises:
Exception: There was an error deleting the search pipeline.
"""
-response = self._client.search_pipeline.delete(id=pipeline_id)
-if not response.get("acknowledged", False):
+result = self._client.search_pipeline.delete(id=pipeline_id)
+if not result.get("acknowledged", False):
raise RuntimeError(f"Failed to delete search pipeline {pipeline_id}.")
@log_function_time(print_only=True, debug_only=True, include_args=True)
@@ -217,34 +198,6 @@ class OpenSearchClient(AbstractContextManager):
logger.error(f"Failed to put cluster settings: {response}.")
return False
@log_function_time(print_only=True, debug_only=True)
def list_indices_with_info(self) -> list[IndexInfo]:
"""
Lists the indices in the OpenSearch cluster with information about each
index.
Returns:
A list of IndexInfo objects for each index.
"""
response = self._client.cat.indices(format="json")
indices: list[IndexInfo] = []
for raw_index_info in response:
indices.append(
IndexInfo(
name=raw_index_info.get("index", ""),
health=raw_index_info.get("health", ""),
status=raw_index_info.get("status", ""),
num_primary_shards=raw_index_info.get("pri", ""),
num_replica_shards=raw_index_info.get("rep", ""),
docs_count=raw_index_info.get("docs.count", ""),
docs_deleted=raw_index_info.get("docs.deleted", ""),
created_at=raw_index_info.get("creation.date.string", ""),
total_size=raw_index_info.get("store.size", ""),
primary_shards_size=raw_index_info.get("pri.store.size", ""),
)
)
return indices
@log_function_time(print_only=True, debug_only=True)
def ping(self) -> bool:
"""Pings the OpenSearch cluster.

View File

@@ -739,8 +739,7 @@ class OpenSearchDocumentIndex(DocumentIndex):
The number of chunks successfully deleted.
"""
logger.debug(
-f"[OpenSearchDocumentIndex] Deleting document {document_id} from index "
-f"{self._index_name}."
+f"[OpenSearchDocumentIndex] Deleting document {document_id} from index {self._index_name}."
)
query_body = DocumentQuery.delete_from_document_id_query(
document_id=document_id,
@@ -776,8 +775,7 @@ class OpenSearchDocumentIndex(DocumentIndex):
specified documents.
"""
logger.debug(
-f"[OpenSearchDocumentIndex] Updating {len(update_requests)} chunks for index "
-f"{self._index_name}."
+f"[OpenSearchDocumentIndex] Updating {len(update_requests)} chunks for index {self._index_name}."
)
for update_request in update_requests:
properties_to_update: dict[str, Any] = dict()
@@ -833,11 +831,9 @@ class OpenSearchDocumentIndex(DocumentIndex):
# here.
# TODO(andrei): Fix the aforementioned race condition.
raise ChunkCountNotFoundError(
-f"Tried to update document {doc_id} but its chunk count is not known. "
-"Older versions of the application used to permit this but is not a "
-"supported state for a document when using OpenSearch. The document was "
-"likely just added to the indexing pipeline and the chunk count will be "
-"updated shortly."
+f"Tried to update document {doc_id} but its chunk count is not known. Older versions of the "
+"application used to permit this but is not a supported state for a document when using OpenSearch. "
+"The document was likely just added to the indexing pipeline and the chunk count will be updated shortly."
)
if doc_chunk_count == 0:
raise ValueError(
@@ -869,8 +865,7 @@ class OpenSearchDocumentIndex(DocumentIndex):
chunk IDs vs querying for matching document chunks.
"""
logger.debug(
-f"[OpenSearchDocumentIndex] Retrieving {len(chunk_requests)} chunks for index "
-f"{self._index_name}."
+f"[OpenSearchDocumentIndex] Retrieving {len(chunk_requests)} chunks for index {self._index_name}."
)
results: list[InferenceChunk] = []
for chunk_request in chunk_requests:
@@ -917,8 +912,7 @@ class OpenSearchDocumentIndex(DocumentIndex):
num_to_retrieve: int,
) -> list[InferenceChunk]:
logger.debug(
-f"[OpenSearchDocumentIndex] Hybrid retrieving {num_to_retrieve} chunks for index "
-f"{self._index_name}."
+f"[OpenSearchDocumentIndex] Hybrid retrieving {num_to_retrieve} chunks for index {self._index_name}."
)
# TODO(andrei): This could be better, the caller should just make this
# decision when passing in the query param. See the above comment in the
@@ -938,10 +932,8 @@ class OpenSearchDocumentIndex(DocumentIndex):
index_filters=filters,
include_hidden=False,
)
-# NOTE: Using z-score normalization here because it's better for hybrid
-# search from a theoretical standpoint. Empirically on a small dataset
-# of up to 10K docs, it's not very different. Likely more impactful at
-# scale.
+# NOTE: Using z-score normalization here because it's better for hybrid search from a theoretical standpoint.
+# Empirically on a small dataset of up to 10K docs, it's not very different. Likely more impactful at scale.
# https://opensearch.org/blog/introducing-the-z-score-normalization-technique-for-hybrid-search/
search_hits: list[SearchHit[DocumentChunk]] = self._client.search(
body=query_body,
@@ -968,8 +960,7 @@ class OpenSearchDocumentIndex(DocumentIndex):
dirty: bool | None = None, # noqa: ARG002
) -> list[InferenceChunk]:
logger.debug(
-f"[OpenSearchDocumentIndex] Randomly retrieving {num_to_retrieve} chunks for index "
-f"{self._index_name}."
+f"[OpenSearchDocumentIndex] Randomly retrieving {num_to_retrieve} chunks for index {self._index_name}."
)
query_body = DocumentQuery.get_random_search_query(
tenant_state=self._tenant_state,
@@ -999,8 +990,7 @@ class OpenSearchDocumentIndex(DocumentIndex):
complete.
"""
logger.debug(
-f"[OpenSearchDocumentIndex] Indexing {len(chunks)} raw chunks for index "
-f"{self._index_name}."
+f"[OpenSearchDocumentIndex] Indexing {len(chunks)} raw chunks for index {self._index_name}."
)
# Do not raise if the document already exists, just update. This is
# because the document may already have been indexed during the

View File
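The z-score normalization mentioned in the hybrid-search comment above can be sketched in isolation. This is an illustrative reimplementation of the fusion-normalization idea, not the OpenSearch pipeline code:

```python
import statistics


def z_score_normalize(scores: list[float]) -> list[float]:
    # Z-score normalization: center each score on the mean and scale by the
    # (population) standard deviation, so lexical (BM25) and vector scores
    # land on a comparable scale before they are fused in hybrid search.
    mean = statistics.fmean(scores)
    stdev = statistics.pstdev(scores)
    if stdev == 0:
        # All scores identical: no spread to normalize against.
        return [0.0 for _ in scores]
    return [(s - mean) / stdev for s in scores]
```

With `[1.0, 2.0, 3.0]` the middle score maps to `0.0` and the outer scores to symmetric positive/negative values, which is the property that makes scores from different retrievers comparable.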

@@ -243,8 +243,7 @@ class DocumentChunk(BaseModel):
return value
if not isinstance(value, int):
raise ValueError(
-f"Bug: Expected an int for the last_updated property from OpenSearch, got "
-f"{type(value)} instead."
+f"Bug: Expected an int for the last_updated property from OpenSearch, got {type(value)} instead."
)
return datetime.fromtimestamp(value, tz=timezone.utc)
@@ -285,22 +284,19 @@ class DocumentChunk(BaseModel):
elif isinstance(value, TenantState):
if MULTI_TENANT != value.multitenant:
raise ValueError(
-f"Bug: An existing TenantState object was supplied to the DocumentChunk model "
-f"but its multi-tenant mode ({value.multitenant}) does not match the program's "
-"current global tenancy state."
+f"Bug: An existing TenantState object was supplied to the DocumentChunk model but its multi-tenant mode "
+f"({value.multitenant}) does not match the program's current global tenancy state."
)
return value
elif not isinstance(value, str):
raise ValueError(
-f"Bug: Expected a str for the tenant_id property from OpenSearch, got "
-f"{type(value)} instead."
+f"Bug: Expected a str for the tenant_id property from OpenSearch, got {type(value)} instead."
)
else:
if not MULTI_TENANT:
raise ValueError(
-"Bug: Got a non-null str for the tenant_id property from OpenSearch but "
-"multi-tenant mode is not enabled. This is unexpected because in single-tenant "
-"mode we don't expect to see a tenant_id."
+"Bug: Got a non-null str for the tenant_id property from OpenSearch but multi-tenant mode is not enabled. "
+"This is unexpected because in single-tenant mode we don't expect to see a tenant_id."
)
return TenantState(tenant_id=value, multitenant=MULTI_TENANT)
@@ -356,10 +352,8 @@ class DocumentSchema:
"properties": {
TITLE_FIELD_NAME: {
"type": "text",
-# Language analyzer (e.g. english) stems at index and search
-# time for variant matching. Configure via
-# OPENSEARCH_TEXT_ANALYZER. Existing indices need reindexing
-# after a change.
+# Language analyzer (e.g. english) stems at index and search time for variant matching.
+# Configure via OPENSEARCH_TEXT_ANALYZER. Existing indices need reindexing after a change.
"analyzer": OPENSEARCH_TEXT_ANALYZER,
"fields": {
# Subfield accessed as title.keyword. Not indexed for

View File

@@ -48,11 +48,10 @@ class OnyxError(Exception):
*,
status_code_override: int | None = None,
) -> None:
-resolved_message = message or error_code.code
-super().__init__(resolved_message)
self.error_code = error_code
-self.message = resolved_message
+self.message = message or error_code.code
self._status_code_override = status_code_override
+super().__init__(self.message)
@property
def status_code(self) -> int:

View File

@@ -49,6 +49,7 @@ from onyx.indexing.embedder import IndexingEmbedder
from onyx.indexing.models import DocAwareChunk
from onyx.indexing.models import IndexingBatchAdapter
from onyx.indexing.models import UpdatableChunkData
from onyx.indexing.postgres_sanitization import sanitize_documents_for_postgres
from onyx.indexing.vector_db_insertion import write_chunks_to_vector_db_with_backoff
from onyx.llm.factory import get_default_llm_with_vision
from onyx.llm.factory import get_llm_for_contextual_rag
@@ -64,7 +65,6 @@ from onyx.prompts.contextual_retrieval import CONTEXTUAL_RAG_PROMPT1
from onyx.prompts.contextual_retrieval import CONTEXTUAL_RAG_PROMPT2
from onyx.prompts.contextual_retrieval import DOCUMENT_SUMMARY_PROMPT
from onyx.utils.logger import setup_logger
from onyx.utils.postgres_sanitization import sanitize_documents_for_postgres
from onyx.utils.threadpool_concurrency import run_functions_tuples_in_parallel
from onyx.utils.timing import log_function_time

View File

@@ -1,49 +1,30 @@
import re
from typing import Any
from onyx.access.models import ExternalAccess
from onyx.connectors.models import BasicExpertInfo
from onyx.connectors.models import Document
from onyx.connectors.models import HierarchyNode
-from onyx.utils.logger import setup_logger
-logger = setup_logger()
-_SURROGATE_RE = re.compile(r"[\ud800-\udfff]")
-def sanitize_string(value: str) -> str:
-"""Strip characters that PostgreSQL text/JSONB columns cannot store.
-Removes:
-- NUL bytes (\\x00)
-- UTF-16 surrogates (\\ud800-\\udfff), which are invalid in UTF-8
-"""
-sanitized = value.replace("\x00", "")
-sanitized = _SURROGATE_RE.sub("", sanitized)
-if value and not sanitized:
-logger.warning(
-"sanitize_string: all characters were removed from a non-empty string"
-)
-return sanitized
+def _sanitize_string(value: str) -> str:
+return value.replace("\x00", "")
-def sanitize_json_like(value: Any) -> Any:
-"""Recursively sanitize all strings in a JSON-like structure (dict/list/tuple)."""
+def _sanitize_json_like(value: Any) -> Any:
if isinstance(value, str):
-return sanitize_string(value)
+return _sanitize_string(value)
if isinstance(value, list):
-return [sanitize_json_like(item) for item in value]
+return [_sanitize_json_like(item) for item in value]
if isinstance(value, tuple):
-return tuple(sanitize_json_like(item) for item in value)
+return tuple(_sanitize_json_like(item) for item in value)
if isinstance(value, dict):
sanitized: dict[Any, Any] = {}
for key, nested_value in value.items():
-cleaned_key = sanitize_string(key) if isinstance(key, str) else key
-sanitized[cleaned_key] = sanitize_json_like(nested_value)
+cleaned_key = _sanitize_string(key) if isinstance(key, str) else key
+sanitized[cleaned_key] = _sanitize_json_like(nested_value)
return sanitized
return value
@@ -53,27 +34,27 @@ def _sanitize_expert_info(expert: BasicExpertInfo) -> BasicExpertInfo:
return expert.model_copy(
update={
"display_name": (
-sanitize_string(expert.display_name)
+_sanitize_string(expert.display_name)
if expert.display_name is not None
else None
),
"first_name": (
-sanitize_string(expert.first_name)
+_sanitize_string(expert.first_name)
if expert.first_name is not None
else None
),
"middle_initial": (
-sanitize_string(expert.middle_initial)
+_sanitize_string(expert.middle_initial)
if expert.middle_initial is not None
else None
),
"last_name": (
-sanitize_string(expert.last_name)
+_sanitize_string(expert.last_name)
if expert.last_name is not None
else None
),
"email": (
-sanitize_string(expert.email) if expert.email is not None else None
+_sanitize_string(expert.email) if expert.email is not None else None
),
}
)
@@ -82,10 +63,10 @@ def _sanitize_expert_info(expert: BasicExpertInfo) -> BasicExpertInfo:
def _sanitize_external_access(external_access: ExternalAccess) -> ExternalAccess:
return ExternalAccess(
external_user_emails={
-sanitize_string(email) for email in external_access.external_user_emails
+_sanitize_string(email) for email in external_access.external_user_emails
},
external_user_group_ids={
-sanitize_string(group_id)
+_sanitize_string(group_id)
for group_id in external_access.external_user_group_ids
},
is_public=external_access.is_public,
@@ -95,26 +76,26 @@ def _sanitize_external_access(external_access: ExternalAccess) -> ExternalAccess
def sanitize_document_for_postgres(document: Document) -> Document:
cleaned_doc = document.model_copy(deep=True)
-cleaned_doc.id = sanitize_string(cleaned_doc.id)
-cleaned_doc.semantic_identifier = sanitize_string(cleaned_doc.semantic_identifier)
+cleaned_doc.id = _sanitize_string(cleaned_doc.id)
+cleaned_doc.semantic_identifier = _sanitize_string(cleaned_doc.semantic_identifier)
if cleaned_doc.title is not None:
-cleaned_doc.title = sanitize_string(cleaned_doc.title)
+cleaned_doc.title = _sanitize_string(cleaned_doc.title)
if cleaned_doc.parent_hierarchy_raw_node_id is not None:
-cleaned_doc.parent_hierarchy_raw_node_id = sanitize_string(
+cleaned_doc.parent_hierarchy_raw_node_id = _sanitize_string(
cleaned_doc.parent_hierarchy_raw_node_id
)
cleaned_doc.metadata = {
-sanitize_string(key): (
-[sanitize_string(item) for item in value]
+_sanitize_string(key): (
+[_sanitize_string(item) for item in value]
if isinstance(value, list)
-else sanitize_string(value)
+else _sanitize_string(value)
)
for key, value in cleaned_doc.metadata.items()
}
if cleaned_doc.doc_metadata is not None:
-cleaned_doc.doc_metadata = sanitize_json_like(cleaned_doc.doc_metadata)
+cleaned_doc.doc_metadata = _sanitize_json_like(cleaned_doc.doc_metadata)
if cleaned_doc.primary_owners is not None:
cleaned_doc.primary_owners = [
@@ -132,11 +113,11 @@ def sanitize_document_for_postgres(document: Document) -> Document:
for section in cleaned_doc.sections:
if section.link is not None:
-section.link = sanitize_string(section.link)
+section.link = _sanitize_string(section.link)
if section.text is not None:
-section.text = sanitize_string(section.text)
+section.text = _sanitize_string(section.text)
if section.image_file_id is not None:
-section.image_file_id = sanitize_string(section.image_file_id)
+section.image_file_id = _sanitize_string(section.image_file_id)
return cleaned_doc
@@ -148,12 +129,12 @@ def sanitize_documents_for_postgres(documents: list[Document]) -> list[Document]
def sanitize_hierarchy_node_for_postgres(node: HierarchyNode) -> HierarchyNode:
cleaned_node = node.model_copy(deep=True)
-cleaned_node.raw_node_id = sanitize_string(cleaned_node.raw_node_id)
-cleaned_node.display_name = sanitize_string(cleaned_node.display_name)
+cleaned_node.raw_node_id = _sanitize_string(cleaned_node.raw_node_id)
+cleaned_node.display_name = _sanitize_string(cleaned_node.display_name)
if cleaned_node.raw_parent_id is not None:
-cleaned_node.raw_parent_id = sanitize_string(cleaned_node.raw_parent_id)
+cleaned_node.raw_parent_id = _sanitize_string(cleaned_node.raw_parent_id)
if cleaned_node.link is not None:
-cleaned_node.link = sanitize_string(cleaned_node.link)
+cleaned_node.link = _sanitize_string(cleaned_node.link)
if cleaned_node.external_access is not None:
cleaned_node.external_access = _sanitize_external_access(

View File

@@ -22,7 +22,6 @@ class LlmProviderNames(str, Enum):
OPENROUTER = "openrouter"
AZURE = "azure"
OLLAMA_CHAT = "ollama_chat"
LM_STUDIO = "lm_studio"
MISTRAL = "mistral"
LITELLM_PROXY = "litellm_proxy"
@@ -42,7 +41,6 @@ WELL_KNOWN_PROVIDER_NAMES = [
LlmProviderNames.OPENROUTER,
LlmProviderNames.AZURE,
LlmProviderNames.OLLAMA_CHAT,
LlmProviderNames.LM_STUDIO,
]
@@ -58,7 +56,6 @@ PROVIDER_DISPLAY_NAMES: dict[str, str] = {
LlmProviderNames.AZURE: "Azure",
"ollama": "Ollama",
LlmProviderNames.OLLAMA_CHAT: "Ollama",
LlmProviderNames.LM_STUDIO: "LM Studio",
"groq": "Groq",
"anyscale": "Anyscale",
"deepseek": "DeepSeek",
@@ -106,7 +103,6 @@ AGGREGATOR_PROVIDERS: set[str] = {
LlmProviderNames.BEDROCK_CONVERSE,
LlmProviderNames.OPENROUTER,
LlmProviderNames.OLLAMA_CHAT,
LlmProviderNames.LM_STUDIO,
LlmProviderNames.VERTEX_AI,
LlmProviderNames.AZURE,
}

View File

@@ -20,9 +20,7 @@ from onyx.llm.multi_llm import LitellmLLM
from onyx.llm.override_models import LLMOverride
from onyx.llm.utils import get_max_input_tokens_from_llm_provider
from onyx.llm.utils import model_supports_image_input
from onyx.llm.well_known_providers.constants import (
PROVIDERS_WITH_SPECIAL_API_KEY_HANDLING,
)
from onyx.llm.well_known_providers.constants import OLLAMA_API_KEY_CONFIG_KEY
from onyx.natural_language_processing.utils import get_tokenizer
from onyx.server.manage.llm.models import LLMProviderView
from onyx.utils.headers import build_llm_extra_headers
@@ -34,18 +32,14 @@ logger = setup_logger()
def _build_provider_extra_headers(
provider: str, custom_config: dict[str, str] | None
) -> dict[str, str]:
-if provider in PROVIDERS_WITH_SPECIAL_API_KEY_HANDLING and custom_config:
-raw = custom_config.get(PROVIDERS_WITH_SPECIAL_API_KEY_HANDLING[provider])
-api_key = raw.strip() if raw else None
+if provider == LlmProviderNames.OLLAMA_CHAT and custom_config:
+raw_api_key = custom_config.get(OLLAMA_API_KEY_CONFIG_KEY)
+api_key = raw_api_key.strip() if raw_api_key else None
if not api_key:
return {}
-return {
-"Authorization": (
-api_key
-if api_key.lower().startswith("bearer ")
-else f"Bearer {api_key}"
-)
-}
+if not api_key.lower().startswith("bearer "):
+api_key = f"Bearer {api_key}"
+return {"Authorization": api_key}
# Passing these will put Onyx on the OpenRouter leaderboard
elif provider == LlmProviderNames.OPENROUTER:

View File

@@ -2516,10 +2516,6 @@
"model_vendor": "openai",
"model_version": "2025-10-06"
},
"gpt-5.4": {
"display_name": "GPT-5.4",
"model_vendor": "openai"
},
"gpt-5.2-pro-2025-12-11": {
"display_name": "GPT-5.2 Pro",
"model_vendor": "openai",

View File

@@ -42,7 +42,6 @@ from onyx.llm.well_known_providers.constants import AWS_SECRET_ACCESS_KEY_KWARG
from onyx.llm.well_known_providers.constants import (
AWS_SECRET_ACCESS_KEY_KWARG_ENV_VAR_FORMAT,
)
from onyx.llm.well_known_providers.constants import LM_STUDIO_API_KEY_CONFIG_KEY
from onyx.llm.well_known_providers.constants import OLLAMA_API_KEY_CONFIG_KEY
from onyx.llm.well_known_providers.constants import VERTEX_CREDENTIALS_FILE_KWARG
from onyx.llm.well_known_providers.constants import (
@@ -93,98 +92,6 @@ def _prompt_to_dicts(prompt: LanguageModelInput) -> list[dict[str, Any]]:
return [prompt.model_dump(exclude_none=True)]
def _normalize_content(raw: Any) -> str:
"""Normalize a message content field to a plain string.
Content can be a string, None, or a list of content-block dicts
(e.g. [{"type": "text", "text": "..."}]).
"""
if raw is None:
return ""
if isinstance(raw, str):
return raw
if isinstance(raw, list):
return "\n".join(
block.get("text", "") if isinstance(block, dict) else str(block)
for block in raw
)
return str(raw)
def _strip_tool_content_from_messages(
messages: list[dict[str, Any]],
) -> list[dict[str, Any]]:
"""Convert tool-related messages to plain text.
Bedrock's Converse API requires toolConfig when messages contain
toolUse/toolResult content blocks. When no tools are provided for the
current request, we must convert any tool-related history into plain text
to avoid the "toolConfig field must be defined" error.
This is the same approach used by _OllamaHistoryMessageFormatter.
"""
result: list[dict[str, Any]] = []
for msg in messages:
role = msg.get("role")
tool_calls = msg.get("tool_calls")
if role == "assistant" and tool_calls:
# Convert structured tool calls to text representation
tool_call_lines = []
for tc in tool_calls:
func = tc.get("function", {})
name = func.get("name", "unknown")
args = func.get("arguments", "{}")
tc_id = tc.get("id", "")
tool_call_lines.append(
f"[Tool Call] name={name} id={tc_id} args={args}"
)
existing_content = _normalize_content(msg.get("content"))
parts = (
[existing_content] + tool_call_lines
if existing_content
else tool_call_lines
)
new_msg = {
"role": "assistant",
"content": "\n".join(parts),
}
result.append(new_msg)
elif role == "tool":
# Convert tool response to user message with text content
tool_call_id = msg.get("tool_call_id", "")
content = _normalize_content(msg.get("content"))
tool_result_text = f"[Tool Result] id={tool_call_id}\n{content}"
# Merge into previous user message if it is also a converted
# tool result to avoid consecutive user messages (Bedrock requires
# strict user/assistant alternation).
if (
result
and result[-1]["role"] == "user"
and "[Tool Result]" in result[-1].get("content", "")
):
result[-1]["content"] += "\n\n" + tool_result_text
else:
result.append({"role": "user", "content": tool_result_text})
else:
result.append(msg)
return result
def _messages_contain_tool_content(messages: list[dict[str, Any]]) -> bool:
"""Check if any messages contain tool-related content blocks."""
for msg in messages:
if msg.get("role") == "tool":
return True
if msg.get("role") == "assistant" and msg.get("tool_calls"):
return True
return False
def _is_vertex_model_rejecting_output_config(model_name: str) -> bool:
normalized_model_name = model_name.lower()
return any(
@@ -250,9 +157,6 @@ class LitellmLLM(LLM):
elif model_provider == LlmProviderNames.OLLAMA_CHAT:
if k == OLLAMA_API_KEY_CONFIG_KEY:
model_kwargs["api_key"] = v
elif model_provider == LlmProviderNames.LM_STUDIO:
if k == LM_STUDIO_API_KEY_CONFIG_KEY:
model_kwargs["api_key"] = v
elif model_provider == LlmProviderNames.BEDROCK:
if k == AWS_REGION_NAME_KWARG:
model_kwargs[k] = v
@@ -269,19 +173,6 @@ class LitellmLLM(LLM):
elif k == AWS_SECRET_ACCESS_KEY_KWARG_ENV_VAR_FORMAT:
model_kwargs[AWS_SECRET_ACCESS_KEY_KWARG] = v
# LM Studio: LiteLLM defaults to "fake-api-key" when no key is provided,
# which LM Studio rejects. Ensure we always pass an explicit key (or empty
# string) to prevent LiteLLM from injecting its fake default.
if model_provider == LlmProviderNames.LM_STUDIO:
model_kwargs.setdefault("api_key", "")
# Users provide the server root (e.g. http://localhost:1234) but LiteLLM
# needs /v1 for OpenAI-compatible calls.
if self._api_base is not None:
base = self._api_base.rstrip("/")
self._api_base = base if base.endswith("/v1") else f"{base}/v1"
model_kwargs["api_base"] = self._api_base
# Default vertex_location to "global" if not provided for Vertex AI
# Latest gemini models are only available through the global region
if (
@@ -513,30 +404,13 @@ class LitellmLLM(LLM):
else nullcontext()
)
with env_ctx:
messages = _prompt_to_dicts(prompt)
# Bedrock's Converse API requires toolConfig when messages
# contain toolUse/toolResult content blocks. When no tools are
# provided for this request but the history contains tool
# content from previous turns, strip it to plain text.
is_bedrock = self._model_provider in {
LlmProviderNames.BEDROCK,
LlmProviderNames.BEDROCK_CONVERSE,
}
if (
is_bedrock
and not tools
and _messages_contain_tool_content(messages)
):
messages = _strip_tool_content_from_messages(messages)
response = litellm.completion(
mock_response=get_llm_mock_response() or MOCK_LLM_RESPONSE,
model=model,
base_url=self._api_base or None,
api_version=self._api_version or None,
custom_llm_provider=self._custom_llm_provider or None,
-messages=messages,
+messages=_prompt_to_dicts(prompt),
tools=tools,
tool_choice=tool_choice,
stream=stream,

View File
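The removed `_strip_tool_content_from_messages` helper above flattens tool calls and tool results into plain text so that providers which require a `toolConfig` whenever tool blocks appear (Bedrock's Converse API) can still accept a tool-free request. A compact, simplified sketch of the idea — not the full helper, which also merges consecutive user messages:

```python
from typing import Any


def strip_tool_content(messages: list[dict[str, Any]]) -> list[dict[str, Any]]:
    # Rewrite structured tool calls/results as plain text so a provider that
    # rejects tool blocks without a tool config can still consume the history.
    result: list[dict[str, Any]] = []
    for msg in messages:
        if msg.get("role") == "assistant" and msg.get("tool_calls"):
            # Serialize each tool call into a text line.
            lines = [
                f"[Tool Call] name={tc['function']['name']} args={tc['function']['arguments']}"
                for tc in msg["tool_calls"]
            ]
            result.append({"role": "assistant", "content": "\n".join(lines)})
        elif msg.get("role") == "tool":
            # Tool results become ordinary user messages.
            text = f"[Tool Result] id={msg.get('tool_call_id', '')}\n{msg.get('content', '')}"
            result.append({"role": "user", "content": text})
        else:
            result.append(msg)
    return result
```

After stripping, the history contains only plain `user`/`assistant` text messages, so no tool configuration needs to accompany the request.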

@@ -322,7 +322,7 @@ def test_llm(llm: LLM) -> str | None:
error_msg = None
for _ in range(2):
try:
-llm.invoke(UserMessage(content="Do not respond"), max_tokens=50)
+llm.invoke(UserMessage(content="Do not respond"))
return None
except Exception as e:
error_msg = str(e)

View File

@@ -1,5 +1,3 @@
from onyx.llm.constants import LlmProviderNames
OPENAI_PROVIDER_NAME = "openai"
# Curated list of OpenAI models to show by default in the UI
OPENAI_VISIBLE_MODEL_NAMES = {
@@ -37,15 +35,6 @@ def _fallback_bedrock_regions() -> list[str]:
OLLAMA_PROVIDER_NAME = "ollama_chat"
OLLAMA_API_KEY_CONFIG_KEY = "OLLAMA_API_KEY"
LM_STUDIO_PROVIDER_NAME = "lm_studio"
LM_STUDIO_API_KEY_CONFIG_KEY = "LM_STUDIO_API_KEY"
# Providers that use optional Bearer auth from custom_config
PROVIDERS_WITH_SPECIAL_API_KEY_HANDLING: dict[str, str] = {
LlmProviderNames.OLLAMA_CHAT: OLLAMA_API_KEY_CONFIG_KEY,
LlmProviderNames.LM_STUDIO: LM_STUDIO_API_KEY_CONFIG_KEY,
}
# OpenRouter
OPENROUTER_PROVIDER_NAME = "openrouter"

View File

@@ -15,7 +15,6 @@ from onyx.llm.well_known_providers.auto_update_service import (
from onyx.llm.well_known_providers.constants import ANTHROPIC_PROVIDER_NAME
from onyx.llm.well_known_providers.constants import AZURE_PROVIDER_NAME
from onyx.llm.well_known_providers.constants import BEDROCK_PROVIDER_NAME
from onyx.llm.well_known_providers.constants import LM_STUDIO_PROVIDER_NAME
from onyx.llm.well_known_providers.constants import OLLAMA_PROVIDER_NAME
from onyx.llm.well_known_providers.constants import OPENAI_PROVIDER_NAME
from onyx.llm.well_known_providers.constants import OPENROUTER_PROVIDER_NAME
@@ -45,7 +44,6 @@ def _get_provider_to_models_map() -> dict[str, list[str]]:
ANTHROPIC_PROVIDER_NAME: get_anthropic_model_names(),
VERTEXAI_PROVIDER_NAME: get_vertexai_model_names(),
OLLAMA_PROVIDER_NAME: [], # Dynamic - fetched from Ollama API
LM_STUDIO_PROVIDER_NAME: [], # Dynamic - fetched from LM Studio API
OPENROUTER_PROVIDER_NAME: [], # Dynamic - fetched from OpenRouter API
}
@@ -325,7 +323,6 @@ def get_provider_display_name(provider_name: str) -> str:
_ONYX_PROVIDER_DISPLAY_NAMES: dict[str, str] = {
OPENAI_PROVIDER_NAME: "ChatGPT (OpenAI)",
OLLAMA_PROVIDER_NAME: "Ollama",
LM_STUDIO_PROVIDER_NAME: "LM Studio",
ANTHROPIC_PROVIDER_NAME: "Claude (Anthropic)",
AZURE_PROVIDER_NAME: "Azure OpenAI",
BEDROCK_PROVIDER_NAME: "Amazon Bedrock",

View File

@@ -1,12 +1,12 @@
{
"version": "1.1",
-"updated_at": "2026-03-05T00:00:00Z",
+"updated_at": "2026-02-05T00:00:00Z",
"providers": {
"openai": {
-"default_model": { "name": "gpt-5.4" },
+"default_model": { "name": "gpt-5.2" },
"additional_visible_models": [
-{ "name": "gpt-5.4" },
-{ "name": "gpt-5.2" }
+{ "name": "gpt-5-mini" },
+{ "name": "gpt-4.1" }
]
},
"anthropic": {

View File

@@ -961,9 +961,9 @@
"license": "MIT"
},
"node_modules/@hono/node-server": {
-"version": "1.19.10",
-"resolved": "https://registry.npmjs.org/@hono/node-server/-/node-server-1.19.10.tgz",
-"integrity": "sha512-hZ7nOssGqRgyV3FVVQdfi+U4q02uB23bpnYpdvNXkYTRRyWx84b7yf1ans+dnJ/7h41sGL3CeQTfO+ZGxuO+Iw==",
+"version": "1.19.9",
+"resolved": "https://registry.npmjs.org/@hono/node-server/-/node-server-1.19.9.tgz",
+"integrity": "sha512-vHL6w3ecZsky+8P5MD+eFfaGTyCeOHUIFYMGpQGbrBTSmNNoxv0if69rEZ5giu36weC5saFuznL411gRX7bJDw==",
"license": "MIT",
"engines": {
"node": ">=18.14.1"
@@ -1573,6 +1573,27 @@
}
}
},
"node_modules/@isaacs/balanced-match": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/@isaacs/balanced-match/-/balanced-match-4.0.1.tgz",
"integrity": "sha512-yzMTt9lEb8Gv7zRioUilSglI0c0smZ9k5D65677DLWLtWJaXIS3CqcGyUFByYKlnUj6TkjLVs54fBl6+TiGQDQ==",
"license": "MIT",
"engines": {
"node": "20 || >=22"
}
},
"node_modules/@isaacs/brace-expansion": {
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/@isaacs/brace-expansion/-/brace-expansion-5.0.1.tgz",
"integrity": "sha512-WMz71T1JS624nWj2n2fnYAuPovhv7EUhk69R6i9dsVyzxt5eM3bjwvgk9L+APE1TRscGysAVMANkB0jh0LQZrQ==",
"license": "MIT",
"dependencies": {
"@isaacs/balanced-match": "^4.0.1"
},
"engines": {
"node": "20 || >=22"
}
},
"node_modules/@jridgewell/gen-mapping": {
"version": "0.3.13",
"resolved": "https://registry.npmjs.org/@jridgewell/gen-mapping/-/gen-mapping-0.3.13.tgz",
@@ -1659,9 +1680,9 @@
}
},
"node_modules/@modelcontextprotocol/sdk/node_modules/ajv": {
"version": "8.18.0",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-8.18.0.tgz",
"integrity": "sha512-PlXPeEWMXMZ7sPYOHqmDyCJzcfNrUr3fGNKtezX14ykXOEIvyK81d+qydx89KY5O71FKMPaQ2vBfBFI5NHR63A==",
"version": "8.17.1",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-8.17.1.tgz",
"integrity": "sha512-B/gBuNg5SiMTrPkC+A2+cW0RszwxYmn6VYxB/inlBStS5nx6xHIt/ehKRhIMhqusl7a8LjQoZnjCs5vhwxOQ1g==",
"license": "MIT",
"dependencies": {
"fast-deep-equal": "^3.1.3",
@@ -3834,27 +3855,6 @@
"path-browserify": "^1.0.1"
}
},
"node_modules/@ts-morph/common/node_modules/balanced-match": {
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-4.0.4.tgz",
"integrity": "sha512-BLrgEcRTwX2o6gGxGOCNyMvGSp35YofuYzw9h1IMTRmKqttAZZVU67bdb9Pr2vUHA8+j3i2tJfjO6C6+4myGTA==",
"license": "MIT",
"engines": {
"node": "18 || 20 || >=22"
}
},
"node_modules/@ts-morph/common/node_modules/brace-expansion": {
"version": "5.0.3",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-5.0.3.tgz",
"integrity": "sha512-fy6KJm2RawA5RcHkLa1z/ScpBeA762UF9KmZQxwIbDtRJrgLzM10depAiEQ+CXYcoiqW1/m96OAAoke2nE9EeA==",
"license": "MIT",
"dependencies": {
"balanced-match": "^4.0.2"
},
"engines": {
"node": "18 || 20 || >=22"
}
},
"node_modules/@ts-morph/common/node_modules/fast-glob": {
"version": "3.3.3",
"resolved": "https://registry.npmjs.org/fast-glob/-/fast-glob-3.3.3.tgz",
@@ -3884,15 +3884,15 @@
}
},
"node_modules/@ts-morph/common/node_modules/minimatch": {
"version": "10.2.4",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-10.2.4.tgz",
"integrity": "sha512-oRjTw/97aTBN0RHbYCdtF1MQfvusSIBQM0IZEgzl6426+8jSC0nF1a/GmnVLpfB9yyr6g6FTqWqiZVbxrtaCIg==",
"version": "10.1.1",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-10.1.1.tgz",
"integrity": "sha512-enIvLvRAFZYXJzkCYG5RKmPfrFArdLv+R+lbQ53BmIMLIry74bjKzX6iHAm8WYamJkhSSEabrWN5D97XnKObjQ==",
"license": "BlueOak-1.0.0",
"dependencies": {
"brace-expansion": "^5.0.2"
"@isaacs/brace-expansion": "^5.0.0"
},
"engines": {
"node": "18 || 20 || >=22"
"node": "20 || >=22"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
@@ -4234,13 +4234,13 @@
}
},
"node_modules/@typescript-eslint/typescript-estree/node_modules/minimatch": {
"version": "9.0.9",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-9.0.9.tgz",
"integrity": "sha512-OBwBN9AL4dqmETlpS2zasx+vTeWclWzkblfZk7KTA5j3jeOONz/tRCnZomUyvNg83wL5Zv9Ss6HMJXAgL8R2Yg==",
"version": "9.0.5",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-9.0.5.tgz",
"integrity": "sha512-G6T0ZX48xgozx7587koeX9Ys2NYy6Gmv//P89sEte9V9whIapMNF4idKxnW2QtCcLiTWlb/wfCabAtAFWhhBow==",
"dev": true,
"license": "ISC",
"dependencies": {
"brace-expansion": "^2.0.2"
"brace-expansion": "^2.0.1"
},
"engines": {
"node": ">=16 || 14 >=14.17"
@@ -4619,9 +4619,9 @@
}
},
"node_modules/ajv": {
"version": "6.14.0",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-6.14.0.tgz",
"integrity": "sha512-IWrosm/yrn43eiKqkfkHis7QioDleaXQHdDVPKg0FSwwd/DuvyX79TZnFOnYpB7dcsFAMmtFztZuXPDvSePkFw==",
"version": "6.12.6",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-6.12.6.tgz",
"integrity": "sha512-j3fVLgvTo527anyYyJOGTYJbG+vnnQYvE0m5mmkc1TK+nxAppkCLMIL0aZ4dblVCNoGShhm+kzE4ZUykBoMg4g==",
"dev": true,
"license": "MIT",
"dependencies": {
@@ -4653,9 +4653,9 @@
}
},
"node_modules/ajv-formats/node_modules/ajv": {
"version": "8.18.0",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-8.18.0.tgz",
"integrity": "sha512-PlXPeEWMXMZ7sPYOHqmDyCJzcfNrUr3fGNKtezX14ykXOEIvyK81d+qydx89KY5O71FKMPaQ2vBfBFI5NHR63A==",
"version": "8.17.1",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-8.17.1.tgz",
"integrity": "sha512-B/gBuNg5SiMTrPkC+A2+cW0RszwxYmn6VYxB/inlBStS5nx6xHIt/ehKRhIMhqusl7a8LjQoZnjCs5vhwxOQ1g==",
"license": "MIT",
"dependencies": {
"fast-deep-equal": "^3.1.3",
@@ -8831,9 +8831,9 @@
}
},
"node_modules/minimatch": {
"version": "3.1.5",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.5.tgz",
"integrity": "sha512-VgjWUsnnT6n+NUk6eZq77zeFdpW2LWDzP6zFGrCbHXiYNul5Dzqk2HHQ5uFH2DNW5Xbp8+jVzaeNt94ssEEl4w==",
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
"integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
"dev": true,
"license": "ISC",
"dependencies": {
@@ -9699,9 +9699,9 @@
}
},
"node_modules/qs": {
"version": "6.14.2",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.14.2.tgz",
"integrity": "sha512-V/yCWTTF7VJ9hIh18Ugr2zhJMP01MY7c5kh4J870L7imm6/DIzBsNLTXzMwUA3yZ5b/KBqLx8Kp3uRvd7xSe3Q==",
"version": "6.14.1",
"resolved": "https://registry.npmjs.org/qs/-/qs-6.14.1.tgz",
"integrity": "sha512-4EK3+xJl8Ts67nLYNwqw/dsFVnCf+qR7RgXSK9jEEm9unao3njwMDdmsdvoKBKHzxd7tCYz5e5M+SnMjdtXGQQ==",
"license": "BSD-3-Clause",
"dependencies": {
"side-channel": "^1.1.0"

View File

@@ -48,7 +48,6 @@ from onyx.llm.utils import test_llm
from onyx.llm.well_known_providers.auto_update_service import (
fetch_llm_recommendations_from_github,
)
from onyx.llm.well_known_providers.constants import LM_STUDIO_API_KEY_CONFIG_KEY
from onyx.llm.well_known_providers.llm_provider_options import (
fetch_available_well_known_llms,
)
@@ -63,8 +62,6 @@ from onyx.server.manage.llm.models import LLMProviderDescriptor
from onyx.server.manage.llm.models import LLMProviderResponse
from onyx.server.manage.llm.models import LLMProviderUpsertRequest
from onyx.server.manage.llm.models import LLMProviderView
from onyx.server.manage.llm.models import LMStudioFinalModelResponse
from onyx.server.manage.llm.models import LMStudioModelsRequest
from onyx.server.manage.llm.models import OllamaFinalModelResponse
from onyx.server.manage.llm.models import OllamaModelDetails
from onyx.server.manage.llm.models import OllamaModelsRequest
@@ -76,7 +73,6 @@ from onyx.server.manage.llm.models import VisionProviderResponse
from onyx.server.manage.llm.utils import generate_bedrock_display_name
from onyx.server.manage.llm.utils import generate_ollama_display_name
from onyx.server.manage.llm.utils import infer_vision_support
from onyx.server.manage.llm.utils import is_reasoning_model
from onyx.server.manage.llm.utils import is_valid_bedrock_model
from onyx.server.manage.llm.utils import ModelMetadata
from onyx.server.manage.llm.utils import strip_openrouter_vendor_prefix
@@ -445,17 +441,6 @@ def put_llm_provider(
not existing_provider or not existing_provider.is_auto_mode
)
# Before the upsert, check if this provider currently owns the global
# CHAT default. The upsert may cascade-delete model_configurations
# (and their flow mappings), so we need to remember this beforehand.
was_default_provider = False
if existing_provider and transitioning_to_auto_mode:
current_default = fetch_default_llm_model(db_session)
was_default_provider = (
current_default is not None
and current_default.llm_provider_id == existing_provider.id
)
try:
result = upsert_llm_provider(
llm_provider_upsert_request=llm_provider_upsert_request,
@@ -478,20 +463,6 @@ def put_llm_provider(
updated_provider,
config,
)
# If this provider was the default before the transition,
# restore the default using the recommended model.
if was_default_provider:
recommended = config.get_default_model(
llm_provider_upsert_request.provider
)
if recommended:
update_default_provider(
provider_id=updated_provider.id,
model_name=recommended.name,
db_session=db_session,
)
# Refresh result with synced models
result = LLMProviderView.from_model(updated_provider)
@@ -1246,117 +1217,3 @@ def get_openrouter_available_models(
logger.warning(f"Failed to sync OpenRouter models to DB: {e}")
return sorted_results
@admin_router.post("/lm-studio/available-models")
def get_lm_studio_available_models(
request: LMStudioModelsRequest,
_: User = Depends(current_admin_user),
db_session: Session = Depends(get_session),
) -> list[LMStudioFinalModelResponse]:
"""Fetch available models from an LM Studio server.
Uses the LM Studio-native /api/v1/models endpoint which exposes
rich metadata including capabilities (vision, reasoning),
display names, and context lengths.
"""
cleaned_api_base = request.api_base.strip().rstrip("/")
# Strip /v1 suffix that users may copy from OpenAI-compatible tool configs;
# the native metadata endpoint lives at /api/v1/models, not /v1/api/v1/models.
cleaned_api_base = cleaned_api_base.removesuffix("/v1")
if not cleaned_api_base:
raise OnyxError(
OnyxErrorCode.VALIDATION_ERROR,
"API base URL is required to fetch LM Studio models.",
)
# If provider_name is given and the api_key hasn't been changed by the user,
# fall back to the stored API key from the database (the form value is masked).
api_key = request.api_key
if request.provider_name and not request.api_key_changed:
existing_provider = fetch_existing_llm_provider(
name=request.provider_name, db_session=db_session
)
if existing_provider and existing_provider.custom_config:
api_key = existing_provider.custom_config.get(LM_STUDIO_API_KEY_CONFIG_KEY)
url = f"{cleaned_api_base}/api/v1/models"
headers: dict[str, str] = {}
if api_key:
headers["Authorization"] = f"Bearer {api_key}"
try:
response = httpx.get(url, headers=headers, timeout=10.0)
response.raise_for_status()
response_json = response.json()
except Exception as e:
raise OnyxError(
OnyxErrorCode.BAD_GATEWAY,
f"Failed to fetch LM Studio models: {e}",
)
models = response_json.get("models", [])
if not isinstance(models, list) or len(models) == 0:
raise OnyxError(
OnyxErrorCode.VALIDATION_ERROR,
"No models found from your LM Studio server.",
)
results: list[LMStudioFinalModelResponse] = []
for item in models:
# Filter to LLM-type models only (skip embeddings, etc.)
if item.get("type") != "llm":
continue
model_key = item.get("key")
if not model_key:
continue
display_name = item.get("display_name") or model_key
max_context_length = item.get("max_context_length")
capabilities = item.get("capabilities") or {}
results.append(
LMStudioFinalModelResponse(
name=model_key,
display_name=display_name,
max_input_tokens=max_context_length,
supports_image_input=capabilities.get("vision", False),
supports_reasoning=capabilities.get("reasoning", False)
or is_reasoning_model(model_key, display_name),
)
)
if not results:
raise OnyxError(
OnyxErrorCode.VALIDATION_ERROR,
"No compatible models found from LM Studio server.",
)
sorted_results = sorted(results, key=lambda m: m.name.lower())
# Sync new models to DB if provider_name is specified
if request.provider_name:
try:
models_to_sync = [
{
"name": r.name,
"display_name": r.display_name,
"max_input_tokens": r.max_input_tokens,
"supports_image_input": r.supports_image_input,
}
for r in sorted_results
]
new_count = sync_model_configurations(
db_session=db_session,
provider_name=request.provider_name,
models=models_to_sync,
)
if new_count > 0:
logger.info(
f"Added {new_count} new LM Studio models to provider '{request.provider_name}'"
)
except ValueError as e:
logger.warning(f"Failed to sync LM Studio models to DB: {e}")
return sorted_results
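The URL cleanup at the top of the removed endpoint (strip whitespace and trailing slashes, then drop a trailing `/v1` copied from OpenAI-compatible client configs) can be sketched standalone; the helper name here is hypothetical:

```python
def clean_lm_studio_base(api_base: str) -> str:
    # Strip surrounding whitespace and trailing slashes first, so a
    # pasted "http://host:1234/v1/" reduces to "http://host:1234/v1".
    cleaned = api_base.strip().rstrip("/")
    # Drop the OpenAI-compatible "/v1" suffix; the native metadata
    # endpoint lives at <base>/api/v1/models, not <base>/v1/api/v1/models.
    return cleaned.removesuffix("/v1")

print(clean_lm_studio_base("http://localhost:1234/v1/"))  # http://localhost:1234
print(clean_lm_studio_base("http://localhost:1234"))      # http://localhost:1234
```

Note `str.removesuffix` requires Python 3.9+; on older interpreters the equivalent is a manual `endswith` check plus slice.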

View File

@@ -371,22 +371,6 @@ class OpenRouterFinalModelResponse(BaseModel):
supports_image_input: bool
# LM Studio dynamic models fetch
class LMStudioModelsRequest(BaseModel):
api_base: str
api_key: str | None = None
api_key_changed: bool = False
provider_name: str | None = None # Optional: to save models to existing provider
class LMStudioFinalModelResponse(BaseModel):
name: str # Model ID from LM Studio (e.g., "lmstudio-community/Meta-Llama-3-8B")
display_name: str # Human-readable name
max_input_tokens: int | None # From LM Studio API or None if unavailable
supports_image_input: bool
supports_reasoning: bool
class DefaultModel(BaseModel):
provider_id: int
model_name: str

View File

@@ -12,7 +12,6 @@ from typing import TypedDict
from onyx.llm.constants import BEDROCK_MODEL_NAME_MAPPINGS
from onyx.llm.constants import LlmProviderNames
from onyx.llm.constants import MODEL_PREFIX_TO_VENDOR
from onyx.llm.constants import OLLAMA_MODEL_NAME_MAPPINGS
from onyx.llm.constants import OLLAMA_MODEL_TO_VENDOR
from onyx.llm.constants import PROVIDER_DISPLAY_NAMES
@@ -24,7 +23,6 @@ DYNAMIC_LLM_PROVIDERS = frozenset(
LlmProviderNames.OPENROUTER,
LlmProviderNames.BEDROCK,
LlmProviderNames.OLLAMA_CHAT,
LlmProviderNames.LM_STUDIO,
}
)
@@ -350,19 +348,4 @@ def extract_vendor_from_model_name(model_name: str, provider: str) -> str | None
# Fallback: capitalize the base name as vendor
return base_name.split("-")[0].title()
elif provider == LlmProviderNames.LM_STUDIO:
# LM Studio model IDs can be paths like "publisher/model-name"
# or simple names. Use MODEL_PREFIX_TO_VENDOR for matching.
model_lower = model_name.lower()
# Check for slash-separated vendor prefix first
if "/" in model_lower:
vendor_key = model_lower.split("/")[0]
return PROVIDER_DISPLAY_NAMES.get(vendor_key, vendor_key.title())
# Fallback to model prefix matching
for prefix, vendor in MODEL_PREFIX_TO_VENDOR.items():
if model_lower.startswith(prefix):
return PROVIDER_DISPLAY_NAMES.get(vendor, vendor.title())
return None
return None

View File

@@ -111,26 +111,19 @@ def _normalize_text_with_mapping(text: str) -> tuple[str, list[int]]:
# Step 1: NFC normalization with position mapping
nfc_text = unicodedata.normalize("NFC", text)
# Map NFD positions → original positions.
# NFD only decomposes, so each original char produces 1+ NFD chars.
nfd_to_orig: list[int] = []
for orig_idx, orig_char in enumerate(original_text):
nfd_of_char = unicodedata.normalize("NFD", orig_char)
for _ in nfd_of_char:
nfd_to_orig.append(orig_idx)
# Map NFC positions → NFD positions.
# Each NFC char, when decomposed, tells us exactly how many NFD
# chars it was composed from.
# Build mapping from NFC positions to original start positions
nfc_to_orig: list[int] = []
nfd_idx = 0
orig_idx = 0
for nfc_char in nfc_text:
if nfd_idx < len(nfd_to_orig):
nfc_to_orig.append(nfd_to_orig[nfd_idx])
nfc_to_orig.append(orig_idx)
# Find how many original chars contributed to this NFC char
for length in range(1, len(original_text) - orig_idx + 1):
substr = original_text[orig_idx : orig_idx + length]
if unicodedata.normalize("NFC", substr) == nfc_char:
orig_idx += length
break
else:
nfc_to_orig.append(len(original_text) - 1)
nfd_of_nfc = unicodedata.normalize("NFD", nfc_char)
nfd_idx += len(nfd_of_nfc)
orig_idx += 1 # Fallback
# Work with NFC text from here
text = nfc_text
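The NFD-based mapping this hunk describes (each original char decomposes to one or more NFD chars; each NFC char, re-decomposed, tells you how many NFD chars it consumed) can be sketched as a self-contained function; the name is hypothetical:

```python
import unicodedata

def nfc_positions_to_original(text: str) -> tuple[str, list[int]]:
    """Map each char of the NFC form of `text` back to an index in `text`."""
    nfc_text = unicodedata.normalize("NFC", text)
    # NFD only decomposes, so each original char yields 1+ NFD chars:
    # record which original index every NFD char came from.
    nfd_to_orig: list[int] = []
    for orig_idx, orig_char in enumerate(text):
        for _ in unicodedata.normalize("NFD", orig_char):
            nfd_to_orig.append(orig_idx)
    # Each NFC char, decomposed again, spans exactly that many NFD chars;
    # map the NFC position to the original index of the first one.
    nfc_to_orig: list[int] = []
    nfd_idx = 0
    for nfc_char in nfc_text:
        if nfd_idx < len(nfd_to_orig):
            nfc_to_orig.append(nfd_to_orig[nfd_idx])
        else:
            nfc_to_orig.append(len(text) - 1)  # defensive fallback
        nfd_idx += len(unicodedata.normalize("NFD", nfc_char))
    return nfc_text, nfc_to_orig

# "e" + U+0301 composes to a single "é"; that NFC char maps back to index 0.
print(nfc_positions_to_original("e\u0301x"))  # ('éx', [0, 2])
```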

View File

@@ -65,7 +65,7 @@ attrs==25.4.0
# jsonschema
# referencing
# zeep
authlib==1.6.7
authlib==1.6.6
# via fastmcp
babel==2.17.0
# via courlan
@@ -109,7 +109,9 @@ brotli==1.2.0
bytecode==0.17.0
# via ddtrace
cachetools==6.2.2
# via py-key-value-aio
# via
# google-auth
# py-key-value-aio
caio==0.9.25
# via aiofile
celery==5.5.1
@@ -188,7 +190,6 @@ courlan==1.3.2
cryptography==46.0.5
# via
# authlib
# google-auth
# msal
# msoffcrypto-tool
# pdfminer-six
@@ -229,7 +230,9 @@ distro==1.9.0
dnspython==2.8.0
# via email-validator
docstring-parser==0.17.0
# via cyclopts
# via
# cyclopts
# google-cloud-aiplatform
docutils==0.22.3
# via rich-rst
dropbox==12.0.2
@@ -294,15 +297,26 @@ gitdb==4.0.12
gitpython==3.1.45
# via braintrust
google-api-core==2.28.1
# via google-api-python-client
# via
# google-api-python-client
# google-cloud-aiplatform
# google-cloud-bigquery
# google-cloud-core
# google-cloud-resource-manager
# google-cloud-storage
google-api-python-client==2.86.0
# via onyx
google-auth==2.48.0
google-auth==2.43.0
# via
# google-api-core
# google-api-python-client
# google-auth-httplib2
# google-auth-oauthlib
# google-cloud-aiplatform
# google-cloud-bigquery
# google-cloud-core
# google-cloud-resource-manager
# google-cloud-storage
# google-genai
# kubernetes
google-auth-httplib2==0.1.0
@@ -311,16 +325,51 @@ google-auth-httplib2==0.1.0
# onyx
google-auth-oauthlib==1.0.0
# via onyx
google-genai==1.52.0
google-cloud-aiplatform==1.121.0
# via onyx
google-cloud-bigquery==3.38.0
# via google-cloud-aiplatform
google-cloud-core==2.5.0
# via
# google-cloud-bigquery
# google-cloud-storage
google-cloud-resource-manager==1.15.0
# via google-cloud-aiplatform
google-cloud-storage==2.19.0
# via google-cloud-aiplatform
google-crc32c==1.7.1
# via
# google-cloud-storage
# google-resumable-media
google-genai==1.52.0
# via
# google-cloud-aiplatform
# onyx
google-resumable-media==2.7.2
# via
# google-cloud-bigquery
# google-cloud-storage
googleapis-common-protos==1.72.0
# via
# google-api-core
# grpc-google-iam-v1
# grpcio-status
# opentelemetry-exporter-otlp-proto-http
greenlet==3.2.4
# via
# playwright
# sqlalchemy
grpc-google-iam-v1==0.14.3
# via google-cloud-resource-manager
grpcio==1.76.0
# via
# google-api-core
# google-cloud-resource-manager
# googleapis-common-protos
# grpc-google-iam-v1
# grpcio-status
grpcio-status==1.76.0
# via google-api-core
h11==0.16.0
# via
# httpcore
@@ -621,6 +670,8 @@ packaging==24.2
# dask
# distributed
# fastmcp
# google-cloud-aiplatform
# google-cloud-bigquery
# huggingface-hub
# jira
# kombu
@@ -670,12 +721,19 @@ propcache==0.4.1
# aiohttp
# yarl
proto-plus==1.26.1
# via google-api-core
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-resource-manager
protobuf==6.33.5
# via
# ddtrace
# google-api-core
# google-cloud-aiplatform
# google-cloud-resource-manager
# googleapis-common-protos
# grpc-google-iam-v1
# grpcio-status
# onnxruntime
# opentelemetry-proto
# proto-plus
@@ -713,6 +771,7 @@ pydantic==2.11.7
# exa-py
# fastapi
# fastmcp
# google-cloud-aiplatform
# google-genai
# langchain-core
# langfuse
@@ -776,6 +835,7 @@ python-dateutil==2.8.2
# botocore
# celery
# dateparser
# google-cloud-bigquery
# htmldate
# hubspot-api-client
# kubernetes
@@ -867,6 +927,8 @@ requests==2.32.5
# dropbox
# exa-py
# google-api-core
# google-cloud-bigquery
# google-cloud-storage
# google-genai
# hubspot-api-client
# huggingface-hub
@@ -940,7 +1002,9 @@ sendgrid==6.12.5
sentry-sdk==2.14.0
# via onyx
shapely==2.0.6
# via onyx
# via
# google-cloud-aiplatform
# onyx
shellingham==1.5.4
# via typer
simple-salesforce==1.12.6
@@ -1054,7 +1118,9 @@ typing-extensions==4.15.0
# exa-py
# exceptiongroup
# fastapi
# google-cloud-aiplatform
# google-genai
# grpcio
# huggingface-hub
# jira
# langchain-core

View File

@@ -59,6 +59,8 @@ botocore==1.39.11
# s3transfer
brotli==1.2.0
# via onyx
cachetools==6.2.2
# via google-auth
celery-types==0.19.0
# via onyx
certifi==2025.11.12
@@ -98,9 +100,7 @@ comm==0.2.3
contourpy==1.3.3
# via matplotlib
cryptography==46.0.5
# via
# google-auth
# pyjwt
# via pyjwt
cycler==0.12.1
# via matplotlib
debugpy==1.8.17
@@ -115,6 +115,8 @@ distlib==0.4.0
# via virtualenv
distro==1.9.0
# via openai
docstring-parser==0.17.0
# via google-cloud-aiplatform
durationpy==0.10
# via kubernetes
execnet==2.1.2
@@ -143,14 +145,65 @@ frozenlist==1.8.0
# aiosignal
fsspec==2025.10.0
# via huggingface-hub
google-auth==2.48.0
google-api-core==2.28.1
# via
# google-cloud-aiplatform
# google-cloud-bigquery
# google-cloud-core
# google-cloud-resource-manager
# google-cloud-storage
google-auth==2.43.0
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-bigquery
# google-cloud-core
# google-cloud-resource-manager
# google-cloud-storage
# google-genai
# kubernetes
google-genai==1.52.0
google-cloud-aiplatform==1.121.0
# via onyx
google-cloud-bigquery==3.38.0
# via google-cloud-aiplatform
google-cloud-core==2.5.0
# via
# google-cloud-bigquery
# google-cloud-storage
google-cloud-resource-manager==1.15.0
# via google-cloud-aiplatform
google-cloud-storage==2.19.0
# via google-cloud-aiplatform
google-crc32c==1.7.1
# via
# google-cloud-storage
# google-resumable-media
google-genai==1.52.0
# via
# google-cloud-aiplatform
# onyx
google-resumable-media==2.7.2
# via
# google-cloud-bigquery
# google-cloud-storage
googleapis-common-protos==1.72.0
# via
# google-api-core
# grpc-google-iam-v1
# grpcio-status
greenlet==3.2.4 ; platform_machine == 'AMD64' or platform_machine == 'WIN32' or platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'ppc64le' or platform_machine == 'win32' or platform_machine == 'x86_64'
# via sqlalchemy
grpc-google-iam-v1==0.14.3
# via google-cloud-resource-manager
grpcio==1.76.0
# via
# google-api-core
# google-cloud-resource-manager
# googleapis-common-protos
# grpc-google-iam-v1
# grpcio-status
grpcio-status==1.76.0
# via google-api-core
h11==0.16.0
# via
# httpcore
@@ -258,12 +311,13 @@ numpy==2.4.1
# contourpy
# matplotlib
# pandas-stubs
# shapely
# voyageai
oauthlib==3.2.2
# via
# kubernetes
# requests-oauthlib
onyx-devtools==0.6.3
onyx-devtools==0.6.2
# via onyx
openai==2.14.0
# via
@@ -276,6 +330,8 @@ openapi-generator-cli==7.17.0
packaging==24.2
# via
# black
# google-cloud-aiplatform
# google-cloud-bigquery
# hatchling
# huggingface-hub
# ipykernel
@@ -318,6 +374,20 @@ propcache==0.4.1
# via
# aiohttp
# yarl
proto-plus==1.26.1
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-resource-manager
protobuf==6.33.5
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-resource-manager
# googleapis-common-protos
# grpc-google-iam-v1
# grpcio-status
# proto-plus
psutil==7.1.3
# via ipykernel
ptyprocess==0.7.0 ; sys_platform != 'emscripten' and sys_platform != 'win32'
@@ -339,6 +409,7 @@ pydantic==2.11.7
# agent-client-protocol
# cohere
# fastapi
# google-cloud-aiplatform
# google-genai
# litellm
# mcp
@@ -379,6 +450,7 @@ python-dateutil==2.8.2
# via
# aiobotocore
# botocore
# google-cloud-bigquery
# jupyter-client
# kubernetes
# matplotlib
@@ -413,6 +485,9 @@ reorder-python-imports-black==3.14.0
requests==2.32.5
# via
# cohere
# google-api-core
# google-cloud-bigquery
# google-cloud-storage
# google-genai
# huggingface-hub
# kubernetes
@@ -435,6 +510,8 @@ s3transfer==0.13.1
# via boto3
sentry-sdk==2.14.0
# via onyx
shapely==2.0.6
# via google-cloud-aiplatform
six==1.17.0
# via
# kubernetes
@@ -525,7 +602,9 @@ typing-extensions==4.15.0
# celery-types
# cohere
# fastapi
# google-cloud-aiplatform
# google-genai
# grpcio
# huggingface-hub
# ipython
# mcp

View File

@@ -53,6 +53,8 @@ botocore==1.39.11
# s3transfer
brotli==1.2.0
# via onyx
cachetools==6.2.2
# via google-auth
certifi==2025.11.12
# via
# httpcore
@@ -77,15 +79,15 @@ colorama==0.4.6 ; sys_platform == 'win32'
# click
# tqdm
cryptography==46.0.5
# via
# google-auth
# pyjwt
# via pyjwt
decorator==5.2.1
# via retry
discord-py==2.4.0
# via onyx
distro==1.9.0
# via openai
docstring-parser==0.17.0
# via google-cloud-aiplatform
durationpy==0.10
# via kubernetes
fastapi==0.133.1
@@ -102,12 +104,63 @@ frozenlist==1.8.0
# aiosignal
fsspec==2025.10.0
# via huggingface-hub
google-auth==2.48.0
google-api-core==2.28.1
# via
# google-cloud-aiplatform
# google-cloud-bigquery
# google-cloud-core
# google-cloud-resource-manager
# google-cloud-storage
google-auth==2.43.0
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-bigquery
# google-cloud-core
# google-cloud-resource-manager
# google-cloud-storage
# google-genai
# kubernetes
google-genai==1.52.0
google-cloud-aiplatform==1.121.0
# via onyx
google-cloud-bigquery==3.38.0
# via google-cloud-aiplatform
google-cloud-core==2.5.0
# via
# google-cloud-bigquery
# google-cloud-storage
google-cloud-resource-manager==1.15.0
# via google-cloud-aiplatform
google-cloud-storage==2.19.0
# via google-cloud-aiplatform
google-crc32c==1.7.1
# via
# google-cloud-storage
# google-resumable-media
google-genai==1.52.0
# via
# google-cloud-aiplatform
# onyx
google-resumable-media==2.7.2
# via
# google-cloud-bigquery
# google-cloud-storage
googleapis-common-protos==1.72.0
# via
# google-api-core
# grpc-google-iam-v1
# grpcio-status
grpc-google-iam-v1==0.14.3
# via google-cloud-resource-manager
grpcio==1.76.0
# via
# google-api-core
# google-cloud-resource-manager
# googleapis-common-protos
# grpc-google-iam-v1
# grpcio-status
grpcio-status==1.76.0
# via google-api-core
h11==0.16.0
# via
# httpcore
@@ -168,7 +221,9 @@ multidict==6.7.0
# aiohttp
# yarl
numpy==2.4.1
# via voyageai
# via
# shapely
# voyageai
oauthlib==3.2.2
# via
# kubernetes
@@ -178,7 +233,10 @@ openai==2.14.0
# litellm
# onyx
packaging==24.2
# via huggingface-hub
# via
# google-cloud-aiplatform
# google-cloud-bigquery
# huggingface-hub
parameterized==0.9.0
# via cohere
posthog==3.7.4
@@ -193,6 +251,20 @@ propcache==0.4.1
# via
# aiohttp
# yarl
proto-plus==1.26.1
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-resource-manager
protobuf==6.33.5
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-resource-manager
# googleapis-common-protos
# grpc-google-iam-v1
# grpcio-status
# proto-plus
py==1.11.0
# via retry
pyasn1==0.6.2
@@ -208,6 +280,7 @@ pydantic==2.11.7
# agent-client-protocol
# cohere
# fastapi
# google-cloud-aiplatform
# google-genai
# litellm
# mcp
@@ -224,6 +297,7 @@ python-dateutil==2.8.2
# via
# aiobotocore
# botocore
# google-cloud-bigquery
# kubernetes
# posthog
python-dotenv==1.1.1
@@ -247,6 +321,9 @@ regex==2025.11.3
requests==2.32.5
# via
# cohere
# google-api-core
# google-cloud-bigquery
# google-cloud-storage
# google-genai
# huggingface-hub
# kubernetes
@@ -268,6 +345,8 @@ s3transfer==0.13.1
# via boto3
sentry-sdk==2.14.0
# via onyx
shapely==2.0.6
# via google-cloud-aiplatform
six==1.17.0
# via
# kubernetes
@@ -306,7 +385,9 @@ typing-extensions==4.15.0
# anyio
# cohere
# fastapi
# google-cloud-aiplatform
# google-genai
# grpcio
# huggingface-hub
# mcp
# openai

View File

@@ -57,6 +57,8 @@ botocore==1.39.11
# s3transfer
brotli==1.2.0
# via onyx
cachetools==6.2.2
# via google-auth
celery==5.5.1
# via sentry-sdk
certifi==2025.11.12
@@ -93,15 +95,15 @@ colorama==0.4.6 ; sys_platform == 'win32'
# click
# tqdm
cryptography==46.0.5
# via
# google-auth
# pyjwt
# via pyjwt
decorator==5.2.1
# via retry
discord-py==2.4.0
# via onyx
distro==1.9.0
# via openai
docstring-parser==0.17.0
# via google-cloud-aiplatform
durationpy==0.10
# via kubernetes
einops==0.8.1
@@ -127,12 +129,63 @@ fsspec==2025.10.0
# via
# huggingface-hub
# torch
google-auth==2.48.0
google-api-core==2.28.1
# via
# google-cloud-aiplatform
# google-cloud-bigquery
# google-cloud-core
# google-cloud-resource-manager
# google-cloud-storage
google-auth==2.43.0
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-bigquery
# google-cloud-core
# google-cloud-resource-manager
# google-cloud-storage
# google-genai
# kubernetes
google-genai==1.52.0
google-cloud-aiplatform==1.121.0
# via onyx
google-cloud-bigquery==3.38.0
# via google-cloud-aiplatform
google-cloud-core==2.5.0
# via
# google-cloud-bigquery
# google-cloud-storage
google-cloud-resource-manager==1.15.0
# via google-cloud-aiplatform
google-cloud-storage==2.19.0
# via google-cloud-aiplatform
google-crc32c==1.7.1
# via
# google-cloud-storage
# google-resumable-media
google-genai==1.52.0
# via
# google-cloud-aiplatform
# onyx
google-resumable-media==2.7.2
# via
# google-cloud-bigquery
# google-cloud-storage
googleapis-common-protos==1.72.0
# via
# google-api-core
# grpc-google-iam-v1
# grpcio-status
grpc-google-iam-v1==0.14.3
# via google-cloud-resource-manager
grpcio==1.76.0
# via
# google-api-core
# google-cloud-resource-manager
# googleapis-common-protos
# grpc-google-iam-v1
# grpcio-status
grpcio-status==1.76.0
# via google-api-core
h11==0.16.0
# via
# httpcore
@@ -210,6 +263,7 @@ numpy==2.4.1
# onyx
# scikit-learn
# scipy
# shapely
# transformers
# voyageai
nvidia-cublas-cu12==12.8.4.1 ; platform_machine == 'x86_64' and sys_platform == 'linux'
@@ -262,6 +316,8 @@ openai==2.14.0
packaging==24.2
# via
# accelerate
# google-cloud-aiplatform
# google-cloud-bigquery
# huggingface-hub
# kombu
# transformers
@@ -281,6 +337,20 @@ propcache==0.4.1
# via
# aiohttp
# yarl
proto-plus==1.26.1
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-resource-manager
protobuf==6.33.5
# via
# google-api-core
# google-cloud-aiplatform
# google-cloud-resource-manager
# googleapis-common-protos
# grpc-google-iam-v1
# grpcio-status
# proto-plus
psutil==7.1.3
# via accelerate
py==1.11.0
@@ -298,6 +368,7 @@ pydantic==2.11.7
# agent-client-protocol
# cohere
# fastapi
# google-cloud-aiplatform
# google-genai
# litellm
# mcp
@@ -315,6 +386,7 @@ python-dateutil==2.8.2
# aiobotocore
# botocore
# celery
# google-cloud-bigquery
# kubernetes
python-dotenv==1.1.1
# via
@@ -341,6 +413,9 @@ regex==2025.11.3
requests==2.32.5
# via
# cohere
# google-api-core
# google-cloud-bigquery
# google-cloud-storage
# google-genai
# huggingface-hub
# kubernetes
@@ -377,6 +452,8 @@ sentry-sdk==2.14.0
# via onyx
setuptools==80.9.0 ; python_full_version >= '3.12'
# via torch
shapely==2.0.6
# via google-cloud-aiplatform
six==1.17.0
# via
# kubernetes
@@ -433,7 +510,9 @@ typing-extensions==4.15.0
# anyio
# cohere
# fastapi
# google-cloud-aiplatform
# google-genai
# grpcio
# huggingface-hub
# mcp
# openai

View File

@@ -1,171 +0,0 @@
#!/usr/bin/env python3
"""A utility to interact with OpenSearch.
Usage:
python3 opensearch_debug.py --help
python3 opensearch_debug.py list
python3 opensearch_debug.py delete <index_name>
Environment Variables:
OPENSEARCH_HOST: OpenSearch host
OPENSEARCH_REST_API_PORT: OpenSearch port
OPENSEARCH_ADMIN_USERNAME: Admin username
OPENSEARCH_ADMIN_PASSWORD: Admin password
Dependencies:
backend/shared_configs/configs.py
backend/onyx/document_index/opensearch/client.py
"""
import argparse
import os
import sys
from onyx.document_index.opensearch.client import OpenSearchClient
from onyx.document_index.opensearch.client import OpenSearchIndexClient
from shared_configs.configs import MULTI_TENANT
def list_indices(client: OpenSearchClient) -> None:
indices = client.list_indices_with_info()
print(f"Found {len(indices)} indices.")
print("-" * 80)
for index in sorted(indices, key=lambda x: x.name):
print(f"Index: {index.name}")
print(f"Health: {index.health}")
print(f"Status: {index.status}")
print(f"Num Primary Shards: {index.num_primary_shards}")
print(f"Num Replica Shards: {index.num_replica_shards}")
print(f"Docs Count: {index.docs_count}")
print(f"Docs Deleted: {index.docs_deleted}")
print(f"Created At: {index.created_at}")
print(f"Total Size: {index.total_size}")
print(f"Primary Shards Size: {index.primary_shards_size}")
print("-" * 80)
def delete_index(client: OpenSearchIndexClient) -> None:
if not client.index_exists():
print(f"Index '{client._index_name}' does not exist.")
return
confirm = input(f"Delete index '{client._index_name}'? (yes/no): ")
if confirm.lower() != "yes":
print("Aborted.")
return
if client.delete_index():
print(f"Deleted index '{client._index_name}'.")
else:
print(f"Failed to delete index '{client._index_name}' for an unknown reason.")
def main() -> None:
def add_standard_arguments(parser: argparse.ArgumentParser) -> None:
parser.add_argument(
"--host",
help="OpenSearch host. If not provided, will fall back to OPENSEARCH_HOST, then prompt "
"for input.",
type=str,
default=os.environ.get("OPENSEARCH_HOST", ""),
)
parser.add_argument(
"--port",
help="OpenSearch port. If not provided, will fall back to OPENSEARCH_REST_API_PORT, "
"then prompt for input.",
type=int,
default=int(os.environ.get("OPENSEARCH_REST_API_PORT", 0)),
)
parser.add_argument(
"--username",
help="OpenSearch username. If not provided, will fall back to OPENSEARCH_ADMIN_USERNAME, "
"then prompt for input.",
type=str,
default=os.environ.get("OPENSEARCH_ADMIN_USERNAME", ""),
)
parser.add_argument(
"--password",
help="OpenSearch password. If not provided, will fall back to OPENSEARCH_ADMIN_PASSWORD, "
"then prompt for input.",
type=str,
default=os.environ.get("OPENSEARCH_ADMIN_PASSWORD", ""),
)
parser.add_argument(
"--no-ssl", help="Disable SSL.", action="store_true", default=False
)
parser.add_argument(
"--no-verify-certs",
help="Disable certificate verification (for self-signed certs).",
action="store_true",
default=False,
)
parser.add_argument(
"--use-aws-managed-opensearch",
help="Whether to use AWS-managed OpenSearch. If not provided, will fall back to checking "
"USING_AWS_MANAGED_OPENSEARCH=='true', then default to False.",
action=argparse.BooleanOptionalAction,
default=os.environ.get("USING_AWS_MANAGED_OPENSEARCH", "").lower()
== "true",
)
parser = argparse.ArgumentParser(
description="A utility to interact with OpenSearch."
)
subparsers = parser.add_subparsers(
dest="command", help="Command to execute.", required=True
)
list_parser = subparsers.add_parser("list", help="List all indices with info.")
add_standard_arguments(list_parser)
delete_parser = subparsers.add_parser("delete", help="Delete an index.")
delete_parser.add_argument("index", help="Index name.", type=str)
add_standard_arguments(delete_parser)
args = parser.parse_args()
if not (host := args.host or input("Enter the OpenSearch host: ")):
print("Error: OpenSearch host is required.")
sys.exit(1)
if not (port := args.port or int(input("Enter the OpenSearch port: "))):
print("Error: OpenSearch port is required.")
sys.exit(1)
if not (username := args.username or input("Enter the OpenSearch username: ")):
print("Error: OpenSearch username is required.")
sys.exit(1)
if not (password := args.password or input("Enter the OpenSearch password: ")):
print("Error: OpenSearch password is required.")
sys.exit(1)
print("Using AWS-managed OpenSearch: ", args.use_aws_managed_opensearch)
print(f"MULTI_TENANT: {MULTI_TENANT}")
with (
OpenSearchIndexClient(
index_name=args.index,
host=host,
port=port,
auth=(username, password),
use_ssl=not args.no_ssl,
verify_certs=not args.no_verify_certs,
)
if args.command == "delete"
else OpenSearchClient(
host=host,
port=port,
auth=(username, password),
use_ssl=not args.no_ssl,
verify_certs=not args.no_verify_certs,
)
) as client:
if not client.ping():
print("Error: Could not connect to OpenSearch.")
sys.exit(1)
if args.command == "list":
list_indices(client)
elif args.command == "delete":
delete_index(client)
if __name__ == "__main__":
main()
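The flag → environment → interactive-prompt resolution that `main()` applies to each connection setting can be sketched in isolation. This is a minimal illustration of the pattern, not code from the script; the helper name and prompt text are made up:

```python
import os


def resolve_setting(flag_value: str, env_var: str, prompt: str) -> str:
    """Resolve one setting the way the utility does: an explicit CLI
    flag wins, then the environment variable, and only then does the
    user get prompted interactively."""
    if flag_value:
        return flag_value
    env_value = os.environ.get(env_var, "")
    if env_value:
        return env_value
    return input(prompt)
```

For example, the host would resolve as `resolve_setting(args.host, "OPENSEARCH_HOST", "Enter the OpenSearch host: ")`. Baking the environment lookup into the argparse `default` (as the script does) achieves the same precedence with one fewer step.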


@@ -145,10 +145,6 @@ class TestDocprocessingPriorityInDocumentExtraction:
@patch("onyx.background.indexing.run_docfetching.get_document_batch_storage")
@patch("onyx.background.indexing.run_docfetching.MemoryTracer")
@patch("onyx.background.indexing.run_docfetching._get_connector_runner")
@patch(
"onyx.background.indexing.run_docfetching.strip_null_characters",
side_effect=lambda batch: batch,
)
@patch(
"onyx.background.indexing.run_docfetching.get_recent_completed_attempts_for_cc_pair"
)
@@ -173,7 +169,6 @@ class TestDocprocessingPriorityInDocumentExtraction:
mock_save_checkpoint: MagicMock, # noqa: ARG002
mock_get_last_successful_attempt_poll_range_end: MagicMock,
mock_get_recent_completed_attempts: MagicMock,
mock_strip_null_characters: MagicMock, # noqa: ARG002
mock_get_connector_runner: MagicMock,
mock_memory_tracer_class: MagicMock,
mock_get_batch_storage: MagicMock,


@@ -698,99 +698,6 @@ class TestAutoModeMissingFlows:
class TestAutoModeTransitionsAndResync:
"""Tests for auto/manual transitions, config evolution, and sync idempotency."""
def test_transition_to_auto_mode_preserves_default(
self,
db_session: Session,
provider_name: str,
) -> None:
"""When the default provider transitions from manual to auto mode,
the global default should be preserved (set to the recommended model).
Steps:
1. Create a manual-mode provider with models, set it as global default.
2. Transition to auto mode (model_configurations=[] triggers cascade
delete of old ModelConfigurations and their LLMModelFlow rows).
3. Verify the provider is still the global default, now using the
recommended default model from the GitHub config.
"""
initial_models = [
ModelConfigurationUpsertRequest(name="gpt-4o", is_visible=True),
ModelConfigurationUpsertRequest(name="gpt-4o-mini", is_visible=True),
]
auto_config = _create_mock_llm_recommendations(
provider=LlmProviderNames.OPENAI,
default_model_name="gpt-4o-mini",
additional_models=["gpt-4o"],
)
try:
# Step 1: Create manual-mode provider and set as default
put_llm_provider(
llm_provider_upsert_request=LLMProviderUpsertRequest(
name=provider_name,
provider=LlmProviderNames.OPENAI,
api_key="sk-test-key-00000000000000000000000000000000000",
api_key_changed=True,
is_auto_mode=False,
model_configurations=initial_models,
),
is_creation=True,
_=_create_mock_admin(),
db_session=db_session,
)
db_session.expire_all()
provider = fetch_existing_llm_provider(
name=provider_name, db_session=db_session
)
assert provider is not None
update_default_provider(provider.id, "gpt-4o", db_session)
default_before = fetch_default_llm_model(db_session)
assert default_before is not None
assert default_before.name == "gpt-4o"
assert default_before.llm_provider_id == provider.id
# Step 2: Transition to auto mode
with patch(
"onyx.server.manage.llm.api.fetch_llm_recommendations_from_github",
return_value=auto_config,
):
put_llm_provider(
llm_provider_upsert_request=LLMProviderUpsertRequest(
id=provider.id,
name=provider_name,
provider=LlmProviderNames.OPENAI,
api_key=None,
api_key_changed=False,
is_auto_mode=True,
model_configurations=[],
),
is_creation=False,
_=_create_mock_admin(),
db_session=db_session,
)
# Step 3: Default should be preserved on this provider
db_session.expire_all()
default_after = fetch_default_llm_model(db_session)
assert default_after is not None, (
"Default model should not be None after transitioning to auto mode — "
"the provider was the default before and should remain so"
)
assert (
default_after.llm_provider_id == provider.id
), "Default should still belong to the same provider after transition"
assert default_after.name == "gpt-4o-mini", (
f"Default should be updated to the recommended model 'gpt-4o-mini', "
f"got '{default_after.name}'"
)
finally:
db_session.rollback()
_cleanup_provider(db_session, provider_name)
def test_auto_to_manual_mode_preserves_models_and_stops_syncing(
self,
db_session: Session,
@@ -1135,19 +1042,14 @@ class TestAutoModeTransitionsAndResync:
assert visibility["gpt-4o"] is False, "Removed default should be hidden"
assert visibility["gpt-4o-mini"] is True, "New default should be visible"
# The old default (gpt-4o) is now hidden. sync_auto_mode_models
# should update the global default to the new recommended default
# (gpt-4o-mini) so that it is not silently lost.
# The LLMModelFlow row for gpt-4o still exists (is_default=True),
# but the model is hidden. fetch_default_llm_model filters on
# is_visible=True, so it should NOT return gpt-4o.
db_session.expire_all()
default_after = fetch_default_llm_model(db_session)
assert default_after is not None, (
"Default model should not be None — sync should set the new "
"recommended default when the old one is hidden"
)
assert default_after.name == "gpt-4o-mini", (
f"Default should be updated to the new recommended model "
f"'gpt-4o-mini', but got '{default_after.name}'"
)
assert (
default_after is None or default_after.name != "gpt-4o"
), "Hidden model should not be returned as the default"
finally:
db_session.rollback()


@@ -21,8 +21,6 @@ from onyx.db.oauth_config import get_tools_by_oauth_config
from onyx.db.oauth_config import get_user_oauth_token
from onyx.db.oauth_config import update_oauth_config
from onyx.db.oauth_config import upsert_user_oauth_token
from onyx.db.tools import delete_tool__no_commit
from onyx.db.tools import update_tool
from tests.external_dependency_unit.conftest import create_test_user
@@ -314,85 +312,6 @@ class TestOAuthConfigCRUD:
# Tool should still exist but oauth_config_id should be NULL
assert tool.oauth_config_id is None
def test_update_tool_cleans_up_orphaned_oauth_config(
self, db_session: Session
) -> None:
"""Test that changing a tool's oauth_config_id deletes the old config if no other tool uses it."""
old_config = _create_test_oauth_config(db_session)
new_config = _create_test_oauth_config(db_session)
tool = _create_test_tool_with_oauth(db_session, old_config)
old_config_id = old_config.id
update_tool(
tool_id=tool.id,
name=None,
description=None,
openapi_schema=None,
custom_headers=None,
user_id=None,
db_session=db_session,
passthrough_auth=None,
oauth_config_id=new_config.id,
)
assert tool.oauth_config_id == new_config.id
assert get_oauth_config(old_config_id, db_session) is None
def test_delete_tool_cleans_up_orphaned_oauth_config(
self, db_session: Session
) -> None:
"""Test that deleting the last tool referencing an OAuthConfig also deletes the config."""
config = _create_test_oauth_config(db_session)
tool = _create_test_tool_with_oauth(db_session, config)
config_id = config.id
delete_tool__no_commit(tool.id, db_session)
db_session.commit()
assert get_oauth_config(config_id, db_session) is None
def test_update_tool_preserves_shared_oauth_config(
self, db_session: Session
) -> None:
"""Test that updating one tool's oauth_config_id preserves the config when another tool still uses it."""
shared_config = _create_test_oauth_config(db_session)
new_config = _create_test_oauth_config(db_session)
tool_a = _create_test_tool_with_oauth(db_session, shared_config)
tool_b = _create_test_tool_with_oauth(db_session, shared_config)
shared_config_id = shared_config.id
# Move tool_a to a new config; tool_b still references shared_config
update_tool(
tool_id=tool_a.id,
name=None,
description=None,
openapi_schema=None,
custom_headers=None,
user_id=None,
db_session=db_session,
passthrough_auth=None,
oauth_config_id=new_config.id,
)
assert tool_a.oauth_config_id == new_config.id
assert tool_b.oauth_config_id == shared_config_id
assert get_oauth_config(shared_config_id, db_session) is not None
def test_delete_tool_preserves_shared_oauth_config(
self, db_session: Session
) -> None:
"""Test that deleting one tool preserves the config when another tool still uses it."""
shared_config = _create_test_oauth_config(db_session)
tool_a = _create_test_tool_with_oauth(db_session, shared_config)
tool_b = _create_test_tool_with_oauth(db_session, shared_config)
shared_config_id = shared_config.id
delete_tool__no_commit(tool_a.id, db_session)
db_session.commit()
assert tool_b.oauth_config_id == shared_config_id
assert get_oauth_config(shared_config_id, db_session) is not None
class TestOAuthUserTokenCRUD:
"""Tests for OAuth user token CRUD operations"""


@@ -2,6 +2,7 @@
import pytest
from onyx.chat.llm_loop import _should_keep_bedrock_tool_definitions
from onyx.chat.llm_loop import _try_fallback_tool_extraction
from onyx.chat.llm_loop import construct_message_history
from onyx.chat.models import ChatLoadedFile
@@ -13,11 +14,22 @@ from onyx.chat.models import LlmStepResult
from onyx.chat.models import ToolCallSimple
from onyx.configs.constants import MessageType
from onyx.file_store.models import ChatFileType
from onyx.llm.constants import LlmProviderNames
from onyx.llm.interfaces import ToolChoiceOptions
from onyx.server.query_and_chat.placement import Placement
from onyx.tools.models import ToolCallKickoff
class _StubConfig:
def __init__(self, model_provider: str) -> None:
self.model_provider = model_provider
class _StubLLM:
def __init__(self, model_provider: str) -> None:
self.config = _StubConfig(model_provider=model_provider)
def create_message(
content: str, message_type: MessageType, token_count: int | None = None
) -> ChatMessageSimple:
@@ -934,6 +946,37 @@ class TestForgottenFileMetadata:
assert "moby_dick.txt" in forgotten.message
class TestBedrockToolConfigGuard:
def test_bedrock_with_tool_history_keeps_tool_definitions(self) -> None:
llm = _StubLLM(LlmProviderNames.BEDROCK)
history = [
create_message("Question", MessageType.USER, 5),
create_assistant_with_tool_call("tc_1", "search", 5),
create_tool_response("tc_1", "Tool output", 5),
]
assert _should_keep_bedrock_tool_definitions(llm, history) is True
def test_bedrock_without_tool_history_does_not_keep_tool_definitions(self) -> None:
llm = _StubLLM(LlmProviderNames.BEDROCK)
history = [
create_message("Question", MessageType.USER, 5),
create_message("Answer", MessageType.ASSISTANT, 5),
]
assert _should_keep_bedrock_tool_definitions(llm, history) is False
def test_non_bedrock_with_tool_history_does_not_keep_tool_definitions(self) -> None:
llm = _StubLLM(LlmProviderNames.OPENAI)
history = [
create_message("Question", MessageType.USER, 5),
create_assistant_with_tool_call("tc_1", "search", 5),
create_tool_response("tc_1", "Tool output", 5),
]
assert _should_keep_bedrock_tool_definitions(llm, history) is False
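The guard these three tests exercise can be approximated with plain dict messages. This is a hedged sketch of the predicate's observable behavior, not the actual `_should_keep_bedrock_tool_definitions` implementation; the provider string and message shape are assumptions:

```python
def should_keep_tool_definitions(provider: str, history: list[dict]) -> bool:
    """Keep tool definitions only for Bedrock, and only when the
    history already contains tool activity (an assistant tool call or
    a tool response); otherwise they can be dropped."""
    if provider != "bedrock":
        return False
    return any(
        msg.get("tool_calls") or msg.get("role") == "tool" for msg in history
    )
```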
class TestFallbackToolExtraction:
def _tool_defs(self) -> list[dict]:
return [


@@ -8,6 +8,7 @@ from onyx.chat.llm_step import _extract_tool_call_kickoffs
from onyx.chat.llm_step import _increment_turns
from onyx.chat.llm_step import _parse_tool_args_to_dict
from onyx.chat.llm_step import _resolve_tool_arguments
from onyx.chat.llm_step import _sanitize_llm_output
from onyx.chat.llm_step import _XmlToolCallContentFilter
from onyx.chat.llm_step import extract_tool_calls_from_response_text
from onyx.chat.llm_step import translate_history_to_llm_format
@@ -20,49 +21,48 @@ from onyx.llm.models import AssistantMessage
from onyx.llm.models import ToolMessage
from onyx.llm.models import UserMessage
from onyx.server.query_and_chat.placement import Placement
from onyx.utils.postgres_sanitization import sanitize_string
class TestSanitizeLlmOutput:
"""Tests for the sanitize_string function."""
"""Tests for the _sanitize_llm_output function."""
def test_removes_null_bytes(self) -> None:
"""Test that NULL bytes are removed from strings."""
assert sanitize_string("hello\x00world") == "helloworld"
assert sanitize_string("\x00start") == "start"
assert sanitize_string("end\x00") == "end"
assert sanitize_string("\x00\x00\x00") == ""
assert _sanitize_llm_output("hello\x00world") == "helloworld"
assert _sanitize_llm_output("\x00start") == "start"
assert _sanitize_llm_output("end\x00") == "end"
assert _sanitize_llm_output("\x00\x00\x00") == ""
def test_removes_surrogates(self) -> None:
"""Test that UTF-16 surrogates are removed from strings."""
# Low surrogate
assert sanitize_string("hello\ud800world") == "helloworld"
assert _sanitize_llm_output("hello\ud800world") == "helloworld"
# High surrogate
assert sanitize_string("hello\udfffworld") == "helloworld"
assert _sanitize_llm_output("hello\udfffworld") == "helloworld"
# Middle of surrogate range
assert sanitize_string("test\uda00value") == "testvalue"
assert _sanitize_llm_output("test\uda00value") == "testvalue"
def test_removes_mixed_bad_characters(self) -> None:
"""Test removal of both NULL bytes and surrogates together."""
assert sanitize_string("a\x00b\ud800c\udfffd") == "abcd"
assert _sanitize_llm_output("a\x00b\ud800c\udfffd") == "abcd"
def test_preserves_valid_unicode(self) -> None:
"""Test that valid Unicode characters are preserved."""
# Emojis
assert sanitize_string("hello 👋 world") == "hello 👋 world"
assert _sanitize_llm_output("hello 👋 world") == "hello 👋 world"
# Chinese characters
assert sanitize_string("你好世界") == "你好世界"
assert _sanitize_llm_output("你好世界") == "你好世界"
# Mixed scripts
assert sanitize_string("Hello мир 世界") == "Hello мир 世界"
assert _sanitize_llm_output("Hello мир 世界") == "Hello мир 世界"
def test_empty_string(self) -> None:
"""Test that empty strings are handled correctly."""
assert sanitize_string("") == ""
assert _sanitize_llm_output("") == ""
def test_normal_ascii(self) -> None:
"""Test that normal ASCII strings pass through unchanged."""
assert sanitize_string("hello world") == "hello world"
assert sanitize_string('{"key": "value"}') == '{"key": "value"}'
assert _sanitize_llm_output("hello world") == "hello world"
assert _sanitize_llm_output('{"key": "value"}') == '{"key": "value"}'
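The behavior these tests pin down is: NUL bytes and lone UTF-16 surrogates are dropped, everything else (emoji, non-Latin scripts, plain ASCII) passes through. A standalone sketch, assuming the real `_sanitize_llm_output` does no more than that:

```python
def sanitize_llm_output(text: str) -> str:
    """Drop NUL bytes and lone UTF-16 surrogates (U+D800..U+DFFF),
    which PostgreSQL text columns reject, while preserving all other
    characters."""
    return "".join(
        ch
        for ch in text
        if ch != "\x00" and not ("\ud800" <= ch <= "\udfff")
    )
```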
class TestParseToolArgsToDict:


@@ -1,13 +1,10 @@
"""Tests for save_chat.py.
"""Tests for _extract_referenced_file_descriptors in save_chat.py.
Covers _extract_referenced_file_descriptors and sanitization in save_chat_turn.
Verifies that only code interpreter generated files actually referenced
in the assistant's message text are extracted as FileDescriptors for
cross-turn persistence.
"""
from unittest.mock import MagicMock
from pytest import MonkeyPatch
from onyx.chat import save_chat
from onyx.chat.save_chat import _extract_referenced_file_descriptors
from onyx.file_store.models import ChatFileType
from onyx.tools.models import PythonExecutionFile
@@ -32,9 +29,6 @@ def _make_tool_call_info(
)
# ---- _extract_referenced_file_descriptors tests ----
def test_returns_empty_when_no_generated_files() -> None:
tool_call = _make_tool_call_info(generated_files=None)
result = _extract_referenced_file_descriptors([tool_call], "some message")
@@ -182,34 +176,3 @@ def test_skips_tool_calls_without_generated_files() -> None:
assert len(result) == 1
assert result[0]["id"] == file_id
# ---- save_chat_turn sanitization test ----
def test_save_chat_turn_sanitizes_message_and_reasoning(
monkeypatch: MonkeyPatch,
) -> None:
mock_tokenizer = MagicMock()
mock_tokenizer.encode.return_value = [1, 2, 3]
monkeypatch.setattr(save_chat, "get_tokenizer", lambda *_a, **_kw: mock_tokenizer)
mock_msg = MagicMock()
mock_msg.id = 1
mock_msg.chat_session_id = "test"
mock_msg.files = None
mock_session = MagicMock()
save_chat.save_chat_turn(
message_text="hello\x00world\ud800",
reasoning_tokens="think\x00ing\udfff",
tool_calls=[],
citation_to_doc={},
all_search_docs={},
db_session=mock_session,
assistant_message=mock_msg,
)
assert mock_msg.message == "helloworld"
assert mock_msg.reasoning_tokens == "thinking"


@@ -1,27 +0,0 @@
from unittest.mock import MagicMock
from uuid import uuid4
from onyx.db import tools as tools_mod
def test_create_tool_call_no_commit_sanitizes_fields() -> None:
mock_session = MagicMock()
tool_call = tools_mod.create_tool_call_no_commit(
chat_session_id=uuid4(),
parent_chat_message_id=1,
turn_number=0,
tool_id=1,
tool_call_id="tc-1",
tool_call_arguments={"task\x00": "research\ud800 topic"},
tool_call_response="report\x00 text\udfff here",
tool_call_tokens=10,
db_session=mock_session,
reasoning_tokens="reason\x00ing\ud800",
generated_images=[{"url": "img\x00.png\udfff"}],
)
assert tool_call.tool_call_response == "report text here"
assert tool_call.reasoning_tokens == "reasoning"
assert tool_call.tool_call_arguments == {"task": "research topic"}
assert tool_call.generated_images == [{"url": "img.png"}]


@@ -9,79 +9,8 @@ from onyx.connectors.models import IndexAttemptMetadata
from onyx.connectors.models import TextSection
from onyx.db.enums import HierarchyNodeType
from onyx.indexing import indexing_pipeline
from onyx.utils.postgres_sanitization import sanitize_document_for_postgres
from onyx.utils.postgres_sanitization import sanitize_hierarchy_node_for_postgres
from onyx.utils.postgres_sanitization import sanitize_json_like
from onyx.utils.postgres_sanitization import sanitize_string
# ---- sanitize_string tests ----
def test_sanitize_string_strips_nul_bytes() -> None:
assert sanitize_string("hello\x00world") == "helloworld"
assert sanitize_string("\x00\x00\x00") == ""
assert sanitize_string("clean") == "clean"
def test_sanitize_string_strips_high_surrogates() -> None:
assert sanitize_string("before\ud800after") == "beforeafter"
assert sanitize_string("a\udbffb") == "ab"
def test_sanitize_string_strips_low_surrogates() -> None:
assert sanitize_string("before\udc00after") == "beforeafter"
assert sanitize_string("a\udfffb") == "ab"
def test_sanitize_string_strips_nul_and_surrogates_together() -> None:
assert sanitize_string("he\x00llo\ud800 wo\udfffrld\x00") == "hello world"
def test_sanitize_string_preserves_valid_unicode() -> None:
assert sanitize_string("café ☕ 日本語 😀") == "café ☕ 日本語 😀"
def test_sanitize_string_empty_input() -> None:
assert sanitize_string("") == ""
# ---- sanitize_json_like tests ----
def test_sanitize_json_like_handles_plain_string() -> None:
assert sanitize_json_like("he\x00llo\ud800") == "hello"
def test_sanitize_json_like_handles_nested_dict() -> None:
dirty = {
"ke\x00y": "va\ud800lue",
"nested": {"inne\x00r": "de\udfffep"},
}
assert sanitize_json_like(dirty) == {
"key": "value",
"nested": {"inner": "deep"},
}
def test_sanitize_json_like_handles_list_with_surrogates() -> None:
dirty = ["a\x00", "b\ud800", {"c\udc00": "d\udfff"}]
assert sanitize_json_like(dirty) == ["a", "b", {"c": "d"}]
def test_sanitize_json_like_handles_tuple() -> None:
dirty = ("a\x00", "b\ud800")
assert sanitize_json_like(dirty) == ("a", "b")
def test_sanitize_json_like_passes_through_non_strings() -> None:
assert sanitize_json_like(42) == 42
assert sanitize_json_like(3.14) == 3.14
assert sanitize_json_like(True) is True
assert sanitize_json_like(None) is None
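The recursive behavior covered by the `sanitize_json_like` tests above can be sketched as follows; this is an illustrative reimplementation matching the assertions, not the module's code:

```python
def sanitize_json_like(value):
    """Recursively sanitize strings inside JSON-like containers,
    preserving container types (dict keys and values, lists, tuples)
    and passing non-string scalars through untouched."""
    if isinstance(value, str):
        return "".join(
            c for c in value if c != "\x00" and not ("\ud800" <= c <= "\udfff")
        )
    if isinstance(value, dict):
        return {
            sanitize_json_like(k): sanitize_json_like(v)
            for k, v in value.items()
        }
    if isinstance(value, (list, tuple)):
        return type(value)(sanitize_json_like(v) for v in value)
    return value
```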
# ---- sanitize_document_for_postgres tests ----
from onyx.indexing.postgres_sanitization import sanitize_document_for_postgres
from onyx.indexing.postgres_sanitization import sanitize_hierarchy_node_for_postgres
def test_sanitize_document_for_postgres_removes_nul_bytes() -> None:


@@ -1214,218 +1214,3 @@ def test_multithreaded_invoke_without_custom_config_skips_env_lock() -> None:
# The env lock context manager should never have been called
mock_env_lock.assert_not_called()
# ---- Tests for Bedrock tool content stripping ----
def test_messages_contain_tool_content_with_tool_role() -> None:
from onyx.llm.multi_llm import _messages_contain_tool_content
messages: list[dict[str, Any]] = [
{"role": "user", "content": "Hello"},
{"role": "assistant", "content": "I'll search for that."},
{"role": "tool", "content": "search results", "tool_call_id": "tc_1"},
]
assert _messages_contain_tool_content(messages) is True
def test_messages_contain_tool_content_with_tool_calls() -> None:
from onyx.llm.multi_llm import _messages_contain_tool_content
messages: list[dict[str, Any]] = [
{"role": "user", "content": "Hello"},
{
"role": "assistant",
"content": None,
"tool_calls": [
{
"id": "tc_1",
"type": "function",
"function": {"name": "search", "arguments": "{}"},
}
],
},
]
assert _messages_contain_tool_content(messages) is True
def test_messages_contain_tool_content_without_tools() -> None:
from onyx.llm.multi_llm import _messages_contain_tool_content
messages: list[dict[str, Any]] = [
{"role": "user", "content": "Hello"},
{"role": "assistant", "content": "Hi there!"},
]
assert _messages_contain_tool_content(messages) is False
def test_strip_tool_content_converts_assistant_tool_calls_to_text() -> None:
from onyx.llm.multi_llm import _strip_tool_content_from_messages
messages: list[dict[str, Any]] = [
{"role": "user", "content": "Search for cats"},
{
"role": "assistant",
"content": "Let me search.",
"tool_calls": [
{
"id": "tc_1",
"type": "function",
"function": {
"name": "search",
"arguments": '{"query": "cats"}',
},
}
],
},
{
"role": "tool",
"content": "Found 3 results about cats.",
"tool_call_id": "tc_1",
},
{"role": "assistant", "content": "Here are the results."},
]
result = _strip_tool_content_from_messages(messages)
assert len(result) == 4
# First message unchanged
assert result[0] == {"role": "user", "content": "Search for cats"}
# Assistant with tool calls → plain text
assert result[1]["role"] == "assistant"
assert "tool_calls" not in result[1]
assert "Let me search." in result[1]["content"]
assert "[Tool Call]" in result[1]["content"]
assert "search" in result[1]["content"]
assert "tc_1" in result[1]["content"]
# Tool response → user message
assert result[2]["role"] == "user"
assert "[Tool Result]" in result[2]["content"]
assert "tc_1" in result[2]["content"]
assert "Found 3 results about cats." in result[2]["content"]
# Final assistant message unchanged
assert result[3] == {"role": "assistant", "content": "Here are the results."}
def test_strip_tool_content_handles_assistant_with_no_text_content() -> None:
from onyx.llm.multi_llm import _strip_tool_content_from_messages
messages: list[dict[str, Any]] = [
{
"role": "assistant",
"content": None,
"tool_calls": [
{
"id": "tc_1",
"type": "function",
"function": {"name": "search", "arguments": "{}"},
}
],
},
]
result = _strip_tool_content_from_messages(messages)
assert result[0]["role"] == "assistant"
assert "[Tool Call]" in result[0]["content"]
assert "tool_calls" not in result[0]
def test_strip_tool_content_passes_through_non_tool_messages() -> None:
from onyx.llm.multi_llm import _strip_tool_content_from_messages
messages: list[dict[str, Any]] = [
{"role": "system", "content": "You are helpful."},
{"role": "user", "content": "Hello"},
{"role": "assistant", "content": "Hi!"},
]
result = _strip_tool_content_from_messages(messages)
assert result == messages
def test_strip_tool_content_handles_list_content_blocks() -> None:
from onyx.llm.multi_llm import _strip_tool_content_from_messages
messages: list[dict[str, Any]] = [
{
"role": "assistant",
"content": [{"type": "text", "text": "Searching now."}],
"tool_calls": [
{
"id": "tc_1",
"type": "function",
"function": {"name": "search", "arguments": "{}"},
}
],
},
{
"role": "tool",
"content": [
{"type": "text", "text": "result A"},
{"type": "text", "text": "result B"},
],
"tool_call_id": "tc_1",
},
]
result = _strip_tool_content_from_messages(messages)
# Assistant: list content flattened + tool call appended
assert result[0]["role"] == "assistant"
assert "Searching now." in result[0]["content"]
assert "[Tool Call]" in result[0]["content"]
assert isinstance(result[0]["content"], str)
# Tool: list content flattened into user message
assert result[1]["role"] == "user"
assert "result A" in result[1]["content"]
assert "result B" in result[1]["content"]
assert isinstance(result[1]["content"], str)
def test_strip_tool_content_merges_consecutive_tool_results() -> None:
"""Bedrock requires strict user/assistant alternation. Multiple parallel
tool results must be merged into a single user message."""
from onyx.llm.multi_llm import _strip_tool_content_from_messages
messages: list[dict[str, Any]] = [
{"role": "user", "content": "weather and news?"},
{
"role": "assistant",
"content": None,
"tool_calls": [
{
"id": "tc_1",
"type": "function",
"function": {"name": "search_weather", "arguments": "{}"},
},
{
"id": "tc_2",
"type": "function",
"function": {"name": "search_news", "arguments": "{}"},
},
],
},
{"role": "tool", "content": "sunny 72F", "tool_call_id": "tc_1"},
{"role": "tool", "content": "headline news", "tool_call_id": "tc_2"},
{"role": "assistant", "content": "Here are the results."},
]
result = _strip_tool_content_from_messages(messages)
# user, assistant (flattened), user (merged tool results), assistant
assert len(result) == 4
roles = [m["role"] for m in result]
assert roles == ["user", "assistant", "user", "assistant"]
# Both tool results merged into one user message
merged = result[2]["content"]
assert "tc_1" in merged
assert "sunny 72F" in merged
assert "tc_2" in merged
assert "headline news" in merged
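The conversion these tests describe (assistant tool calls flattened to bracketed text, tool results demoted to user messages, consecutive results merged so roles keep alternating) can be sketched as below. This is a simplified stand-in for `_strip_tool_content_from_messages`; the exact marker format inside the real function is an assumption, chosen here only to mirror the assertions above:

```python
def strip_tool_content(messages: list[dict]) -> list[dict]:
    """Rewrite tool-call traffic as plain text for models that reject
    tool messages: assistant tool_calls become '[Tool Call]' lines,
    tool results become user messages, and consecutive tool results
    merge into one user message to preserve user/assistant alternation."""
    out: list[dict] = []
    for msg in messages:
        content = msg.get("content")
        if isinstance(content, list):
            # Flatten content blocks like {"type": "text", "text": ...}
            content = " ".join(block.get("text", "") for block in content)
        if msg.get("role") == "assistant" and msg.get("tool_calls"):
            parts = [content] if content else []
            for tc in msg["tool_calls"]:
                fn = tc["function"]
                parts.append(
                    f"[Tool Call] id={tc['id']} name={fn['name']} "
                    f"args={fn['arguments']}"
                )
            out.append({"role": "assistant", "content": "\n".join(parts)})
        elif msg.get("role") == "tool":
            text = f"[Tool Result] id={msg.get('tool_call_id')}\n{content}"
            if out and out[-1]["role"] == "user" and "[Tool Result]" in out[-1]["content"]:
                out[-1]["content"] += "\n" + text  # merge parallel tool results
            else:
                out.append({"role": "user", "content": text})
        else:
            out.append({**msg, "content": content})
    return out
```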


@@ -10,8 +10,6 @@ from unittest.mock import patch
import pytest
from onyx.server.manage.llm.models import LMStudioFinalModelResponse
from onyx.server.manage.llm.models import LMStudioModelsRequest
from onyx.server.manage.llm.models import OllamaFinalModelResponse
from onyx.server.manage.llm.models import OllamaModelsRequest
from onyx.server.manage.llm.models import OpenRouterFinalModelResponse
@@ -319,298 +317,3 @@ class TestGetOpenRouterAvailableModels:
# No DB operations should happen
mock_session.execute.assert_not_called()
mock_session.commit.assert_not_called()
class TestGetLMStudioAvailableModels:
"""Tests for the LM Studio model fetch endpoint."""
@pytest.fixture
def mock_lm_studio_response(self) -> dict:
"""Mock response from LM Studio /api/v1/models endpoint."""
return {
"models": [
{
"key": "lmstudio-community/Meta-Llama-3-8B",
"type": "llm",
"display_name": "Meta Llama 3 8B",
"max_context_length": 8192,
"capabilities": {"vision": False},
},
{
"key": "lmstudio-community/Qwen2.5-VL-7B",
"type": "llm",
"display_name": "Qwen 2.5 VL 7B",
"max_context_length": 32768,
"capabilities": {"vision": True},
},
{
"key": "text-embedding-nomic-embed-text-v1.5",
"type": "embedding",
"display_name": "Nomic Embed Text v1.5",
"max_context_length": 2048,
"capabilities": {},
},
{
"key": "lmstudio-community/DeepSeek-R1-8B",
"type": "llm",
"display_name": "DeepSeek R1 8B",
"max_context_length": 65536,
"capabilities": {"vision": False},
},
]
}
def test_returns_model_list(self, mock_lm_studio_response: dict) -> None:
"""Test that endpoint returns properly formatted LLM-only model list."""
from onyx.server.manage.llm.api import get_lm_studio_available_models
mock_session = MagicMock()
with patch("onyx.server.manage.llm.api.httpx") as mock_httpx:
mock_response = MagicMock()
mock_response.json.return_value = mock_lm_studio_response
mock_response.raise_for_status = MagicMock()
mock_httpx.get.return_value = mock_response
request = LMStudioModelsRequest(api_base="http://localhost:1234")
results = get_lm_studio_available_models(request, MagicMock(), mock_session)
# Only LLM-type models should be returned (embedding filtered out)
assert len(results) == 3
assert all(isinstance(r, LMStudioFinalModelResponse) for r in results)
names = [r.name for r in results]
assert "text-embedding-nomic-embed-text-v1.5" not in names
# Results should be alphabetically sorted by model name
assert names == sorted(names, key=str.lower)
def test_infers_vision_support(self, mock_lm_studio_response: dict) -> None:
"""Test that vision support is correctly read from capabilities."""
from onyx.server.manage.llm.api import get_lm_studio_available_models
mock_session = MagicMock()
with patch("onyx.server.manage.llm.api.httpx") as mock_httpx:
mock_response = MagicMock()
mock_response.json.return_value = mock_lm_studio_response
mock_response.raise_for_status = MagicMock()
mock_httpx.get.return_value = mock_response
request = LMStudioModelsRequest(api_base="http://localhost:1234")
results = get_lm_studio_available_models(request, MagicMock(), mock_session)
qwen = next(r for r in results if "Qwen" in r.display_name)
llama = next(r for r in results if "Llama" in r.display_name)
assert qwen.supports_image_input is True
assert llama.supports_image_input is False
def test_infers_reasoning_from_model_name(self) -> None:
"""Test that reasoning is inferred from model name when not in capabilities."""
from onyx.server.manage.llm.api import get_lm_studio_available_models
mock_session = MagicMock()
response = {
"models": [
{
"key": "lmstudio-community/DeepSeek-R1-8B",
"type": "llm",
"display_name": "DeepSeek R1 8B",
"max_context_length": 65536,
"capabilities": {},
},
{
"key": "lmstudio-community/Meta-Llama-3-8B",
"type": "llm",
"display_name": "Meta Llama 3 8B",
"max_context_length": 8192,
"capabilities": {},
},
]
}
with patch("onyx.server.manage.llm.api.httpx") as mock_httpx:
mock_response = MagicMock()
mock_response.json.return_value = response
mock_response.raise_for_status = MagicMock()
mock_httpx.get.return_value = mock_response
request = LMStudioModelsRequest(api_base="http://localhost:1234")
results = get_lm_studio_available_models(request, MagicMock(), mock_session)
deepseek = next(r for r in results if "DeepSeek" in r.display_name)
llama = next(r for r in results if "Llama" in r.display_name)
assert deepseek.supports_reasoning is True
assert llama.supports_reasoning is False
def test_uses_display_name_from_api(self, mock_lm_studio_response: dict) -> None:
"""Test that display_name from the API is used directly."""
from onyx.server.manage.llm.api import get_lm_studio_available_models
mock_session = MagicMock()
with patch("onyx.server.manage.llm.api.httpx") as mock_httpx:
mock_response = MagicMock()
mock_response.json.return_value = mock_lm_studio_response
mock_response.raise_for_status = MagicMock()
mock_httpx.get.return_value = mock_response
request = LMStudioModelsRequest(api_base="http://localhost:1234")
results = get_lm_studio_available_models(request, MagicMock(), mock_session)
llama = next(r for r in results if "Llama" in r.name)
assert llama.display_name == "Meta Llama 3 8B"
assert llama.max_input_tokens == 8192
def test_strips_trailing_v1_from_api_base(self) -> None:
"""Test that /v1 suffix is stripped before building the native API URL."""
from onyx.server.manage.llm.api import get_lm_studio_available_models
mock_session = MagicMock()
response = {
"models": [
{
"key": "test-model",
"type": "llm",
"display_name": "Test",
"max_context_length": 4096,
"capabilities": {},
},
]
}
with patch("onyx.server.manage.llm.api.httpx") as mock_httpx:
mock_response = MagicMock()
mock_response.json.return_value = response
mock_response.raise_for_status = MagicMock()
mock_httpx.get.return_value = mock_response
request = LMStudioModelsRequest(api_base="http://localhost:1234/v1")
get_lm_studio_available_models(request, MagicMock(), mock_session)
# Should hit /api/v1/models, not /v1/api/v1/models
mock_httpx.get.assert_called_once()
called_url = mock_httpx.get.call_args[0][0]
assert called_url == "http://localhost:1234/api/v1/models"
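The URL normalization asserted above can be sketched as follows; the helper name is hypothetical and this is only one way to implement the stripping the test expects:

```python
def lm_studio_native_url(api_base: str) -> str:
    """Build the LM Studio native models URL, stripping a trailing /v1 if present."""
    base = api_base.rstrip("/")
    if base.endswith("/v1"):
        base = base[: -len("/v1")]
    return base + "/api/v1/models"


print(lm_studio_native_url("http://localhost:1234/v1"))  # http://localhost:1234/api/v1/models
print(lm_studio_native_url("http://localhost:1234"))     # http://localhost:1234/api/v1/models
```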
def test_falls_back_to_stored_api_key(self) -> None:
"""Test that stored API key is used when api_key_changed is False."""
from onyx.server.manage.llm.api import get_lm_studio_available_models
mock_session = MagicMock()
mock_provider = MagicMock()
mock_provider.custom_config = {"LM_STUDIO_API_KEY": "stored-secret"}
response = {
"models": [
{
"key": "test-model",
"type": "llm",
"display_name": "Test",
"max_context_length": 4096,
"capabilities": {},
},
]
}
with (
patch("onyx.server.manage.llm.api.httpx") as mock_httpx,
patch(
"onyx.server.manage.llm.api.fetch_existing_llm_provider",
return_value=mock_provider,
),
):
mock_response = MagicMock()
mock_response.json.return_value = response
mock_response.raise_for_status = MagicMock()
mock_httpx.get.return_value = mock_response
request = LMStudioModelsRequest(
api_base="http://localhost:1234",
api_key="masked-value",
api_key_changed=False,
provider_name="my-lm-studio",
)
get_lm_studio_available_models(request, MagicMock(), mock_session)
headers = mock_httpx.get.call_args[1]["headers"]
assert headers["Authorization"] == "Bearer stored-secret"
def test_uses_submitted_api_key_when_changed(self) -> None:
"""Test that submitted API key is used when api_key_changed is True."""
from onyx.server.manage.llm.api import get_lm_studio_available_models
mock_session = MagicMock()
response = {
"models": [
{
"key": "test-model",
"type": "llm",
"display_name": "Test",
"max_context_length": 4096,
"capabilities": {},
},
]
}
with patch("onyx.server.manage.llm.api.httpx") as mock_httpx:
mock_response = MagicMock()
mock_response.json.return_value = response
mock_response.raise_for_status = MagicMock()
mock_httpx.get.return_value = mock_response
request = LMStudioModelsRequest(
api_base="http://localhost:1234",
api_key="new-secret",
api_key_changed=True,
provider_name="my-lm-studio",
)
get_lm_studio_available_models(request, MagicMock(), mock_session)
headers = mock_httpx.get.call_args[1]["headers"]
assert headers["Authorization"] == "Bearer new-secret"
def test_raises_on_empty_models(self) -> None:
"""Test that an error is raised when no models are returned."""
from onyx.error_handling.exceptions import OnyxError
from onyx.server.manage.llm.api import get_lm_studio_available_models
mock_session = MagicMock()
with patch("onyx.server.manage.llm.api.httpx") as mock_httpx:
mock_response = MagicMock()
mock_response.json.return_value = {"models": []}
mock_response.raise_for_status = MagicMock()
mock_httpx.get.return_value = mock_response
request = LMStudioModelsRequest(api_base="http://localhost:1234")
with pytest.raises(OnyxError):
get_lm_studio_available_models(request, MagicMock(), mock_session)
def test_raises_on_only_non_llm_models(self) -> None:
"""Test that an error is raised when all models are non-LLM type."""
from onyx.error_handling.exceptions import OnyxError
from onyx.server.manage.llm.api import get_lm_studio_available_models
mock_session = MagicMock()
response = {
"models": [
{
"key": "embedding-model",
"type": "embedding",
"display_name": "Embedding",
"max_context_length": 2048,
"capabilities": {},
},
]
}
with patch("onyx.server.manage.llm.api.httpx") as mock_httpx:
mock_response = MagicMock()
mock_response.json.return_value = response
mock_response.raise_for_status = MagicMock()
mock_httpx.get.return_value = mock_response
request = LMStudioModelsRequest(api_base="http://localhost:1234")
with pytest.raises(OnyxError):
get_lm_studio_available_models(request, MagicMock(), mock_session)


@@ -1,7 +1,6 @@
from __future__ import annotations
import json
import unicodedata # used to verify NFC expansion test preconditions
from pathlib import Path
import pytest
@@ -159,47 +158,3 @@ def test_snippet_finding(test_data: TestSchema) -> None:
f"end_idx mismatch: expected {test_data.expected_result.expected_end_idx}, "
f"got {result.end_idx}"
)
# Characters confirmed to expand from 1 → 2 codepoints under NFC
NFC_EXPANDING_CHARS = [
("\u0958", "Devanagari letter qa"),
("\u0959", "Devanagari letter khha"),
("\u095a", "Devanagari letter ghha"),
]
@pytest.mark.parametrize(
"char,description",
NFC_EXPANDING_CHARS,
)
def test_nfc_expanding_char_snippet_match(char: str, description: str) -> None:
"""Snippet matching should produce valid indices for content
containing characters that expand under NFC normalization."""
nfc = unicodedata.normalize("NFC", char)
if len(nfc) <= 1:
pytest.skip(f"{description} does not expand under NFC on this platform")
content = f"before {char} after"
snippet = f"{char} after"
result = find_snippet_in_content(content, snippet)
assert result.snippet_located, f"[{description}] Snippet should be found in content"
assert (
0 <= result.start_idx < len(content)
), f"[{description}] start_idx {result.start_idx} out of bounds"
assert (
0 <= result.end_idx < len(content)
), f"[{description}] end_idx {result.end_idx} out of bounds"
assert (
result.start_idx <= result.end_idx
), f"[{description}] start_idx {result.start_idx} > end_idx {result.end_idx}"
matched = content[result.start_idx : result.end_idx + 1]
matched_nfc = unicodedata.normalize("NFC", matched)
snippet_nfc = unicodedata.normalize("NFC", snippet)
assert snippet_nfc in matched_nfc or matched_nfc in snippet_nfc, (
f"[{description}] Matched span '{matched}' does not overlap "
f"with expected snippet '{snippet}'"
)
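The deleted test relies on a Unicode quirk worth noting: a handful of precomposed Devanagari letters are composition exclusions, so NFC normalization re-expands them into a base letter plus a combining nukta, making the string longer. A minimal standalone check:

```python
import unicodedata

# U+0958 (DEVANAGARI LETTER QA) is a composition exclusion: NFC decomposes it
# into U+0915 (KA) + U+093C (NUKTA), growing the string from 1 to 2 codepoints.
char = "\u0958"
nfc = unicodedata.normalize("NFC", char)
print(len(char), len(nfc))  # 1 2
assert nfc == "\u0915\u093c"
```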

cli/.gitignore vendored

@@ -1,3 +0,0 @@
onyx-cli
cli
onyx.cli


@@ -1,118 +0,0 @@
# Onyx CLI
A terminal interface for chatting with your [Onyx](https://github.com/onyx-dot-app/onyx) agent. Built with Go using [Bubble Tea](https://github.com/charmbracelet/bubbletea) for the TUI framework.
## Installation
```shell
pip install onyx-cli
```
Or with uv:
```shell
uv pip install onyx-cli
```
## Setup
Run the interactive setup:
```shell
onyx-cli configure
```
This prompts for your Onyx server URL and API key, tests the connection, and saves config to `~/.config/onyx-cli/config.json`.
Environment variables override config file values:
| Variable | Required | Description |
|----------|----------|-------------|
| `ONYX_SERVER_URL` | No | Server base URL (default: `http://localhost:3000`) |
| `ONYX_API_KEY` | Yes | API key for authentication |
| `ONYX_PERSONA_ID` | No | Default agent/persona ID |
## Usage
### Interactive chat (default)
```shell
onyx-cli
```
### One-shot question
```shell
onyx-cli ask "What is our company's PTO policy?"
onyx-cli ask --agent-id 5 "Summarize this topic"
onyx-cli ask --json "Hello"
```
| Flag | Description |
|------|-------------|
| `--agent-id <int>` | Agent ID to use (overrides default) |
| `--json` | Output raw NDJSON events instead of plain text |
### List agents
```shell
onyx-cli agents
onyx-cli agents --json
```
## Commands
| Command | Description |
|---------|-------------|
| `chat` | Launch the interactive chat TUI (default) |
| `ask` | Ask a one-shot question (non-interactive) |
| `agents` | List available agents |
| `configure` | Configure server URL and API key |
## Slash Commands (in TUI)
| Command | Description |
|---------|-------------|
| `/help` | Show help message |
| `/new` | Start a new chat session |
| `/agent` | List and switch agents |
| `/attach <path>` | Attach a file to next message |
| `/sessions` | List recent chat sessions |
| `/clear` | Clear the chat display |
| `/configure` | Re-run connection setup |
| `/connectors` | Open connectors in browser |
| `/settings` | Open settings in browser |
| `/quit` | Exit Onyx CLI |
## Keyboard Shortcuts
| Key | Action |
|-----|--------|
| `Enter` | Send message |
| `Escape` | Cancel current generation |
| `Ctrl+O` | Toggle source citations |
| `Ctrl+D` | Quit (press twice) |
| `Scroll` / `Shift+Up/Down` | Scroll chat history |
| `Page Up` / `Page Down` | Scroll half page |
## Building from Source
Requires [Go 1.24+](https://go.dev/dl/).
```shell
cd cli
go build -o onyx-cli .
```
## Development
```shell
# Run tests
go test ./...
# Build
go build -o onyx-cli .
# Lint
staticcheck ./...
```


@@ -1,63 +0,0 @@
package cmd
import (
"encoding/json"
"fmt"
"text/tabwriter"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/spf13/cobra"
)
func newAgentsCmd() *cobra.Command {
var agentsJSON bool
cmd := &cobra.Command{
Use: "agents",
Short: "List available agents",
RunE: func(cmd *cobra.Command, args []string) error {
cfg := config.Load()
if !cfg.IsConfigured() {
return fmt.Errorf("onyx CLI is not configured — run 'onyx-cli configure' first")
}
client := api.NewClient(cfg)
agents, err := client.ListAgents(cmd.Context())
if err != nil {
return fmt.Errorf("failed to list agents: %w", err)
}
if agentsJSON {
data, err := json.MarshalIndent(agents, "", " ")
if err != nil {
return fmt.Errorf("failed to marshal agents: %w", err)
}
fmt.Println(string(data))
return nil
}
if len(agents) == 0 {
fmt.Println("No agents available.")
return nil
}
w := tabwriter.NewWriter(cmd.OutOrStdout(), 0, 4, 2, ' ', 0)
_, _ = fmt.Fprintln(w, "ID\tNAME\tDESCRIPTION")
for _, a := range agents {
desc := a.Description
if len(desc) > 60 {
desc = desc[:57] + "..."
}
_, _ = fmt.Fprintf(w, "%d\t%s\t%s\n", a.ID, a.Name, desc)
}
_ = w.Flush()
return nil
},
}
cmd.Flags().BoolVar(&agentsJSON, "json", false, "Output agents as JSON")
return cmd
}


@@ -1,124 +0,0 @@
package cmd
import (
"context"
"encoding/json"
"fmt"
"os"
"os/signal"
"syscall"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/models"
"github.com/spf13/cobra"
)
func newAskCmd() *cobra.Command {
var (
askAgentID int
askJSON bool
)
cmd := &cobra.Command{
Use: "ask [question]",
Short: "Ask a one-shot question (non-interactive)",
Args: cobra.ExactArgs(1),
RunE: func(cmd *cobra.Command, args []string) error {
cfg := config.Load()
if !cfg.IsConfigured() {
return fmt.Errorf("onyx CLI is not configured — run 'onyx-cli configure' first")
}
question := args[0]
agentID := cfg.DefaultAgentID
if cmd.Flags().Changed("agent-id") {
agentID = askAgentID
}
ctx, stop := signal.NotifyContext(cmd.Context(), os.Interrupt, syscall.SIGTERM)
defer stop()
client := api.NewClient(cfg)
parentID := -1
ch := client.SendMessageStream(
ctx,
question,
nil,
agentID,
&parentID,
nil,
)
var sessionID string
var lastErr error
gotStop := false
for event := range ch {
if e, ok := event.(models.SessionCreatedEvent); ok {
sessionID = e.ChatSessionID
}
if askJSON {
wrapped := struct {
Type string `json:"type"`
Event models.StreamEvent `json:"event"`
}{
Type: event.EventType(),
Event: event,
}
data, err := json.Marshal(wrapped)
if err != nil {
return fmt.Errorf("error marshaling event: %w", err)
}
fmt.Println(string(data))
if errEvt, ok := event.(models.ErrorEvent); ok {
lastErr = fmt.Errorf("%s", errEvt.Error)
}
if _, ok := event.(models.StopEvent); ok {
gotStop = true
}
continue
}
switch e := event.(type) {
case models.MessageDeltaEvent:
fmt.Print(e.Content)
case models.ErrorEvent:
return fmt.Errorf("%s", e.Error)
case models.StopEvent:
fmt.Println()
return nil
}
}
if ctx.Err() != nil {
if sessionID != "" {
client.StopChatSession(context.Background(), sessionID)
}
if !askJSON {
fmt.Println()
}
return nil
}
if lastErr != nil {
return lastErr
}
if !gotStop {
if !askJSON {
fmt.Println()
}
return fmt.Errorf("stream ended unexpectedly")
}
if !askJSON {
fmt.Println()
}
return nil
},
}
cmd.Flags().IntVar(&askAgentID, "agent-id", 0, "Agent ID to use")
cmd.Flags().BoolVar(&askJSON, "json", false, "Output raw JSON events")
// Suppress cobra's usage dump when RunE returns an error
cmd.SilenceUsage = true
return cmd
}


@@ -1,33 +0,0 @@
package cmd
import (
tea "github.com/charmbracelet/bubbletea"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/onboarding"
"github.com/onyx-dot-app/onyx/cli/internal/tui"
"github.com/spf13/cobra"
)
func newChatCmd() *cobra.Command {
return &cobra.Command{
Use: "chat",
Short: "Launch the interactive chat TUI (default)",
RunE: func(cmd *cobra.Command, args []string) error {
cfg := config.Load()
// First-run: onboarding
if !config.ConfigExists() || !cfg.IsConfigured() {
result := onboarding.Run(&cfg)
if result == nil {
return nil
}
cfg = *result
}
m := tui.NewModel(cfg)
p := tea.NewProgram(m, tea.WithAltScreen(), tea.WithMouseCellMotion())
_, err := p.Run()
return err
},
}
}


@@ -1,19 +0,0 @@
package cmd
import (
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/onboarding"
"github.com/spf13/cobra"
)
func newConfigureCmd() *cobra.Command {
return &cobra.Command{
Use: "configure",
Short: "Configure server URL and API key",
RunE: func(cmd *cobra.Command, args []string) error {
cfg := config.Load()
onboarding.Run(&cfg)
return nil
},
}
}


@@ -1,40 +0,0 @@
// Package cmd implements Cobra CLI commands for the Onyx CLI.
package cmd
import "github.com/spf13/cobra"
// Version and Commit are set via ldflags at build time.
var (
Version string
Commit string
)
func fullVersion() string {
if Commit != "" && Commit != "none" && len(Commit) >= 7 {
return Version + " (" + Commit[:7] + ")"
}
return Version
}
// Execute creates and runs the root command.
func Execute() error {
rootCmd := &cobra.Command{
Use: "onyx-cli",
Short: "Terminal UI for chatting with Onyx",
Long: "Onyx CLI — a terminal interface for chatting with your Onyx agent.",
Version: fullVersion(),
}
// Register subcommands
chatCmd := newChatCmd()
rootCmd.AddCommand(chatCmd)
rootCmd.AddCommand(newAskCmd())
rootCmd.AddCommand(newAgentsCmd())
rootCmd.AddCommand(newConfigureCmd())
rootCmd.AddCommand(newValidateConfigCmd())
// Default command is chat
rootCmd.RunE = chatCmd.RunE
return rootCmd.Execute()
}


@@ -1,41 +0,0 @@
package cmd
import (
"fmt"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/spf13/cobra"
)
func newValidateConfigCmd() *cobra.Command {
return &cobra.Command{
Use: "validate-config",
Short: "Validate configuration and test server connection",
RunE: func(cmd *cobra.Command, args []string) error {
// Check config file
if !config.ConfigExists() {
return fmt.Errorf("config file not found at %s\n Run 'onyx-cli configure' to set up", config.ConfigFilePath())
}
cfg := config.Load()
// Check API key
if !cfg.IsConfigured() {
return fmt.Errorf("API key is missing\n Run 'onyx-cli configure' to set up")
}
_, _ = fmt.Fprintf(cmd.OutOrStdout(), "Config: %s\n", config.ConfigFilePath())
_, _ = fmt.Fprintf(cmd.OutOrStdout(), "Server: %s\n", cfg.ServerURL)
// Test connection
client := api.NewClient(cfg)
if err := client.TestConnection(cmd.Context()); err != nil {
return fmt.Errorf("connection failed: %w", err)
}
_, _ = fmt.Fprintln(cmd.OutOrStdout(), "Status: connected and authenticated")
return nil
},
}
}


@@ -1,45 +0,0 @@
module github.com/onyx-dot-app/onyx/cli
go 1.26.0
require (
github.com/charmbracelet/bubbles v0.20.0
github.com/charmbracelet/bubbletea v1.3.4
github.com/charmbracelet/glamour v0.8.0
github.com/charmbracelet/lipgloss v1.1.0
github.com/spf13/cobra v1.9.1
golang.org/x/term v0.22.0
golang.org/x/text v0.34.0
)
require (
github.com/alecthomas/chroma/v2 v2.14.0 // indirect
github.com/atotto/clipboard v0.1.4 // indirect
github.com/aymanbagabas/go-osc52/v2 v2.0.1 // indirect
github.com/aymerick/douceur v0.2.0 // indirect
github.com/charmbracelet/colorprofile v0.2.3-0.20250311203215-f60798e515dc // indirect
github.com/charmbracelet/x/ansi v0.8.0 // indirect
github.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd // indirect
github.com/charmbracelet/x/term v0.2.1 // indirect
github.com/dlclark/regexp2 v1.11.0 // indirect
github.com/erikgeiser/coninput v0.0.0-20211004153227-1c3628e74d0f // indirect
github.com/gorilla/css v1.0.1 // indirect
github.com/inconshreveable/mousetrap v1.1.0 // indirect
github.com/lucasb-eyer/go-colorful v1.2.0 // indirect
github.com/mattn/go-isatty v0.0.20 // indirect
github.com/mattn/go-localereader v0.0.1 // indirect
github.com/mattn/go-runewidth v0.0.16 // indirect
github.com/microcosm-cc/bluemonday v1.0.27 // indirect
github.com/muesli/ansi v0.0.0-20230316100256-276c6243b2f6 // indirect
github.com/muesli/cancelreader v0.2.2 // indirect
github.com/muesli/reflow v0.3.0 // indirect
github.com/muesli/termenv v0.16.0 // indirect
github.com/rivo/uniseg v0.4.7 // indirect
github.com/spf13/pflag v1.0.6 // indirect
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e // indirect
github.com/yuin/goldmark v1.7.4 // indirect
github.com/yuin/goldmark-emoji v1.0.3 // indirect
golang.org/x/net v0.27.0 // indirect
golang.org/x/sync v0.19.0 // indirect
golang.org/x/sys v0.30.0 // indirect
)


@@ -1,94 +0,0 @@
github.com/alecthomas/assert/v2 v2.7.0 h1:QtqSACNS3tF7oasA8CU6A6sXZSBDqnm7RfpLl9bZqbE=
github.com/alecthomas/assert/v2 v2.7.0/go.mod h1:Bze95FyfUr7x34QZrjL+XP+0qgp/zg8yS+TtBj1WA3k=
github.com/alecthomas/chroma/v2 v2.14.0 h1:R3+wzpnUArGcQz7fCETQBzO5n9IMNi13iIs46aU4V9E=
github.com/alecthomas/chroma/v2 v2.14.0/go.mod h1:QolEbTfmUHIMVpBqxeDnNBj2uoeI4EbYP4i6n68SG4I=
github.com/alecthomas/repr v0.4.0 h1:GhI2A8MACjfegCPVq9f1FLvIBS+DrQ2KQBFZP1iFzXc=
github.com/alecthomas/repr v0.4.0/go.mod h1:Fr0507jx4eOXV7AlPV6AVZLYrLIuIeSOWtW57eE/O/4=
github.com/atotto/clipboard v0.1.4 h1:EH0zSVneZPSuFR11BlR9YppQTVDbh5+16AmcJi4g1z4=
github.com/atotto/clipboard v0.1.4/go.mod h1:ZY9tmq7sm5xIbd9bOK4onWV4S6X0u6GY7Vn0Yu86PYI=
github.com/aymanbagabas/go-osc52/v2 v2.0.1 h1:HwpRHbFMcZLEVr42D4p7XBqjyuxQH5SMiErDT4WkJ2k=
github.com/aymanbagabas/go-osc52/v2 v2.0.1/go.mod h1:uYgXzlJ7ZpABp8OJ+exZzJJhRNQ2ASbcXHWsFqH8hp8=
github.com/aymanbagabas/go-udiff v0.2.0 h1:TK0fH4MteXUDspT88n8CKzvK0X9O2xu9yQjWpi6yML8=
github.com/aymanbagabas/go-udiff v0.2.0/go.mod h1:RE4Ex0qsGkTAJoQdQQCA0uG+nAzJO/pI/QwceO5fgrA=
github.com/aymerick/douceur v0.2.0 h1:Mv+mAeH1Q+n9Fr+oyamOlAkUNPWPlA8PPGR0QAaYuPk=
github.com/aymerick/douceur v0.2.0/go.mod h1:wlT5vV2O3h55X9m7iVYN0TBM0NH/MmbLnd30/FjWUq4=
github.com/charmbracelet/bubbles v0.20.0 h1:jSZu6qD8cRQ6k9OMfR1WlM+ruM8fkPWkHvQWD9LIutE=
github.com/charmbracelet/bubbles v0.20.0/go.mod h1:39slydyswPy+uVOHZ5x/GjwVAFkCsV8IIVy+4MhzwwU=
github.com/charmbracelet/bubbletea v1.3.4 h1:kCg7B+jSCFPLYRA52SDZjr51kG/fMUEoPoZrkaDHyoI=
github.com/charmbracelet/bubbletea v1.3.4/go.mod h1:dtcUCyCGEX3g9tosuYiut3MXgY/Jsv9nKVdibKKRRXo=
github.com/charmbracelet/colorprofile v0.2.3-0.20250311203215-f60798e515dc h1:4pZI35227imm7yK2bGPcfpFEmuY1gc2YSTShr4iJBfs=
github.com/charmbracelet/colorprofile v0.2.3-0.20250311203215-f60798e515dc/go.mod h1:X4/0JoqgTIPSFcRA/P6INZzIuyqdFY5rm8tb41s9okk=
github.com/charmbracelet/glamour v0.8.0 h1:tPrjL3aRcQbn++7t18wOpgLyl8wrOHUEDS7IZ68QtZs=
github.com/charmbracelet/glamour v0.8.0/go.mod h1:ViRgmKkf3u5S7uakt2czJ272WSg2ZenlYEZXT2x7Bjw=
github.com/charmbracelet/lipgloss v1.1.0 h1:vYXsiLHVkK7fp74RkV7b2kq9+zDLoEU4MZoFqR/noCY=
github.com/charmbracelet/lipgloss v1.1.0/go.mod h1:/6Q8FR2o+kj8rz4Dq0zQc3vYf7X+B0binUUBwA0aL30=
github.com/charmbracelet/x/ansi v0.8.0 h1:9GTq3xq9caJW8ZrBTe0LIe2fvfLR/bYXKTx2llXn7xE=
github.com/charmbracelet/x/ansi v0.8.0/go.mod h1:wdYl/ONOLHLIVmQaxbIYEC/cRKOQyjTkowiI4blgS9Q=
github.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd h1:vy0GVL4jeHEwG5YOXDmi86oYw2yuYUGqz6a8sLwg0X8=
github.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd/go.mod h1:xe0nKWGd3eJgtqZRaN9RjMtK7xUYchjzPr7q6kcvCCs=
github.com/charmbracelet/x/exp/golden v0.0.0-20240815200342-61de596daa2b h1:MnAMdlwSltxJyULnrYbkZpp4k58Co7Tah3ciKhSNo0Q=
github.com/charmbracelet/x/exp/golden v0.0.0-20240815200342-61de596daa2b/go.mod h1:wDlXFlCrmJ8J+swcL/MnGUuYnqgQdW9rhSD61oNMb6U=
github.com/charmbracelet/x/term v0.2.1 h1:AQeHeLZ1OqSXhrAWpYUtZyX1T3zVxfpZuEQMIQaGIAQ=
github.com/charmbracelet/x/term v0.2.1/go.mod h1:oQ4enTYFV7QN4m0i9mzHrViD7TQKvNEEkHUMCmsxdUg=
github.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=
github.com/dlclark/regexp2 v1.11.0 h1:G/nrcoOa7ZXlpoa/91N3X7mM3r8eIlMBBJZvsz/mxKI=
github.com/dlclark/regexp2 v1.11.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/erikgeiser/coninput v0.0.0-20211004153227-1c3628e74d0f h1:Y/CXytFA4m6baUTXGLOoWe4PQhGxaX0KpnayAqC48p4=
github.com/erikgeiser/coninput v0.0.0-20211004153227-1c3628e74d0f/go.mod h1:vw97MGsxSvLiUE2X8qFplwetxpGLQrlU1Q9AUEIzCaM=
github.com/gorilla/css v1.0.1 h1:ntNaBIghp6JmvWnxbZKANoLyuXTPZ4cAMlo6RyhlbO8=
github.com/gorilla/css v1.0.1/go.mod h1:BvnYkspnSzMmwRK+b8/xgNPLiIuNZr6vbZBTPQ2A3b0=
github.com/hexops/gotextdiff v1.0.3 h1:gitA9+qJrrTCsiCl7+kh75nPqQt1cx4ZkudSTLoUqJM=
github.com/hexops/gotextdiff v1.0.3/go.mod h1:pSWU5MAI3yDq+fZBTazCSJysOMbxWL1BSow5/V2vxeg=
github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8=
github.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=
github.com/lucasb-eyer/go-colorful v1.2.0 h1:1nnpGOrhyZZuNyfu1QjKiUICQ74+3FNCN69Aj6K7nkY=
github.com/lucasb-eyer/go-colorful v1.2.0/go.mod h1:R4dSotOR9KMtayYi1e77YzuveK+i7ruzyGqttikkLy0=
github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
github.com/mattn/go-localereader v0.0.1 h1:ygSAOl7ZXTx4RdPYinUpg6W99U8jWvWi9Ye2JC/oIi4=
github.com/mattn/go-localereader v0.0.1/go.mod h1:8fBrzywKY7BI3czFoHkuzRoWE9C+EiG4R1k4Cjx5p88=
github.com/mattn/go-runewidth v0.0.12/go.mod h1:RAqKPSqVFrSLVXbA8x7dzmKdmGzieGRCM46jaSJTDAk=
github.com/mattn/go-runewidth v0.0.16 h1:E5ScNMtiwvlvB5paMFdw9p4kSQzbXFikJ5SQO6TULQc=
github.com/mattn/go-runewidth v0.0.16/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
github.com/microcosm-cc/bluemonday v1.0.27 h1:MpEUotklkwCSLeH+Qdx1VJgNqLlpY2KXwXFM08ygZfk=
github.com/microcosm-cc/bluemonday v1.0.27/go.mod h1:jFi9vgW+H7c3V0lb6nR74Ib/DIB5OBs92Dimizgw2cA=
github.com/muesli/ansi v0.0.0-20230316100256-276c6243b2f6 h1:ZK8zHtRHOkbHy6Mmr5D264iyp3TiX5OmNcI5cIARiQI=
github.com/muesli/ansi v0.0.0-20230316100256-276c6243b2f6/go.mod h1:CJlz5H+gyd6CUWT45Oy4q24RdLyn7Md9Vj2/ldJBSIo=
github.com/muesli/cancelreader v0.2.2 h1:3I4Kt4BQjOR54NavqnDogx/MIoWBFa0StPA8ELUXHmA=
github.com/muesli/cancelreader v0.2.2/go.mod h1:3XuTXfFS2VjM+HTLZY9Ak0l6eUKfijIfMUZ4EgX0QYo=
github.com/muesli/reflow v0.3.0 h1:IFsN6K9NfGtjeggFP+68I4chLZV2yIKsXJFNZ+eWh6s=
github.com/muesli/reflow v0.3.0/go.mod h1:pbwTDkVPibjO2kyvBQRBxTWEEGDGq0FlB1BIKtnHY/8=
github.com/muesli/termenv v0.16.0 h1:S5AlUN9dENB57rsbnkPyfdGuWIlkmzJjbFf0Tf5FWUc=
github.com/muesli/termenv v0.16.0/go.mod h1:ZRfOIKPFDYQoDFF4Olj7/QJbW60Ol/kL1pU3VfY/Cnk=
github.com/rivo/uniseg v0.1.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
github.com/rivo/uniseg v0.4.7 h1:WUdvkW8uEhrYfLC4ZzdpI2ztxP1I582+49Oc5Mq64VQ=
github.com/rivo/uniseg v0.4.7/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
github.com/spf13/cobra v1.9.1 h1:CXSaggrXdbHK9CF+8ywj8Amf7PBRmPCOJugH954Nnlo=
github.com/spf13/cobra v1.9.1/go.mod h1:nDyEzZ8ogv936Cinf6g1RU9MRY64Ir93oCnqb9wxYW0=
github.com/spf13/pflag v1.0.6 h1:jFzHGLGAlb3ruxLB8MhbI6A8+AQX/2eW4qeyNZXNp2o=
github.com/spf13/pflag v1.0.6/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e h1:JVG44RsyaB9T2KIHavMF/ppJZNG9ZpyihvCd0w101no=
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e/go.mod h1:RbqR21r5mrJuqunuUZ/Dhy/avygyECGrLceyNeo4LiM=
github.com/yuin/goldmark v1.7.1/go.mod h1:uzxRWxtg69N339t3louHJ7+O03ezfj6PlliRlaOzY1E=
github.com/yuin/goldmark v1.7.4 h1:BDXOHExt+A7gwPCJgPIIq7ENvceR7we7rOS9TNoLZeg=
github.com/yuin/goldmark v1.7.4/go.mod h1:uzxRWxtg69N339t3louHJ7+O03ezfj6PlliRlaOzY1E=
github.com/yuin/goldmark-emoji v1.0.3 h1:aLRkLHOuBR2czCY4R8olwMjID+tENfhyFDMCRhbIQY4=
github.com/yuin/goldmark-emoji v1.0.3/go.mod h1:tTkZEbwu5wkPmgTcitqddVxY9osFZiavD+r4AzQrh1U=
golang.org/x/exp v0.0.0-20220909182711-5c715a9e8561 h1:MDc5xs78ZrZr3HMQugiXOAkSZtfTpbJLDr/lwfgO53E=
golang.org/x/exp v0.0.0-20220909182711-5c715a9e8561/go.mod h1:cyybsKvd6eL0RnXn6p/Grxp8F5bW7iYuBgsNCOHpMYE=
golang.org/x/net v0.27.0 h1:5K3Njcw06/l2y9vpGCSdcxWOYHOUk3dVNGDXN+FvAys=
golang.org/x/net v0.27.0/go.mod h1:dDi0PyhWNoiUOrAS8uXv/vnScO4wnHQO4mj9fn/RytE=
golang.org/x/sync v0.19.0 h1:vV+1eWNmZ5geRlYjzm2adRgW2/mcpevXNg50YZtPCE4=
golang.org/x/sync v0.19.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.0.0-20210809222454-d867a43fc93e/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.30.0 h1:QjkSwP/36a20jFYWkSue1YwXzLmsV5Gfq7Eiy72C1uc=
golang.org/x/sys v0.30.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/term v0.22.0 h1:BbsgPEJULsl2fV/AT3v15Mjva5yXKQDyKf+TbDz7QJk=
golang.org/x/term v0.22.0/go.mod h1:F3qCibpT5AMpCRfhfT53vVJwhLtIVHhB9XDjfFvnMI4=
golang.org/x/text v0.34.0 h1:oL/Qq0Kdaqxa1KbNeMKwQq0reLCCaFtqu2eNuSeNHbk=
golang.org/x/text v0.34.0/go.mod h1:homfLqTYRFyVYemLBFl5GgL/DWEiH5wcsQ5gSh1yziA=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=


@@ -1,284 +0,0 @@
// Package api provides the HTTP client for communicating with the Onyx server.
package api
import (
"bytes"
"context"
"encoding/json"
"fmt"
"io"
"mime/multipart"
"net/http"
"os"
"path/filepath"
"strings"
"time"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/models"
)
// Client is the Onyx API client.
type Client struct {
baseURL string
apiKey string
httpClient *http.Client // default 30s timeout for quick requests
longHTTPClient *http.Client // 5min timeout for streaming/uploads
}
// NewClient creates a new API client from config.
func NewClient(cfg config.OnyxCliConfig) *Client {
var transport *http.Transport
if t, ok := http.DefaultTransport.(*http.Transport); ok {
transport = t.Clone()
} else {
transport = &http.Transport{}
}
return &Client{
baseURL: strings.TrimRight(cfg.ServerURL, "/"),
apiKey: cfg.APIKey,
httpClient: &http.Client{
Timeout: 30 * time.Second,
Transport: transport,
},
longHTTPClient: &http.Client{
Timeout: 5 * time.Minute,
Transport: transport,
},
}
}
// UpdateConfig replaces the client's config.
func (c *Client) UpdateConfig(cfg config.OnyxCliConfig) {
c.baseURL = strings.TrimRight(cfg.ServerURL, "/")
c.apiKey = cfg.APIKey
}
func (c *Client) newRequest(ctx context.Context, method, path string, body io.Reader) (*http.Request, error) {
req, err := http.NewRequestWithContext(ctx, method, c.baseURL+path, body)
if err != nil {
return nil, err
}
if c.apiKey != "" {
bearer := "Bearer " + c.apiKey
req.Header.Set("Authorization", bearer)
req.Header.Set("X-Onyx-Authorization", bearer)
}
return req, nil
}
func (c *Client) doJSON(ctx context.Context, method, path string, reqBody any, result any) error {
var body io.Reader
if reqBody != nil {
data, err := json.Marshal(reqBody)
if err != nil {
return err
}
body = bytes.NewReader(data)
}
req, err := c.newRequest(ctx, method, path, body)
if err != nil {
return err
}
if reqBody != nil {
req.Header.Set("Content-Type", "application/json")
}
resp, err := c.httpClient.Do(req)
if err != nil {
return err
}
defer func() { _ = resp.Body.Close() }()
if resp.StatusCode < 200 || resp.StatusCode >= 300 {
respBody, _ := io.ReadAll(resp.Body)
return &OnyxAPIError{StatusCode: resp.StatusCode, Detail: string(respBody)}
}
if result != nil {
return json.NewDecoder(resp.Body).Decode(result)
}
return nil
}
// TestConnection checks if the server is reachable and credentials are valid.
// Returns nil on success, or an error with a descriptive message on failure.
func (c *Client) TestConnection(ctx context.Context) error {
// Step 1: Basic reachability
req, err := c.newRequest(ctx, "GET", "/", nil)
if err != nil {
return fmt.Errorf("cannot connect to %s: %w", c.baseURL, err)
}
resp, err := c.httpClient.Do(req)
if err != nil {
return fmt.Errorf("cannot connect to %s — is the server running?", c.baseURL)
}
_ = resp.Body.Close()
serverHeader := strings.ToLower(resp.Header.Get("Server"))
if resp.StatusCode == 403 {
if strings.Contains(serverHeader, "awselb") || strings.Contains(serverHeader, "amazons3") {
return fmt.Errorf("blocked by AWS load balancer (HTTP 403 on all requests).\n Your IP address may not be in the ALB's security group or WAF allowlist")
}
return fmt.Errorf("HTTP 403 on base URL — the server is blocking all traffic.\n This is likely a firewall, WAF, or IP allowlist restriction")
}
// Step 2: Authenticated check
req2, err := c.newRequest(ctx, "GET", "/api/me", nil)
if err != nil {
return fmt.Errorf("server reachable but API error: %w", err)
}
resp2, err := c.httpClient.Do(req2)
if err != nil {
return fmt.Errorf("server reachable but API error: %w", err)
}
defer func() { _ = resp2.Body.Close() }()
if resp2.StatusCode == 200 {
return nil
}
bodyBytes, _ := io.ReadAll(io.LimitReader(resp2.Body, 300))
body := string(bodyBytes)
isHTML := strings.HasPrefix(strings.TrimSpace(body), "<")
respServer := strings.ToLower(resp2.Header.Get("Server"))
if resp2.StatusCode == 401 || resp2.StatusCode == 403 {
if isHTML || strings.Contains(respServer, "awselb") {
return fmt.Errorf("HTTP %d from a reverse proxy (not the Onyx backend).\n Check your deployment's ingress / proxy configuration", resp2.StatusCode)
}
if resp2.StatusCode == 401 {
return fmt.Errorf("invalid API key or token.\n %s", body)
}
return fmt.Errorf("access denied — check that the API key is valid.\n %s", body)
}
detail := fmt.Sprintf("HTTP %d", resp2.StatusCode)
if body != "" {
detail += fmt.Sprintf("\n Response: %s", body)
}
return fmt.Errorf("%s", detail)
}
// ListAgents returns visible agents.
func (c *Client) ListAgents(ctx context.Context) ([]models.AgentSummary, error) {
var raw []models.AgentSummary
if err := c.doJSON(ctx, "GET", "/api/persona", nil, &raw); err != nil {
return nil, err
}
var result []models.AgentSummary
for _, p := range raw {
if p.IsVisible {
result = append(result, p)
}
}
return result, nil
}
// ListChatSessions returns recent chat sessions.
func (c *Client) ListChatSessions(ctx context.Context) ([]models.ChatSessionDetails, error) {
var resp struct {
Sessions []models.ChatSessionDetails `json:"sessions"`
}
if err := c.doJSON(ctx, "GET", "/api/chat/get-user-chat-sessions", nil, &resp); err != nil {
return nil, err
}
return resp.Sessions, nil
}
// GetChatSession returns full details for a session.
func (c *Client) GetChatSession(ctx context.Context, sessionID string) (*models.ChatSessionDetailResponse, error) {
var resp models.ChatSessionDetailResponse
if err := c.doJSON(ctx, "GET", "/api/chat/get-chat-session/"+sessionID, nil, &resp); err != nil {
return nil, err
}
return &resp, nil
}
// RenameChatSession renames a session. If name is empty, the backend auto-generates one.
func (c *Client) RenameChatSession(ctx context.Context, sessionID string, name *string) (string, error) {
payload := map[string]any{
"chat_session_id": sessionID,
}
if name != nil {
payload["name"] = *name
}
var resp struct {
NewName string `json:"new_name"`
}
if err := c.doJSON(ctx, "PUT", "/api/chat/rename-chat-session", payload, &resp); err != nil {
return "", err
}
return resp.NewName, nil
}
// UploadFile uploads a file and returns a file descriptor.
func (c *Client) UploadFile(ctx context.Context, filePath string) (*models.FileDescriptorPayload, error) {
file, err := os.Open(filePath)
if err != nil {
return nil, err
}
defer func() { _ = file.Close() }()
var buf bytes.Buffer
writer := multipart.NewWriter(&buf)
part, err := writer.CreateFormFile("files", filepath.Base(filePath))
if err != nil {
return nil, err
}
if _, err := io.Copy(part, file); err != nil {
return nil, err
}
_ = writer.Close()
req, err := c.newRequest(ctx, "POST", "/api/user/projects/file/upload", &buf)
if err != nil {
return nil, err
}
req.Header.Set("Content-Type", writer.FormDataContentType())
resp, err := c.longHTTPClient.Do(req)
if err != nil {
return nil, err
}
defer func() { _ = resp.Body.Close() }()
if resp.StatusCode < 200 || resp.StatusCode >= 300 {
body, _ := io.ReadAll(resp.Body)
return nil, &OnyxAPIError{StatusCode: resp.StatusCode, Detail: string(body)}
}
var snapshot models.CategorizedFilesSnapshot
if err := json.NewDecoder(resp.Body).Decode(&snapshot); err != nil {
return nil, err
}
if len(snapshot.UserFiles) == 0 {
return nil, &OnyxAPIError{StatusCode: 400, Detail: "File upload returned no files"}
}
uf := snapshot.UserFiles[0]
return &models.FileDescriptorPayload{
ID: uf.FileID,
Type: uf.ChatFileType,
Name: filepath.Base(filePath),
}, nil
}
// StopChatSession sends a stop signal for a streaming session (best-effort).
func (c *Client) StopChatSession(ctx context.Context, sessionID string) {
req, err := c.newRequest(ctx, "POST", "/api/chat/stop-chat-session/"+sessionID, nil)
if err != nil {
return
}
resp, err := c.httpClient.Do(req)
if err != nil {
return
}
_ = resp.Body.Close()
}


@@ -1,13 +0,0 @@
package api
import "fmt"
// OnyxAPIError is returned when an Onyx API call fails.
type OnyxAPIError struct {
StatusCode int
Detail string
}
func (e *OnyxAPIError) Error() string {
return fmt.Sprintf("HTTP %d: %s", e.StatusCode, e.Detail)
}


@@ -1,136 +0,0 @@
package api
import (
"bufio"
"bytes"
"context"
"encoding/json"
"fmt"
"io"
"net/http"
tea "github.com/charmbracelet/bubbletea"
"github.com/onyx-dot-app/onyx/cli/internal/models"
"github.com/onyx-dot-app/onyx/cli/internal/parser"
)
// StreamEventMsg wraps a StreamEvent for Bubble Tea.
type StreamEventMsg struct {
Event models.StreamEvent
}
// StreamDoneMsg signals the stream has ended.
type StreamDoneMsg struct {
Err error
}
// SendMessageStream starts streaming a chat message response.
// It reads NDJSON lines, parses them, and sends events on the returned channel.
// The goroutine stops when ctx is cancelled or the stream ends.
func (c *Client) SendMessageStream(
ctx context.Context,
message string,
chatSessionID *string,
agentID int,
parentMessageID *int,
fileDescriptors []models.FileDescriptorPayload,
) <-chan models.StreamEvent {
ch := make(chan models.StreamEvent, 64)
go func() {
defer close(ch)
payload := models.SendMessagePayload{
Message: message,
ParentMessageID: parentMessageID,
FileDescriptors: fileDescriptors,
Origin: "api",
IncludeCitations: true,
Stream: true,
}
if payload.FileDescriptors == nil {
payload.FileDescriptors = []models.FileDescriptorPayload{}
}
if chatSessionID != nil {
payload.ChatSessionID = chatSessionID
} else {
payload.ChatSessionInfo = &models.ChatSessionCreationInfo{AgentID: agentID}
}
body, err := json.Marshal(payload)
if err != nil {
ch <- models.ErrorEvent{Error: fmt.Sprintf("marshal error: %v", err), IsRetryable: false}
return
}
req, err := http.NewRequestWithContext(ctx, "POST", c.baseURL+"/api/chat/send-chat-message", bytes.NewReader(body))
if err != nil {
ch <- models.ErrorEvent{Error: fmt.Sprintf("request error: %v", err), IsRetryable: false}
return
}
// NewRequestWithContext sets Content-Length automatically for a *bytes.Reader.
req.Header.Set("Content-Type", "application/json")
if c.apiKey != "" {
bearer := "Bearer " + c.apiKey
req.Header.Set("Authorization", bearer)
req.Header.Set("X-Onyx-Authorization", bearer)
}
resp, err := c.longHTTPClient.Do(req)
if err != nil {
if ctx.Err() != nil {
return // cancelled
}
ch <- models.ErrorEvent{Error: fmt.Sprintf("connection error: %v", err), IsRetryable: true}
return
}
defer func() { _ = resp.Body.Close() }()
if resp.StatusCode != 200 {
// A single Read call may return only part of the body; read up to 4 KiB of it.
respBody, _ := io.ReadAll(io.LimitReader(resp.Body, 4096))
ch <- models.ErrorEvent{
Error: fmt.Sprintf("HTTP %d: %s", resp.StatusCode, string(respBody)),
IsRetryable: resp.StatusCode >= 500,
}
return
}
scanner := bufio.NewScanner(resp.Body)
scanner.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
for scanner.Scan() {
if ctx.Err() != nil {
return
}
event := parser.ParseStreamLine(scanner.Text())
if event != nil {
select {
case ch <- event:
case <-ctx.Done():
return
}
}
}
if err := scanner.Err(); err != nil && ctx.Err() == nil {
ch <- models.ErrorEvent{Error: fmt.Sprintf("stream read error: %v", err), IsRetryable: true}
}
}()
return ch
}
// WaitForStreamEvent returns a tea.Cmd that reads one event from the channel.
// On channel close, it returns StreamDoneMsg.
func WaitForStreamEvent(ch <-chan models.StreamEvent) tea.Cmd {
return func() tea.Msg {
event, ok := <-ch
if !ok {
return StreamDoneMsg{}
}
return StreamEventMsg{Event: event}
}
}


@@ -1,101 +0,0 @@
package config
import (
"encoding/json"
"fmt"
"os"
"path/filepath"
"strconv"
)
const (
EnvServerURL = "ONYX_SERVER_URL"
EnvAPIKey = "ONYX_API_KEY"
EnvAgentID = "ONYX_PERSONA_ID"
)
// OnyxCliConfig holds the CLI configuration.
type OnyxCliConfig struct {
ServerURL string `json:"server_url"`
APIKey string `json:"api_key"`
DefaultAgentID int `json:"default_persona_id"`
}
// DefaultConfig returns a config with default values.
func DefaultConfig() OnyxCliConfig {
return OnyxCliConfig{
ServerURL: "https://cloud.onyx.app",
APIKey: "",
DefaultAgentID: 0,
}
}
// IsConfigured returns true if the config has an API key.
func (c OnyxCliConfig) IsConfigured() bool {
return c.APIKey != ""
}
// configDir returns ~/.config/onyx-cli
func configDir() string {
if xdg := os.Getenv("XDG_CONFIG_HOME"); xdg != "" {
return filepath.Join(xdg, "onyx-cli")
}
home, err := os.UserHomeDir()
if err != nil {
return filepath.Join(".", ".config", "onyx-cli")
}
return filepath.Join(home, ".config", "onyx-cli")
}
// ConfigFilePath returns the full path to the config file.
func ConfigFilePath() string {
return filepath.Join(configDir(), "config.json")
}
// ConfigExists checks if the config file exists on disk.
func ConfigExists() bool {
_, err := os.Stat(ConfigFilePath())
return err == nil
}
// Load reads config from file and applies environment variable overrides.
func Load() OnyxCliConfig {
cfg := DefaultConfig()
data, err := os.ReadFile(ConfigFilePath())
if err == nil {
if jsonErr := json.Unmarshal(data, &cfg); jsonErr != nil {
fmt.Fprintf(os.Stderr, "warning: config file %s is malformed: %v (using defaults)\n", ConfigFilePath(), jsonErr)
}
}
// Environment overrides
if v := os.Getenv(EnvServerURL); v != "" {
cfg.ServerURL = v
}
if v := os.Getenv(EnvAPIKey); v != "" {
cfg.APIKey = v
}
if v := os.Getenv(EnvAgentID); v != "" {
if id, err := strconv.Atoi(v); err == nil {
cfg.DefaultAgentID = id
}
}
return cfg
}
// Save writes the config to disk, creating parent directories if needed.
func Save(cfg OnyxCliConfig) error {
dir := configDir()
if err := os.MkdirAll(dir, 0o755); err != nil {
return err
}
data, err := json.MarshalIndent(cfg, "", " ")
if err != nil {
return err
}
return os.WriteFile(ConfigFilePath(), data, 0o600)
}


@@ -1,215 +0,0 @@
package config
import (
"encoding/json"
"os"
"path/filepath"
"testing"
)
func clearEnvVars(t *testing.T) {
t.Helper()
for _, key := range []string{EnvServerURL, EnvAPIKey, EnvAgentID} {
// t.Setenv registers a cleanup that restores the original value after the
// test; os.Unsetenv then actually clears the variable for its duration.
t.Setenv(key, "")
if err := os.Unsetenv(key); err != nil {
t.Fatal(err)
}
}
}
func writeConfig(t *testing.T, dir string, data []byte) {
t.Helper()
onyxDir := filepath.Join(dir, "onyx-cli")
if err := os.MkdirAll(onyxDir, 0o755); err != nil {
t.Fatal(err)
}
if err := os.WriteFile(filepath.Join(onyxDir, "config.json"), data, 0o644); err != nil {
t.Fatal(err)
}
}
func TestDefaultConfig(t *testing.T) {
cfg := DefaultConfig()
if cfg.ServerURL != "https://cloud.onyx.app" {
t.Errorf("expected default server URL, got %s", cfg.ServerURL)
}
if cfg.APIKey != "" {
t.Errorf("expected empty API key, got %s", cfg.APIKey)
}
if cfg.DefaultAgentID != 0 {
t.Errorf("expected default agent ID 0, got %d", cfg.DefaultAgentID)
}
}
func TestIsConfigured(t *testing.T) {
cfg := DefaultConfig()
if cfg.IsConfigured() {
t.Error("empty config should not be configured")
}
cfg.APIKey = "some-key"
if !cfg.IsConfigured() {
t.Error("config with API key should be configured")
}
}
func TestLoadDefaults(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
t.Setenv("XDG_CONFIG_HOME", dir)
cfg := Load()
if cfg.ServerURL != "https://cloud.onyx.app" {
t.Errorf("expected default URL, got %s", cfg.ServerURL)
}
if cfg.APIKey != "" {
t.Errorf("expected empty key, got %s", cfg.APIKey)
}
}
func TestLoadFromFile(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
t.Setenv("XDG_CONFIG_HOME", dir)
data, _ := json.Marshal(map[string]interface{}{
"server_url": "https://my-onyx.example.com",
"api_key": "test-key-123",
"default_persona_id": 5,
})
writeConfig(t, dir, data)
cfg := Load()
if cfg.ServerURL != "https://my-onyx.example.com" {
t.Errorf("got %s", cfg.ServerURL)
}
if cfg.APIKey != "test-key-123" {
t.Errorf("got %s", cfg.APIKey)
}
if cfg.DefaultAgentID != 5 {
t.Errorf("got %d", cfg.DefaultAgentID)
}
}
func TestLoadCorruptFile(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
t.Setenv("XDG_CONFIG_HOME", dir)
writeConfig(t, dir, []byte("not valid json {{{"))
cfg := Load()
if cfg.ServerURL != "https://cloud.onyx.app" {
t.Errorf("expected default URL on corrupt file, got %s", cfg.ServerURL)
}
}
func TestEnvOverrideServerURL(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
t.Setenv("XDG_CONFIG_HOME", dir)
t.Setenv(EnvServerURL, "https://env-override.com")
cfg := Load()
if cfg.ServerURL != "https://env-override.com" {
t.Errorf("got %s", cfg.ServerURL)
}
}
func TestEnvOverrideAPIKey(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
t.Setenv("XDG_CONFIG_HOME", dir)
t.Setenv(EnvAPIKey, "env-key")
cfg := Load()
if cfg.APIKey != "env-key" {
t.Errorf("got %s", cfg.APIKey)
}
}
func TestEnvOverrideAgentID(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
t.Setenv("XDG_CONFIG_HOME", dir)
t.Setenv(EnvAgentID, "42")
cfg := Load()
if cfg.DefaultAgentID != 42 {
t.Errorf("got %d", cfg.DefaultAgentID)
}
}
func TestEnvOverrideInvalidAgentID(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
t.Setenv("XDG_CONFIG_HOME", dir)
t.Setenv(EnvAgentID, "not-a-number")
cfg := Load()
if cfg.DefaultAgentID != 0 {
t.Errorf("got %d", cfg.DefaultAgentID)
}
}
func TestEnvOverridesFileValues(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
t.Setenv("XDG_CONFIG_HOME", dir)
data, _ := json.Marshal(map[string]interface{}{
"server_url": "https://file-url.com",
"api_key": "file-key",
})
writeConfig(t, dir, data)
t.Setenv(EnvServerURL, "https://env-url.com")
cfg := Load()
if cfg.ServerURL != "https://env-url.com" {
t.Errorf("env should override file, got %s", cfg.ServerURL)
}
if cfg.APIKey != "file-key" {
t.Errorf("file value should be kept, got %s", cfg.APIKey)
}
}
func TestSaveAndReload(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
t.Setenv("XDG_CONFIG_HOME", dir)
cfg := OnyxCliConfig{
ServerURL: "https://saved.example.com",
APIKey: "saved-key",
DefaultAgentID: 10,
}
if err := Save(cfg); err != nil {
t.Fatal(err)
}
loaded := Load()
if loaded.ServerURL != "https://saved.example.com" {
t.Errorf("got %s", loaded.ServerURL)
}
if loaded.APIKey != "saved-key" {
t.Errorf("got %s", loaded.APIKey)
}
if loaded.DefaultAgentID != 10 {
t.Errorf("got %d", loaded.DefaultAgentID)
}
}
func TestSaveCreatesParentDirs(t *testing.T) {
clearEnvVars(t)
dir := t.TempDir()
nested := filepath.Join(dir, "deep", "nested")
t.Setenv("XDG_CONFIG_HOME", nested)
if err := Save(OnyxCliConfig{APIKey: "test"}); err != nil {
t.Fatal(err)
}
if !ConfigExists() {
t.Error("config file should exist after save")
}
}


@@ -1,193 +0,0 @@
package models
// StreamEvent is the interface for all parsed stream events.
type StreamEvent interface {
EventType() string
}
// Event type constants matching the Python StreamEventType enum.
const (
EventSessionCreated = "session_created"
EventMessageIDInfo = "message_id_info"
EventStop = "stop"
EventError = "error"
EventMessageStart = "message_start"
EventMessageDelta = "message_delta"
EventSearchStart = "search_tool_start"
EventSearchQueries = "search_tool_queries_delta"
EventSearchDocuments = "search_tool_documents_delta"
EventReasoningStart = "reasoning_start"
EventReasoningDelta = "reasoning_delta"
EventReasoningDone = "reasoning_done"
EventCitationInfo = "citation_info"
EventOpenURLStart = "open_url_start"
EventImageGenStart = "image_generation_start"
EventPythonToolStart = "python_tool_start"
EventCustomToolStart = "custom_tool_start"
EventFileReaderStart = "file_reader_start"
EventDeepResearchPlan = "deep_research_plan_start"
EventDeepResearchDelta = "deep_research_plan_delta"
EventResearchAgentStart = "research_agent_start"
EventIntermediateReport = "intermediate_report_start"
EventIntermediateReportDt = "intermediate_report_delta"
EventUnknown = "unknown"
)
// SessionCreatedEvent is emitted when a new chat session is created.
type SessionCreatedEvent struct {
ChatSessionID string `json:"chat_session_id"`
}
func (e SessionCreatedEvent) EventType() string { return EventSessionCreated }
// MessageIDEvent carries the user and agent message IDs.
type MessageIDEvent struct {
UserMessageID *int `json:"user_message_id,omitempty"`
ReservedAgentMessageID int `json:"reserved_agent_message_id"`
}
func (e MessageIDEvent) EventType() string { return EventMessageIDInfo }
// StopEvent signals the end of a stream.
type StopEvent struct {
Placement *Placement `json:"placement,omitempty"`
StopReason *string `json:"stop_reason,omitempty"`
}
func (e StopEvent) EventType() string { return EventStop }
// ErrorEvent signals an error.
type ErrorEvent struct {
Placement *Placement `json:"placement,omitempty"`
Error string `json:"error"`
StackTrace *string `json:"stack_trace,omitempty"`
IsRetryable bool `json:"is_retryable"`
}
func (e ErrorEvent) EventType() string { return EventError }
// MessageStartEvent signals the beginning of an agent message.
type MessageStartEvent struct {
Placement *Placement `json:"placement,omitempty"`
Documents []SearchDoc `json:"documents,omitempty"`
}
func (e MessageStartEvent) EventType() string { return EventMessageStart }
// MessageDeltaEvent carries a token of agent content.
type MessageDeltaEvent struct {
Placement *Placement `json:"placement,omitempty"`
Content string `json:"content"`
}
func (e MessageDeltaEvent) EventType() string { return EventMessageDelta }
// SearchStartEvent signals the beginning of a search.
type SearchStartEvent struct {
Placement *Placement `json:"placement,omitempty"`
IsInternetSearch bool `json:"is_internet_search"`
}
func (e SearchStartEvent) EventType() string { return EventSearchStart }
// SearchQueriesEvent carries search queries.
type SearchQueriesEvent struct {
Placement *Placement `json:"placement,omitempty"`
Queries []string `json:"queries"`
}
func (e SearchQueriesEvent) EventType() string { return EventSearchQueries }
// SearchDocumentsEvent carries found documents.
type SearchDocumentsEvent struct {
Placement *Placement `json:"placement,omitempty"`
Documents []SearchDoc `json:"documents"`
}
func (e SearchDocumentsEvent) EventType() string { return EventSearchDocuments }
// ReasoningStartEvent signals the beginning of a reasoning block.
type ReasoningStartEvent struct {
Placement *Placement `json:"placement,omitempty"`
}
func (e ReasoningStartEvent) EventType() string { return EventReasoningStart }
// ReasoningDeltaEvent carries reasoning text.
type ReasoningDeltaEvent struct {
Placement *Placement `json:"placement,omitempty"`
Reasoning string `json:"reasoning"`
}
func (e ReasoningDeltaEvent) EventType() string { return EventReasoningDelta }
// ReasoningDoneEvent signals the end of reasoning.
type ReasoningDoneEvent struct {
Placement *Placement `json:"placement,omitempty"`
}
func (e ReasoningDoneEvent) EventType() string { return EventReasoningDone }
// CitationEvent carries citation info.
type CitationEvent struct {
Placement *Placement `json:"placement,omitempty"`
CitationNumber int `json:"citation_number"`
DocumentID string `json:"document_id"`
}
func (e CitationEvent) EventType() string { return EventCitationInfo }
// ToolStartEvent signals the start of a tool usage.
type ToolStartEvent struct {
Placement *Placement `json:"placement,omitempty"`
Type string `json:"type"`
ToolName string `json:"tool_name"`
}
func (e ToolStartEvent) EventType() string { return e.Type }
// DeepResearchPlanStartEvent signals the start of a deep research plan.
type DeepResearchPlanStartEvent struct {
Placement *Placement `json:"placement,omitempty"`
}
func (e DeepResearchPlanStartEvent) EventType() string { return EventDeepResearchPlan }
// DeepResearchPlanDeltaEvent carries deep research plan content.
type DeepResearchPlanDeltaEvent struct {
Placement *Placement `json:"placement,omitempty"`
Content string `json:"content"`
}
func (e DeepResearchPlanDeltaEvent) EventType() string { return EventDeepResearchDelta }
// ResearchAgentStartEvent signals a research sub-task.
type ResearchAgentStartEvent struct {
Placement *Placement `json:"placement,omitempty"`
ResearchTask string `json:"research_task"`
}
func (e ResearchAgentStartEvent) EventType() string { return EventResearchAgentStart }
// IntermediateReportStartEvent signals the start of an intermediate report.
type IntermediateReportStartEvent struct {
Placement *Placement `json:"placement,omitempty"`
}
func (e IntermediateReportStartEvent) EventType() string { return EventIntermediateReport }
// IntermediateReportDeltaEvent carries intermediate report content.
type IntermediateReportDeltaEvent struct {
Placement *Placement `json:"placement,omitempty"`
Content string `json:"content"`
}
func (e IntermediateReportDeltaEvent) EventType() string { return EventIntermediateReportDt }
// UnknownEvent is a catch-all for unrecognized stream data.
type UnknownEvent struct {
Placement *Placement `json:"placement,omitempty"`
RawData map[string]any `json:"raw_data,omitempty"`
}
func (e UnknownEvent) EventType() string { return EventUnknown }


@@ -1,112 +0,0 @@
// Package models defines API request/response types for the Onyx CLI.
package models
import "time"
// AgentSummary represents an agent from the API.
type AgentSummary struct {
ID int `json:"id"`
Name string `json:"name"`
Description string `json:"description"`
IsDefaultPersona bool `json:"is_default_persona"`
IsVisible bool `json:"is_visible"`
}
// ChatSessionSummary is a brief session listing.
type ChatSessionSummary struct {
ID string `json:"id"`
Name *string `json:"name"`
AgentID *int `json:"persona_id"`
Created time.Time `json:"time_created"`
}
// ChatSessionDetails is a session with timestamps as strings.
type ChatSessionDetails struct {
ID string `json:"id"`
Name *string `json:"name"`
AgentID *int `json:"persona_id"`
Created string `json:"time_created"`
Updated string `json:"time_updated"`
}
// ChatMessageDetail is a single message in a session.
type ChatMessageDetail struct {
MessageID int `json:"message_id"`
ParentMessage *int `json:"parent_message"`
LatestChildMessage *int `json:"latest_child_message"`
Message string `json:"message"`
MessageType string `json:"message_type"`
TimeSent string `json:"time_sent"`
Error *string `json:"error"`
}
// ChatSessionDetailResponse is the full session detail from the API.
type ChatSessionDetailResponse struct {
ChatSessionID string `json:"chat_session_id"`
Description *string `json:"description"`
AgentID *int `json:"persona_id"`
AgentName *string `json:"persona_name"`
Messages []ChatMessageDetail `json:"messages"`
}
// ChatFileType represents a file type for uploads.
type ChatFileType string
const (
ChatFileImage ChatFileType = "image"
ChatFileDoc ChatFileType = "document"
ChatFilePlainText ChatFileType = "plain_text"
ChatFileCSV ChatFileType = "csv"
)
// FileDescriptorPayload is a file descriptor for send-message requests.
type FileDescriptorPayload struct {
ID string `json:"id"`
Type ChatFileType `json:"type"`
Name string `json:"name,omitempty"`
}
// UserFileSnapshot represents an uploaded file.
type UserFileSnapshot struct {
ID string `json:"id"`
Name string `json:"name"`
FileID string `json:"file_id"`
ChatFileType ChatFileType `json:"chat_file_type"`
}
// CategorizedFilesSnapshot is the response from file upload.
type CategorizedFilesSnapshot struct {
UserFiles []UserFileSnapshot `json:"user_files"`
}
// ChatSessionCreationInfo is included when creating a new session inline.
type ChatSessionCreationInfo struct {
AgentID int `json:"persona_id"`
}
// SendMessagePayload is the request body for POST /api/chat/send-chat-message.
type SendMessagePayload struct {
Message string `json:"message"`
ChatSessionID *string `json:"chat_session_id,omitempty"`
ChatSessionInfo *ChatSessionCreationInfo `json:"chat_session_info,omitempty"`
ParentMessageID *int `json:"parent_message_id"`
FileDescriptors []FileDescriptorPayload `json:"file_descriptors"`
Origin string `json:"origin"`
IncludeCitations bool `json:"include_citations"`
Stream bool `json:"stream"`
}
// SearchDoc represents a document found during search.
type SearchDoc struct {
DocumentID string `json:"document_id"`
SemanticIdentifier string `json:"semantic_identifier"`
Link *string `json:"link"`
SourceType string `json:"source_type"`
}
// Placement indicates where a stream event belongs in the conversation.
type Placement struct {
TurnIndex int `json:"turn_index"`
TabIndex int `json:"tab_index"`
SubTurnIndex *int `json:"sub_turn_index"`
}


@@ -1,170 +0,0 @@
// Package onboarding handles the first-run setup flow for Onyx CLI.
package onboarding
import (
"bufio"
"context"
"fmt"
"os"
"strings"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/tui"
"github.com/onyx-dot-app/onyx/cli/internal/util"
"golang.org/x/term"
)
// Aliases for shared styles.
var (
boldStyle = util.BoldStyle
dimStyle = util.DimStyle
greenStyle = util.GreenStyle
redStyle = util.RedStyle
yellowStyle = util.YellowStyle
)
func getTermSize() (int, int) {
w, h, err := term.GetSize(int(os.Stdout.Fd()))
if err != nil {
return 80, 24
}
return w, h
}
// Run executes the interactive onboarding flow.
// Returns the validated config, or nil if the user cancels.
func Run(existing *config.OnyxCliConfig) *config.OnyxCliConfig {
cfg := config.DefaultConfig()
if existing != nil {
cfg = *existing
}
w, h := getTermSize()
fmt.Print(tui.RenderSplashOnboarding(w, h))
fmt.Println()
fmt.Println(" Welcome to " + boldStyle.Render("Onyx CLI") + ".")
fmt.Println()
reader := bufio.NewReader(os.Stdin)
// Server URL
serverURL := prompt(reader, " Onyx server URL", cfg.ServerURL)
if serverURL == "" {
return nil
}
if !strings.HasPrefix(serverURL, "http://") && !strings.HasPrefix(serverURL, "https://") {
fmt.Println(" " + redStyle.Render("Server URL must start with http:// or https://"))
return nil
}
// API Key
fmt.Println()
fmt.Println(" " + dimStyle.Render("Need an API key? Press Enter to open the admin panel in your browser,"))
fmt.Println(" " + dimStyle.Render("or paste your key below."))
fmt.Println()
apiKey := promptSecret(" API key", cfg.APIKey)
if apiKey == "" {
// Open browser to API key page
url := strings.TrimRight(serverURL, "/") + "/app/settings/accounts-access"
fmt.Printf("\n Opening %s ...\n", url)
util.OpenBrowser(url)
fmt.Println(" " + dimStyle.Render("Copy your API key, then paste it here."))
fmt.Println()
apiKey = promptSecret(" API key", "")
if apiKey == "" {
fmt.Println("\n " + redStyle.Render("No API key provided. Exiting."))
return nil
}
}
// Test connection
cfg = config.OnyxCliConfig{
ServerURL: serverURL,
APIKey: apiKey,
DefaultAgentID: cfg.DefaultAgentID,
}
fmt.Println("\n " + yellowStyle.Render("Testing connection..."))
client := api.NewClient(cfg)
if err := client.TestConnection(context.Background()); err != nil {
fmt.Println(" " + redStyle.Render("Connection failed.") + " " + err.Error())
fmt.Println()
fmt.Println(" " + dimStyle.Render("Run ") + boldStyle.Render("onyx-cli configure") + dimStyle.Render(" to try again."))
return nil
}
if err := config.Save(cfg); err != nil {
fmt.Println(" " + redStyle.Render("Could not save config: "+err.Error()))
return nil
}
fmt.Println(" " + greenStyle.Render("Connected and authenticated."))
fmt.Println()
printQuickStart()
return &cfg
}
func promptSecret(label, defaultVal string) string {
if defaultVal != "" {
fmt.Printf("%s %s: ", label, dimStyle.Render("[hidden]"))
} else {
fmt.Printf("%s: ", label)
}
password, err := term.ReadPassword(int(os.Stdin.Fd()))
fmt.Println() // ReadPassword doesn't echo a newline
if err != nil {
return defaultVal
}
line := strings.TrimSpace(string(password))
if line == "" {
return defaultVal
}
return line
}
func prompt(reader *bufio.Reader, label, defaultVal string) string {
if defaultVal != "" {
fmt.Printf("%s %s: ", label, dimStyle.Render("["+defaultVal+"]"))
} else {
fmt.Printf("%s: ", label)
}
line, _ := reader.ReadString('\n')
// ReadString may return partial data along with an error (e.g. EOF without a
// trailing newline), so use whatever was read and fall back to the default.
line = strings.TrimSpace(line)
if line == "" {
return defaultVal
}
return line
}
func printQuickStart() {
fmt.Println(" " + boldStyle.Render("Quick start"))
fmt.Println()
fmt.Println(" Just type to chat with your Onyx agent.")
fmt.Println()
rows := [][2]string{
{"/help", "Show all commands"},
{"/attach", "Attach a file"},
{"/agent", "Switch agent"},
{"/new", "New conversation"},
{"/sessions", "Browse previous chats"},
{"Esc", "Cancel generation"},
{"Ctrl+D", "Quit"},
}
for _, r := range rows {
fmt.Printf(" %-12s %s\n", boldStyle.Render(r[0]), dimStyle.Render(r[1]))
}
fmt.Println()
}


@@ -1,248 +0,0 @@
// Package parser handles NDJSON stream parsing for Onyx chat responses.
package parser
import (
"encoding/json"
"fmt"
"strings"
"github.com/onyx-dot-app/onyx/cli/internal/models"
"golang.org/x/text/cases"
"golang.org/x/text/language"
)
// ParseStreamLine parses a single NDJSON line into a typed StreamEvent.
// Returns nil for empty lines or unparseable content.
func ParseStreamLine(line string) models.StreamEvent {
line = strings.TrimSpace(line)
if line == "" {
return nil
}
var data map[string]any
if err := json.Unmarshal([]byte(line), &data); err != nil {
return models.ErrorEvent{Error: fmt.Sprintf("malformed stream data: %v", err), IsRetryable: false}
}
// Case 1: CreateChatSessionID
if _, ok := data["chat_session_id"]; ok {
if _, hasPlacement := data["placement"]; !hasPlacement {
sid, _ := data["chat_session_id"].(string)
return models.SessionCreatedEvent{ChatSessionID: sid}
}
}
// Case 2: MessageResponseIDInfo
if _, ok := data["reserved_assistant_message_id"]; ok {
reservedID := jsonInt(data["reserved_assistant_message_id"])
var userMsgID *int
if v, ok := data["user_message_id"]; ok && v != nil {
id := jsonInt(v)
userMsgID = &id
}
return models.MessageIDEvent{
UserMessageID: userMsgID,
ReservedAgentMessageID: reservedID,
}
}
// Case 3: StreamingError (top-level error without placement)
if _, ok := data["error"]; ok {
if _, hasPlacement := data["placement"]; !hasPlacement {
errStr, _ := data["error"].(string)
var stackTrace *string
if st, ok := data["stack_trace"].(string); ok {
stackTrace = &st
}
isRetryable := true
if v, ok := data["is_retryable"].(bool); ok {
isRetryable = v
}
return models.ErrorEvent{
Error: errStr,
StackTrace: stackTrace,
IsRetryable: isRetryable,
}
}
}
// Case 4: Packet with placement + obj
if rawPlacement, ok := data["placement"]; ok {
if rawObj, ok := data["obj"]; ok {
placement := parsePlacement(rawPlacement)
obj, _ := rawObj.(map[string]any)
if obj == nil {
return models.UnknownEvent{Placement: placement, RawData: data}
}
return parsePacketObj(obj, placement)
}
}
// Fallback
return models.UnknownEvent{RawData: data}
}
func parsePlacement(raw interface{}) *models.Placement {
m, ok := raw.(map[string]any)
if !ok {
return nil
}
p := &models.Placement{
TurnIndex: jsonInt(m["turn_index"]),
TabIndex: jsonInt(m["tab_index"]),
}
if v, ok := m["sub_turn_index"]; ok && v != nil {
st := jsonInt(v)
p.SubTurnIndex = &st
}
return p
}
func parsePacketObj(obj map[string]any, placement *models.Placement) models.StreamEvent {
objType, _ := obj["type"].(string)
switch objType {
case "stop":
var reason *string
if r, ok := obj["stop_reason"].(string); ok {
reason = &r
}
return models.StopEvent{Placement: placement, StopReason: reason}
case "error":
errMsg := "Unknown error"
if e, ok := obj["exception"]; ok {
errMsg = toString(e)
}
return models.ErrorEvent{Placement: placement, Error: errMsg, IsRetryable: true}
case "message_start":
var docs []models.SearchDoc
if rawDocs, ok := obj["final_documents"].([]any); ok {
docs = parseSearchDocs(rawDocs)
}
return models.MessageStartEvent{Placement: placement, Documents: docs}
case "message_delta":
content, _ := obj["content"].(string)
return models.MessageDeltaEvent{Placement: placement, Content: content}
case "search_tool_start":
isInternet, _ := obj["is_internet_search"].(bool)
return models.SearchStartEvent{Placement: placement, IsInternetSearch: isInternet}
case "search_tool_queries_delta":
var queries []string
if raw, ok := obj["queries"].([]any); ok {
for _, q := range raw {
if s, ok := q.(string); ok {
queries = append(queries, s)
}
}
}
return models.SearchQueriesEvent{Placement: placement, Queries: queries}
case "search_tool_documents_delta":
var docs []models.SearchDoc
if rawDocs, ok := obj["documents"].([]any); ok {
docs = parseSearchDocs(rawDocs)
}
return models.SearchDocumentsEvent{Placement: placement, Documents: docs}
case "reasoning_start":
return models.ReasoningStartEvent{Placement: placement}
case "reasoning_delta":
reasoning, _ := obj["reasoning"].(string)
return models.ReasoningDeltaEvent{Placement: placement, Reasoning: reasoning}
case "reasoning_done":
return models.ReasoningDoneEvent{Placement: placement}
case "citation_info":
return models.CitationEvent{
Placement: placement,
CitationNumber: jsonInt(obj["citation_number"]),
DocumentID: jsonString(obj["document_id"]),
}
case "open_url_start", "image_generation_start", "python_tool_start", "file_reader_start":
toolName := strings.ReplaceAll(strings.TrimSuffix(objType, "_start"), "_", " ")
toolName = cases.Title(language.English).String(toolName)
return models.ToolStartEvent{Placement: placement, Type: objType, ToolName: toolName}
case "custom_tool_start":
toolName := jsonString(obj["tool_name"])
if toolName == "" {
toolName = "Custom Tool"
}
return models.ToolStartEvent{Placement: placement, Type: models.EventCustomToolStart, ToolName: toolName}
case "deep_research_plan_start":
return models.DeepResearchPlanStartEvent{Placement: placement}
case "deep_research_plan_delta":
content, _ := obj["content"].(string)
return models.DeepResearchPlanDeltaEvent{Placement: placement, Content: content}
case "research_agent_start":
task, _ := obj["research_task"].(string)
return models.ResearchAgentStartEvent{Placement: placement, ResearchTask: task}
case "intermediate_report_start":
return models.IntermediateReportStartEvent{Placement: placement}
case "intermediate_report_delta":
content, _ := obj["content"].(string)
return models.IntermediateReportDeltaEvent{Placement: placement, Content: content}
default:
return models.UnknownEvent{Placement: placement, RawData: obj}
}
}
func parseSearchDocs(raw []any) []models.SearchDoc {
var docs []models.SearchDoc
for _, item := range raw {
m, ok := item.(map[string]any)
if !ok {
continue
}
doc := models.SearchDoc{
DocumentID: jsonString(m["document_id"]),
SemanticIdentifier: jsonString(m["semantic_identifier"]),
SourceType: jsonString(m["source_type"]),
}
if link, ok := m["link"].(string); ok {
doc.Link = &link
}
docs = append(docs, doc)
}
return docs
}
func jsonInt(v any) int {
switch n := v.(type) {
case float64:
return int(n)
case int:
return n
default:
return 0
}
}
func jsonString(v any) string {
s, _ := v.(string)
return s
}
func toString(v any) string {
switch s := v.(type) {
case string:
return s
default:
b, _ := json.Marshal(v)
return string(b)
}
}

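As a rough, self-contained sketch of the decode pattern the parser above relies on (unmarshal one NDJSON stream line into `map[string]any`, then pull fields out with checked type assertions), assuming a hypothetical `message_delta` packet:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// extractDelta unmarshals one stream line into map[string]any and pulls out
// the packet type and content with checked type assertions, mirroring the
// style of ParseStreamLine above. The helper name is illustrative only.
func extractDelta(line string) (string, string) {
	var packet map[string]any
	if err := json.Unmarshal([]byte(line), &packet); err != nil {
		return "error", err.Error()
	}
	obj, _ := packet["obj"].(map[string]any)
	objType, _ := obj["type"].(string)
	content, _ := obj["content"].(string)
	return objType, content
}

func main() {
	line := `{"placement":{"turn_index":0,"tab_index":0},"obj":{"type":"message_delta","content":"Hello"}}`
	typ, content := extractDelta(line)
	fmt.Printf("%s: %q\n", typ, content)
}
```

The checked assertions (`, ok` and the two-value `, _` forms) are what let the real parser degrade gracefully to zero values when a field is missing or has the wrong type.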

@@ -1,419 +0,0 @@
package parser
import (
"encoding/json"
"testing"
"github.com/onyx-dot-app/onyx/cli/internal/models"
)
func TestEmptyLineReturnsNil(t *testing.T) {
for _, line := range []string{"", " ", "\n"} {
if ParseStreamLine(line) != nil {
t.Errorf("expected nil for %q", line)
}
}
}
func TestInvalidJSONReturnsErrorEvent(t *testing.T) {
for _, line := range []string{"not json", "{broken"} {
event := ParseStreamLine(line)
if event == nil {
t.Errorf("expected ErrorEvent for %q, got nil", line)
continue
}
if _, ok := event.(models.ErrorEvent); !ok {
t.Errorf("expected ErrorEvent for %q, got %T", line, event)
}
}
}
func TestSessionCreated(t *testing.T) {
line := mustJSON(map[string]interface{}{
"chat_session_id": "550e8400-e29b-41d4-a716-446655440000",
})
event := ParseStreamLine(line)
e, ok := event.(models.SessionCreatedEvent)
if !ok {
t.Fatalf("expected SessionCreatedEvent, got %T", event)
}
if e.ChatSessionID != "550e8400-e29b-41d4-a716-446655440000" {
t.Errorf("got %s", e.ChatSessionID)
}
}
func TestMessageIDInfo(t *testing.T) {
line := mustJSON(map[string]interface{}{
"user_message_id": 1,
"reserved_assistant_message_id": 2,
})
event := ParseStreamLine(line)
e, ok := event.(models.MessageIDEvent)
if !ok {
t.Fatalf("expected MessageIDEvent, got %T", event)
}
if e.UserMessageID == nil || *e.UserMessageID != 1 {
t.Errorf("expected user_message_id=1")
}
if e.ReservedAgentMessageID != 2 {
t.Errorf("got %d", e.ReservedAgentMessageID)
}
}
func TestMessageIDInfoNullUserID(t *testing.T) {
line := mustJSON(map[string]interface{}{
"user_message_id": nil,
"reserved_assistant_message_id": 5,
})
event := ParseStreamLine(line)
e, ok := event.(models.MessageIDEvent)
if !ok {
t.Fatalf("expected MessageIDEvent, got %T", event)
}
if e.UserMessageID != nil {
t.Error("expected nil user_message_id")
}
if e.ReservedAgentMessageID != 5 {
t.Errorf("got %d", e.ReservedAgentMessageID)
}
}
func TestTopLevelError(t *testing.T) {
line := mustJSON(map[string]interface{}{
"error": "Rate limit exceeded",
"stack_trace": "...",
"is_retryable": true,
})
event := ParseStreamLine(line)
e, ok := event.(models.ErrorEvent)
if !ok {
t.Fatalf("expected ErrorEvent, got %T", event)
}
if e.Error != "Rate limit exceeded" {
t.Errorf("got %s", e.Error)
}
if e.StackTrace == nil || *e.StackTrace != "..." {
t.Error("expected stack_trace")
}
if !e.IsRetryable {
t.Error("expected retryable")
}
}
func TestTopLevelErrorMinimal(t *testing.T) {
line := mustJSON(map[string]interface{}{
"error": "Something broke",
})
event := ParseStreamLine(line)
e, ok := event.(models.ErrorEvent)
if !ok {
t.Fatalf("expected ErrorEvent, got %T", event)
}
if e.Error != "Something broke" {
t.Errorf("got %s", e.Error)
}
if !e.IsRetryable {
t.Error("expected default retryable=true")
}
}
func makePacket(obj map[string]interface{}, turnIndex, tabIndex int) string {
return mustJSON(map[string]interface{}{
"placement": map[string]interface{}{"turn_index": turnIndex, "tab_index": tabIndex},
"obj": obj,
})
}
func TestStopPacket(t *testing.T) {
line := makePacket(map[string]interface{}{"type": "stop", "stop_reason": "completed"}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.StopEvent)
if !ok {
t.Fatalf("expected StopEvent, got %T", event)
}
if e.StopReason == nil || *e.StopReason != "completed" {
t.Error("expected stop_reason=completed")
}
if e.Placement == nil || e.Placement.TurnIndex != 0 {
t.Error("expected placement")
}
}
func TestStopPacketNoReason(t *testing.T) {
line := makePacket(map[string]interface{}{"type": "stop"}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.StopEvent)
if !ok {
t.Fatalf("expected StopEvent, got %T", event)
}
if e.StopReason != nil {
t.Error("expected nil stop_reason")
}
}
func TestMessageStart(t *testing.T) {
line := makePacket(map[string]interface{}{"type": "message_start"}, 0, 0)
event := ParseStreamLine(line)
_, ok := event.(models.MessageStartEvent)
if !ok {
t.Fatalf("expected MessageStartEvent, got %T", event)
}
}
func TestMessageStartWithDocuments(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "message_start",
"final_documents": []interface{}{
map[string]interface{}{"document_id": "doc1", "semantic_identifier": "Doc 1"},
},
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.MessageStartEvent)
if !ok {
t.Fatalf("expected MessageStartEvent, got %T", event)
}
if len(e.Documents) != 1 || e.Documents[0].DocumentID != "doc1" {
t.Error("expected 1 document with id doc1")
}
}
func TestMessageDelta(t *testing.T) {
line := makePacket(map[string]interface{}{"type": "message_delta", "content": "Hello"}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.MessageDeltaEvent)
if !ok {
t.Fatalf("expected MessageDeltaEvent, got %T", event)
}
if e.Content != "Hello" {
t.Errorf("got %s", e.Content)
}
}
func TestMessageDeltaEmpty(t *testing.T) {
line := makePacket(map[string]interface{}{"type": "message_delta", "content": ""}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.MessageDeltaEvent)
if !ok {
t.Fatalf("expected MessageDeltaEvent, got %T", event)
}
if e.Content != "" {
t.Errorf("expected empty, got %s", e.Content)
}
}
func TestSearchToolStart(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "search_tool_start", "is_internet_search": true,
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.SearchStartEvent)
if !ok {
t.Fatalf("expected SearchStartEvent, got %T", event)
}
if !e.IsInternetSearch {
t.Error("expected internet search")
}
}
func TestSearchToolQueries(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "search_tool_queries_delta",
"queries": []interface{}{"query 1", "query 2"},
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.SearchQueriesEvent)
if !ok {
t.Fatalf("expected SearchQueriesEvent, got %T", event)
}
if len(e.Queries) != 2 || e.Queries[0] != "query 1" {
t.Error("unexpected queries")
}
}
func TestSearchToolDocuments(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "search_tool_documents_delta",
"documents": []interface{}{
map[string]interface{}{"document_id": "d1", "semantic_identifier": "First Doc", "link": "http://example.com"},
map[string]interface{}{"document_id": "d2", "semantic_identifier": "Second Doc"},
},
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.SearchDocumentsEvent)
if !ok {
t.Fatalf("expected SearchDocumentsEvent, got %T", event)
}
if len(e.Documents) != 2 {
t.Errorf("expected 2 docs, got %d", len(e.Documents))
}
if e.Documents[0].Link == nil || *e.Documents[0].Link != "http://example.com" {
t.Error("expected link on first doc")
}
}
func TestReasoningStart(t *testing.T) {
line := makePacket(map[string]interface{}{"type": "reasoning_start"}, 0, 0)
event := ParseStreamLine(line)
if _, ok := event.(models.ReasoningStartEvent); !ok {
t.Fatalf("expected ReasoningStartEvent, got %T", event)
}
}
func TestReasoningDelta(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "reasoning_delta", "reasoning": "Let me think...",
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.ReasoningDeltaEvent)
if !ok {
t.Fatalf("expected ReasoningDeltaEvent, got %T", event)
}
if e.Reasoning != "Let me think..." {
t.Errorf("got %s", e.Reasoning)
}
}
func TestReasoningDone(t *testing.T) {
line := makePacket(map[string]interface{}{"type": "reasoning_done"}, 0, 0)
event := ParseStreamLine(line)
if _, ok := event.(models.ReasoningDoneEvent); !ok {
t.Fatalf("expected ReasoningDoneEvent, got %T", event)
}
}
func TestCitationInfo(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "citation_info", "citation_number": 1, "document_id": "doc_abc",
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.CitationEvent)
if !ok {
t.Fatalf("expected CitationEvent, got %T", event)
}
if e.CitationNumber != 1 || e.DocumentID != "doc_abc" {
t.Errorf("got %d, %s", e.CitationNumber, e.DocumentID)
}
}
func TestOpenURLStart(t *testing.T) {
line := makePacket(map[string]interface{}{"type": "open_url_start"}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.ToolStartEvent)
if !ok {
t.Fatalf("expected ToolStartEvent, got %T", event)
}
if e.Type != "open_url_start" {
t.Errorf("got type %s", e.Type)
}
}
func TestPythonToolStart(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "python_tool_start", "code": "print('hi')",
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.ToolStartEvent)
if !ok {
t.Fatalf("expected ToolStartEvent, got %T", event)
}
if e.ToolName != "Python Tool" {
t.Errorf("got %s", e.ToolName)
}
}
func TestCustomToolStart(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "custom_tool_start", "tool_name": "MyTool",
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.ToolStartEvent)
if !ok {
t.Fatalf("expected ToolStartEvent, got %T", event)
}
if e.ToolName != "MyTool" {
t.Errorf("got %s", e.ToolName)
}
}
func TestDeepResearchPlanDelta(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "deep_research_plan_delta", "content": "Step 1: ...",
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.DeepResearchPlanDeltaEvent)
if !ok {
t.Fatalf("expected DeepResearchPlanDeltaEvent, got %T", event)
}
if e.Content != "Step 1: ..." {
t.Errorf("got %s", e.Content)
}
}
func TestResearchAgentStart(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "research_agent_start", "research_task": "Find info about X",
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.ResearchAgentStartEvent)
if !ok {
t.Fatalf("expected ResearchAgentStartEvent, got %T", event)
}
if e.ResearchTask != "Find info about X" {
t.Errorf("got %s", e.ResearchTask)
}
}
func TestIntermediateReportDelta(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "intermediate_report_delta", "content": "Report text",
}, 0, 0)
event := ParseStreamLine(line)
e, ok := event.(models.IntermediateReportDeltaEvent)
if !ok {
t.Fatalf("expected IntermediateReportDeltaEvent, got %T", event)
}
if e.Content != "Report text" {
t.Errorf("got %s", e.Content)
}
}
func TestUnknownPacketType(t *testing.T) {
line := makePacket(map[string]interface{}{"type": "section_end"}, 0, 0)
event := ParseStreamLine(line)
if _, ok := event.(models.UnknownEvent); !ok {
t.Fatalf("expected UnknownEvent, got %T", event)
}
}
func TestUnknownTopLevel(t *testing.T) {
line := mustJSON(map[string]interface{}{"some_unknown_field": "value"})
event := ParseStreamLine(line)
if _, ok := event.(models.UnknownEvent); !ok {
t.Fatalf("expected UnknownEvent, got %T", event)
}
}
func TestPlacementPreserved(t *testing.T) {
line := makePacket(map[string]interface{}{
"type": "message_delta", "content": "x",
}, 3, 1)
event := ParseStreamLine(line)
e, ok := event.(models.MessageDeltaEvent)
if !ok {
t.Fatalf("expected MessageDeltaEvent, got %T", event)
}
if e.Placement == nil {
t.Fatal("expected placement")
}
if e.Placement.TurnIndex != 3 || e.Placement.TabIndex != 1 {
t.Errorf("got turn=%d tab=%d", e.Placement.TurnIndex, e.Placement.TabIndex)
}
}
func mustJSON(v interface{}) string {
b, err := json.Marshal(v)
if err != nil {
panic(err)
}
return string(b)
}


@@ -1,639 +0,0 @@
// Package tui implements the Bubble Tea TUI for Onyx CLI.
package tui
import (
"context"
"fmt"
"strconv"
"strings"
"time"
tea "github.com/charmbracelet/bubbletea"
"github.com/charmbracelet/lipgloss"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/models"
)
// Model is the root Bubble Tea model.
type Model struct {
config config.OnyxCliConfig
client *api.Client
viewport *viewport
input inputModel
status statusBar
width int
height int
// Chat state
chatSessionID *string
agentID int
agentName string
agents []models.AgentSummary
parentMessageID *int
isStreaming bool
streamCancel context.CancelFunc
streamCh <-chan models.StreamEvent
citations map[int]string
attachedFiles []models.FileDescriptorPayload
needsRename bool
agentStarted bool
// Quit state
quitPending bool
splashShown bool
initInputReady bool // true once terminal init responses have passed
}
// NewModel creates a new TUI model.
func NewModel(cfg config.OnyxCliConfig) Model {
client := api.NewClient(cfg)
parentID := -1
return Model{
config: cfg,
client: client,
viewport: newViewport(80),
input: newInputModel(),
status: newStatusBar(),
agentID: cfg.DefaultAgentID,
agentName: "Default",
parentMessageID: &parentID,
citations: make(map[int]string),
}
}
// Init initializes the model.
func (m Model) Init() tea.Cmd {
return loadAgentsCmd(m.client)
}
// Update handles messages.
func (m Model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
// Filter out terminal query responses (OSC 11 background color, cursor
// position reports, etc.) that arrive as key events with raw escape content.
// These arrive split across multiple key events, so we use a brief window
// after startup to swallow them all.
if keyMsg, ok := msg.(tea.KeyMsg); ok && !m.initInputReady {
// During init, drop ALL key events — they're terminal query responses
_ = keyMsg
return m, nil
}
switch msg := msg.(type) {
case tea.WindowSizeMsg:
m.width = msg.Width
m.height = msg.Height
m.viewport.setWidth(msg.Width)
m.status.setWidth(msg.Width)
m.input.textInput.Width = msg.Width - 4
if !m.splashShown {
m.splashShown = true
// bottomHeight = sep + input + sep + status = 4 (approx)
viewportHeight := msg.Height - 4
if viewportHeight < 1 {
viewportHeight = msg.Height
}
m.viewport.addSplash(viewportHeight)
// Delay input focus to let terminal query responses flush
return m, tea.Tick(100*time.Millisecond, func(time.Time) tea.Msg {
return inputReadyMsg{}
})
}
return m, nil
case tea.MouseMsg:
switch msg.Button {
case tea.MouseButtonWheelUp:
m.viewport.scrollUp(3, m.viewportHeight())
return m, nil
case tea.MouseButtonWheelDown:
m.viewport.scrollDown(3)
return m, nil
}
case tea.KeyMsg:
return m.handleKey(msg)
case submitMsg:
return m.handleSubmit(msg.text)
case fileDropMsg:
return m.handleFileDrop(msg.path)
case InitDoneMsg:
return m.handleInitDone(msg)
case api.StreamEventMsg:
return m.handleStreamEvent(msg)
case api.StreamDoneMsg:
return m.handleStreamDone(msg)
case AgentsLoadedMsg:
return m.handleAgentsLoaded(msg)
case SessionsLoadedMsg:
return m.handleSessionsLoaded(msg)
case SessionResumedMsg:
return m.handleSessionResumed(msg)
case FileUploadedMsg:
return m.handleFileUploaded(msg)
case inputReadyMsg:
m.initInputReady = true
m.input.textInput.Focus()
m.input.textInput.SetValue("")
return m, m.input.textInput.Cursor.BlinkCmd()
case resetQuitMsg:
m.quitPending = false
return m, nil
}
// Only forward messages to the text input after it's been focused
if m.splashShown {
var cmd tea.Cmd
m.input, cmd = m.input.update(msg)
return m, cmd
}
return m, nil
}
// viewportHeight returns the number of visible chat rows, accounting for the
// dynamic bottom area (separator, menu, file badges, input, status bar).
func (m Model) viewportHeight() int {
menuHeight := 0
if m.input.menuVisible {
menuHeight = len(m.input.menuItems)
}
fileHeight := 0
if len(m.input.attachedFiles) > 0 {
fileHeight = 1
}
h := m.height - (1 + menuHeight + fileHeight + 1 + 1 + 1)
if h < 1 {
return 1
}
return h
}
// View renders the UI.
func (m Model) View() string {
if m.width == 0 || m.height == 0 {
return ""
}
separator := lipgloss.NewStyle().Foreground(separatorColor).Render(
strings.Repeat("─", m.width),
)
menuView := m.input.viewMenu(m.width)
viewportHeight := m.viewportHeight()
var parts []string
parts = append(parts, m.viewport.view(viewportHeight))
parts = append(parts, separator)
if menuView != "" {
parts = append(parts, menuView)
}
parts = append(parts, m.input.viewInput())
parts = append(parts, separator)
parts = append(parts, m.status.view())
return strings.Join(parts, "\n")
}
// handleKey processes keyboard input.
func (m Model) handleKey(msg tea.KeyMsg) (tea.Model, tea.Cmd) {
switch msg.Type {
case tea.KeyEscape:
// Cancel streaming or close menu
if m.input.menuVisible {
m.input.menuVisible = false
return m, nil
}
if m.isStreaming {
return m.cancelStream()
}
// Dismiss picker
if m.viewport.pickerActive {
m.viewport.pickerActive = false
return m, nil
}
return m, nil
case tea.KeyCtrlD:
// If streaming, cancel first; require a fresh Ctrl+D pair to quit
if m.isStreaming {
return m.cancelStream()
}
if m.quitPending {
return m, tea.Quit
}
m.quitPending = true
m.viewport.addInfo("Press Ctrl+D again to quit.")
return m, tea.Tick(2*time.Second, func(t time.Time) tea.Msg {
return resetQuitMsg{}
})
case tea.KeyCtrlO:
m.viewport.showSources = !m.viewport.showSources
return m, nil
case tea.KeyEnter:
if m.viewport.pickerActive {
if len(m.viewport.pickerItems) > 0 {
item := m.viewport.pickerItems[m.viewport.pickerIndex]
if item.id == "" {
return m, nil
}
m.viewport.pickerActive = false
switch m.viewport.pickerType {
case pickerSession:
return cmdResume(m, item.id)
case pickerAgent:
return cmdSelectAgent(m, item.id)
}
}
return m, nil
}
case tea.KeyUp:
if m.viewport.pickerActive {
if m.viewport.pickerIndex > 0 {
m.viewport.pickerIndex--
}
return m, nil
}
case tea.KeyDown:
if m.viewport.pickerActive {
if m.viewport.pickerIndex < len(m.viewport.pickerItems)-1 {
m.viewport.pickerIndex++
}
return m, nil
}
case tea.KeyPgUp:
m.viewport.scrollUp(m.viewportHeight()/2, m.viewportHeight())
return m, nil
case tea.KeyPgDown:
m.viewport.scrollDown(m.viewportHeight() / 2)
return m, nil
case tea.KeyShiftUp:
m.viewport.scrollUp(3, m.viewportHeight())
return m, nil
case tea.KeyShiftDown:
m.viewport.scrollDown(3)
return m, nil
}
// Pass to input
var cmd tea.Cmd
m.input, cmd = m.input.update(msg)
return m, cmd
}
func (m Model) handleSubmit(text string) (tea.Model, tea.Cmd) {
if strings.HasPrefix(text, "/") {
return handleSlashCommand(m, text)
}
return m.sendMessage(text)
}
func (m Model) handleFileDrop(path string) (tea.Model, tea.Cmd) {
return cmdAttach(m, path)
}
func (m Model) cancelStream() (Model, tea.Cmd) {
if m.streamCancel != nil {
m.streamCancel()
}
if m.chatSessionID != nil {
sid := *m.chatSessionID
go m.client.StopChatSession(context.Background(), sid)
}
m, cmd := m.finishStream(nil)
m.viewport.addInfo("Generation stopped.")
return m, cmd
}
func (m Model) sendMessage(message string) (Model, tea.Cmd) {
if m.isStreaming {
return m, nil
}
m.viewport.addUserMessage(message)
m.viewport.startAgent()
// Prepare file descriptors
fileDescs := make([]models.FileDescriptorPayload, len(m.attachedFiles))
copy(fileDescs, m.attachedFiles)
m.attachedFiles = nil
m.input.clearFiles()
m.isStreaming = true
m.agentStarted = false
m.citations = make(map[int]string)
m.status.setStreaming(true)
ctx, cancel := context.WithCancel(context.Background())
m.streamCancel = cancel
ch := m.client.SendMessageStream(
ctx,
message,
m.chatSessionID,
m.agentID,
m.parentMessageID,
fileDescs,
)
m.streamCh = ch
return m, api.WaitForStreamEvent(ch)
}
func (m Model) handleStreamEvent(msg api.StreamEventMsg) (tea.Model, tea.Cmd) {
// Ignore stale events after cancellation
if !m.isStreaming {
return m, nil
}
switch e := msg.Event.(type) {
case models.SessionCreatedEvent:
m.chatSessionID = &e.ChatSessionID
m.needsRename = true
m.status.setSession(e.ChatSessionID)
case models.MessageIDEvent:
m.parentMessageID = &e.ReservedAgentMessageID
case models.MessageStartEvent:
m.agentStarted = true
case models.MessageDeltaEvent:
m.agentStarted = true
m.viewport.appendToken(e.Content)
case models.SearchStartEvent:
if e.IsInternetSearch {
m.viewport.addInfo("Web search…")
} else {
m.viewport.addInfo("Searching…")
}
case models.SearchQueriesEvent:
if len(e.Queries) > 0 {
queries := e.Queries
if len(queries) > 3 {
queries = queries[:3]
}
parts := make([]string, len(queries))
for i, q := range queries {
parts[i] = "\"" + q + "\""
}
m.viewport.addInfo("Searching: " + strings.Join(parts, ", "))
}
case models.SearchDocumentsEvent:
count := len(e.Documents)
suffix := "s"
if count == 1 {
suffix = ""
}
m.viewport.addInfo("Found " + strconv.Itoa(count) + " document" + suffix)
case models.ReasoningStartEvent:
m.viewport.addInfo("Thinking…")
case models.ReasoningDeltaEvent:
// We don't display reasoning text, just the indicator
case models.ReasoningDoneEvent:
// No-op
case models.CitationEvent:
m.citations[e.CitationNumber] = e.DocumentID
case models.ToolStartEvent:
m.viewport.addInfo("Using " + e.ToolName + "…")
case models.ResearchAgentStartEvent:
m.viewport.addInfo("Researching: " + e.ResearchTask)
case models.DeepResearchPlanDeltaEvent:
m.viewport.appendToken(e.Content)
case models.IntermediateReportDeltaEvent:
m.viewport.appendToken(e.Content)
case models.StopEvent:
return m.finishStream(nil)
case models.ErrorEvent:
m.viewport.addError(e.Error)
return m.finishStream(nil)
}
return m, api.WaitForStreamEvent(m.streamCh)
}
func (m Model) handleStreamDone(msg api.StreamDoneMsg) (tea.Model, tea.Cmd) {
// Ignore if already cancelled
if !m.isStreaming {
return m, nil
}
return m.finishStream(msg.Err)
}
func (m Model) finishStream(err error) (Model, tea.Cmd) {
m.viewport.finishAgent()
if m.agentStarted && len(m.citations) > 0 {
m.viewport.addCitations(m.citations)
}
m.isStreaming = false
m.agentStarted = false
m.status.setStreaming(false)
if m.streamCancel != nil {
m.streamCancel()
}
m.streamCancel = nil
m.streamCh = nil
// Auto-rename new sessions
if m.needsRename && m.chatSessionID != nil {
m.needsRename = false
sessionID := *m.chatSessionID
client := m.client
go func() {
_, _ = client.RenameChatSession(context.Background(), sessionID, nil)
}()
}
return m, nil
}
func (m Model) handleInitDone(msg InitDoneMsg) (tea.Model, tea.Cmd) {
if msg.Err != nil {
m.viewport.addWarning("Could not load agents. Using default.")
} else {
m.agents = msg.Agents
for _, p := range m.agents {
if p.ID == m.agentID {
m.agentName = p.Name
break
}
}
}
m.status.setServer(m.config.ServerURL)
m.status.setAgent(m.agentName)
return m, nil
}
func (m Model) handleAgentsLoaded(msg AgentsLoadedMsg) (tea.Model, tea.Cmd) {
if msg.Err != nil {
m.viewport.addError("Could not load agents: " + msg.Err.Error())
return m, nil
}
m.agents = msg.Agents
if len(m.agents) == 0 {
m.viewport.addInfo("No agents available.")
return m, nil
}
m.viewport.addInfo("Select an agent (Enter to select, Esc to cancel):")
var items []pickerItem
for _, p := range m.agents {
label := fmt.Sprintf("%d: %s", p.ID, p.Name)
if p.ID == m.agentID {
label += " *"
}
desc := p.Description
if len(desc) > 50 {
desc = desc[:50] + "..."
}
if desc != "" {
label += " - " + desc
}
items = append(items, pickerItem{
id: strconv.Itoa(p.ID),
label: label,
})
}
m.viewport.showPicker(pickerAgent, items)
return m, nil
}
func (m Model) handleSessionsLoaded(msg SessionsLoadedMsg) (tea.Model, tea.Cmd) {
if msg.Err != nil {
m.viewport.addError("Could not load sessions: " + msg.Err.Error())
return m, nil
}
if len(msg.Sessions) == 0 {
m.viewport.addInfo("No previous sessions found.")
return m, nil
}
m.viewport.addInfo("Select a session to resume (Enter to select, Esc to cancel):")
const maxSessions = 15
var items []pickerItem
for i, s := range msg.Sessions {
if i >= maxSessions {
break
}
name := "Untitled"
if s.Name != nil && *s.Name != "" {
name = *s.Name
}
sid := s.ID
if len(sid) > 8 {
sid = sid[:8]
}
items = append(items, pickerItem{
id: s.ID,
label: sid + " " + name + " (" + s.Created + ")",
})
}
if len(msg.Sessions) > maxSessions {
items = append(items, pickerItem{
id: "",
label: fmt.Sprintf("… and %d more (use /resume <id> to open)", len(msg.Sessions)-maxSessions),
})
}
m.viewport.showPicker(pickerSession, items)
return m, nil
}
func (m Model) handleSessionResumed(msg SessionResumedMsg) (tea.Model, tea.Cmd) {
if msg.Err != nil {
m.viewport.addError("Could not load session: " + msg.Err.Error())
return m, nil
}
// Cancel any in-progress stream before replacing the session
if m.isStreaming {
m, _ = m.cancelStream()
}
detail := msg.Detail
m.chatSessionID = &detail.ChatSessionID
m.viewport.clearDisplay()
m.status.setSession(detail.ChatSessionID)
if detail.AgentName != nil {
m.agentName = *detail.AgentName
m.status.setAgent(*detail.AgentName)
}
if detail.AgentID != nil {
m.agentID = *detail.AgentID
}
// Replay messages
for _, chatMsg := range detail.Messages {
switch chatMsg.MessageType {
case "user":
m.viewport.addUserMessage(chatMsg.Message)
case "assistant":
m.viewport.startAgent()
m.viewport.appendToken(chatMsg.Message)
m.viewport.finishAgent()
}
}
// Set parent to last message
if len(detail.Messages) > 0 {
lastID := detail.Messages[len(detail.Messages)-1].MessageID
m.parentMessageID = &lastID
}
desc := "Untitled"
if detail.Description != nil && *detail.Description != "" {
desc = *detail.Description
}
m.viewport.addInfo("Resumed session: " + desc)
return m, nil
}
func (m Model) handleFileUploaded(msg FileUploadedMsg) (tea.Model, tea.Cmd) {
if msg.Err != nil {
m.viewport.addError("Upload failed: " + msg.Err.Error())
return m, nil
}
m.attachedFiles = append(m.attachedFiles, *msg.Descriptor)
m.input.addFile(msg.FileName)
m.viewport.addInfo("Attached: " + msg.FileName)
return m, nil
}
type inputReadyMsg struct{}
type resetQuitMsg struct{}


@@ -1,200 +0,0 @@
package tui
import (
"context"
"fmt"
"strconv"
"strings"
tea "github.com/charmbracelet/bubbletea"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/models"
"github.com/onyx-dot-app/onyx/cli/internal/util"
)
// handleSlashCommand dispatches slash commands and returns updated model + cmd.
func handleSlashCommand(m Model, text string) (Model, tea.Cmd) {
parts := strings.SplitN(text, " ", 2)
command := strings.ToLower(parts[0])
arg := ""
if len(parts) > 1 {
arg = parts[1]
}
switch command {
case "/help":
m.viewport.addInfo(helpText)
return m, nil
case "/agent":
if arg != "" {
return cmdSelectAgent(m, arg)
}
return cmdShowAgents(m)
case "/attach":
return cmdAttach(m, arg)
case "/sessions", "/resume":
if strings.TrimSpace(arg) != "" {
return cmdResume(m, arg)
}
return cmdSessions(m)
case "/configure":
m.viewport.addInfo("Run 'onyx-cli configure' to change connection settings.")
return m, nil
case "/clear", "/new":
return cmdNew(m)
case "/connectors":
url := m.config.ServerURL + "/admin/indexing/status"
if util.OpenBrowser(url) {
m.viewport.addInfo("Opened " + url + " in browser")
} else {
m.viewport.addWarning("Failed to open browser. Visit: " + url)
}
return m, nil
case "/settings":
url := m.config.ServerURL + "/app/settings/general"
if util.OpenBrowser(url) {
m.viewport.addInfo("Opened " + url + " in browser")
} else {
m.viewport.addWarning("Failed to open browser. Visit: " + url)
}
return m, nil
case "/quit":
return m, tea.Quit
default:
m.viewport.addWarning(fmt.Sprintf("Unknown command: %s. Type /help for available commands.", command))
return m, nil
}
}
func cmdNew(m Model) (Model, tea.Cmd) {
if m.isStreaming {
m, _ = m.cancelStream()
}
m.chatSessionID = nil
parentID := -1
m.parentMessageID = &parentID
m.needsRename = false
m.citations = nil
m.viewport.clearAll()
// Re-add splash as a scrollable entry
viewportHeight := m.viewportHeight()
if viewportHeight < 1 {
viewportHeight = m.height
}
m.viewport.addSplash(viewportHeight)
m.status.setSession("")
return m, nil
}
func cmdShowAgents(m Model) (Model, tea.Cmd) {
m.viewport.addInfo("Loading agents...")
client := m.client
return m, func() tea.Msg {
agents, err := client.ListAgents(context.Background())
return AgentsLoadedMsg{Agents: agents, Err: err}
}
}
func cmdSelectAgent(m Model, idStr string) (Model, tea.Cmd) {
pid, err := strconv.Atoi(strings.TrimSpace(idStr))
if err != nil {
m.viewport.addWarning("Invalid agent ID. Use a number.")
return m, nil
}
var target *models.AgentSummary
for i := range m.agents {
if m.agents[i].ID == pid {
target = &m.agents[i]
break
}
}
if target == nil {
m.viewport.addWarning(fmt.Sprintf("Agent %d not found. Use /agent to see available agents.", pid))
return m, nil
}
m.agentID = target.ID
m.agentName = target.Name
m.status.setAgent(target.Name)
m.viewport.addInfo("Switched to agent: " + target.Name)
// Save preference
m.config.DefaultAgentID = target.ID
_ = config.Save(m.config)
return m, nil
}
func cmdAttach(m Model, pathStr string) (Model, tea.Cmd) {
if pathStr == "" {
m.viewport.addWarning("Usage: /attach <file_path>")
return m, nil
}
m.viewport.addInfo("Uploading " + pathStr + "...")
client := m.client
return m, func() tea.Msg {
fd, err := client.UploadFile(context.Background(), pathStr)
if err != nil {
return FileUploadedMsg{Err: err, FileName: pathStr}
}
return FileUploadedMsg{Descriptor: fd, FileName: pathStr}
}
}
func cmdSessions(m Model) (Model, tea.Cmd) {
m.viewport.addInfo("Loading sessions...")
client := m.client
return m, func() tea.Msg {
sessions, err := client.ListChatSessions(context.Background())
return SessionsLoadedMsg{Sessions: sessions, Err: err}
}
}
func cmdResume(m Model, sessionIDStr string) (Model, tea.Cmd) {
client := m.client
return m, func() tea.Msg {
targetID := sessionIDStr
// Short prefix — scan the list for a match
if len(sessionIDStr) < 36 {
sessions, err := client.ListChatSessions(context.Background())
if err != nil {
return SessionResumedMsg{Err: err}
}
for _, s := range sessions {
if strings.HasPrefix(s.ID, sessionIDStr) {
targetID = s.ID
break
}
}
}
detail, err := client.GetChatSession(context.Background(), targetID)
if err != nil {
return SessionResumedMsg{Err: fmt.Errorf("session not found: %s", sessionIDStr)}
}
return SessionResumedMsg{Detail: detail}
}
}
// loadAgentsCmd returns a tea.Cmd that loads agents from the API.
func loadAgentsCmd(client *api.Client) tea.Cmd {
return func() tea.Msg {
agents, err := client.ListAgents(context.Background())
return InitDoneMsg{Agents: agents, Err: err}
}
}


@@ -1,23 +0,0 @@
package tui
const helpText = `Onyx CLI Commands
/help Show this help message
/clear Clear chat and start a new session
/agent List and switch agents
/attach <path> Attach a file to next message
/sessions Browse and resume previous sessions
/configure Re-run connection setup
/connectors Open connectors page in browser
/settings Open Onyx settings in browser
/quit Exit Onyx CLI
Keyboard Shortcuts
Enter Send message
Escape Cancel current generation
Ctrl+O Toggle source citations
Ctrl+D Quit (press twice)
Scroll Up/Down Mouse wheel or Shift+Up/Down
Page Up/Down Scroll half page
`


@@ -1,241 +0,0 @@
package tui
import (
"os"
"path/filepath"
"strings"
"github.com/charmbracelet/bubbles/textinput"
tea "github.com/charmbracelet/bubbletea"
)
// slashCommand defines a slash command with its description.
type slashCommand struct {
command string
description string
}
var slashCommands = []slashCommand{
{"/help", "Show help message"},
{"/clear", "Clear chat and start a new session"},
{"/agent", "List and switch agents"},
{"/attach", "Attach a file to next message"},
{"/sessions", "Browse and resume previous sessions"},
{"/configure", "Re-run connection setup"},
{"/connectors", "Open connectors in browser"},
{"/settings", "Open settings in browser"},
{"/quit", "Exit Onyx CLI"},
}
// Commands that take arguments (filled in with trailing space on Tab/Enter).
var argCommands = map[string]bool{
"/attach": true,
}
// inputModel manages the text input and slash command menu.
type inputModel struct {
textInput textinput.Model
menuVisible bool
menuItems []slashCommand
menuIndex int
attachedFiles []string
}
func newInputModel() inputModel {
ti := textinput.New()
ti.Prompt = "" // We render our own prompt in viewInput()
ti.Placeholder = "Send a message…"
ti.CharLimit = 10000
// Don't focus here — focus after first WindowSizeMsg to avoid
// capturing terminal init escape sequences as input.
return inputModel{
textInput: ti,
}
}
func (m inputModel) update(msg tea.Msg) (inputModel, tea.Cmd) {
switch msg := msg.(type) {
case tea.KeyMsg:
return m.handleKey(msg)
}
var cmd tea.Cmd
m.textInput, cmd = m.textInput.Update(msg)
m = m.updateMenu()
return m, cmd
}
func (m inputModel) handleKey(msg tea.KeyMsg) (inputModel, tea.Cmd) {
switch msg.Type {
case tea.KeyUp:
if m.menuVisible && m.menuIndex > 0 {
m.menuIndex--
return m, nil
}
case tea.KeyDown:
if m.menuVisible && m.menuIndex < len(m.menuItems)-1 {
m.menuIndex++
return m, nil
}
case tea.KeyTab:
if m.menuVisible && len(m.menuItems) > 0 {
cmd := m.menuItems[m.menuIndex].command
if argCommands[cmd] {
m.textInput.SetValue(cmd + " ")
m.textInput.SetCursor(len(cmd) + 1)
} else {
m.textInput.SetValue(cmd)
m.textInput.SetCursor(len(cmd))
}
m.menuVisible = false
return m, nil
}
case tea.KeyEnter:
if m.menuVisible && len(m.menuItems) > 0 {
cmd := m.menuItems[m.menuIndex].command
if argCommands[cmd] {
m.textInput.SetValue(cmd + " ")
m.textInput.SetCursor(len(cmd) + 1)
m.menuVisible = false
return m, nil
}
// Execute immediately
m.textInput.SetValue("")
m.menuVisible = false
return m, func() tea.Msg { return submitMsg{text: cmd} }
}
text := strings.TrimSpace(m.textInput.Value())
if text == "" {
return m, nil
}
// Check for file path (drag-and-drop)
if dropped := detectFileDrop(text); dropped != "" {
m.textInput.SetValue("")
return m, func() tea.Msg { return fileDropMsg{path: dropped} }
}
m.textInput.SetValue("")
m.menuVisible = false
return m, func() tea.Msg { return submitMsg{text: text} }
case tea.KeyEscape:
if m.menuVisible {
m.menuVisible = false
return m, nil
}
}
var cmd tea.Cmd
m.textInput, cmd = m.textInput.Update(msg)
m = m.updateMenu()
return m, cmd
}
func (m inputModel) updateMenu() inputModel {
val := strings.TrimSpace(m.textInput.Value())
if strings.HasPrefix(val, "/") && !strings.Contains(val, " ") {
needle := strings.ToLower(val)
var filtered []slashCommand
for _, sc := range slashCommands {
if strings.HasPrefix(sc.command, needle) {
filtered = append(filtered, sc)
}
}
if len(filtered) > 0 {
m.menuVisible = true
m.menuItems = filtered
if m.menuIndex >= len(filtered) {
m.menuIndex = 0
}
} else {
m.menuVisible = false
}
} else {
m.menuVisible = false
}
return m
}
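The menu filtering in `updateMenu` above is a case-insensitive prefix match that only activates while a bare slash command is being typed. A standalone sketch (hypothetical `filterCommands` helper, assuming command names are stored lowercase as in `slashCommands`):

```go
package main

import (
	"fmt"
	"strings"
)

// filterCommands keeps the commands whose name starts with the typed value.
// Returns nil (menu hidden) for non-slash input or once arguments follow.
func filterCommands(input string, commands []string) []string {
	needle := strings.ToLower(strings.TrimSpace(input))
	if !strings.HasPrefix(needle, "/") || strings.Contains(needle, " ") {
		return nil
	}
	var out []string
	for _, c := range commands {
		if strings.HasPrefix(c, needle) {
			out = append(out, c)
		}
	}
	return out
}

func main() {
	cmds := []string{"/help", "/clear", "/configure", "/connectors"}
	fmt.Println(filterCommands("/c", cmds)) // [/clear /configure /connectors]
}
```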
func (m *inputModel) addFile(name string) {
m.attachedFiles = append(m.attachedFiles, name)
}
func (m *inputModel) clearFiles() {
m.attachedFiles = nil
}
// submitMsg is sent when user submits text.
type submitMsg struct {
text string
}
// fileDropMsg is sent when a file path is detected.
type fileDropMsg struct {
path string
}
// detectFileDrop checks if the text looks like a file path.
func detectFileDrop(text string) string {
cleaned := strings.Trim(text, "'\"")
if cleaned == "" {
return ""
}
// Only treat as a file drop if it looks explicitly path-like
if !strings.HasPrefix(cleaned, "/") && !strings.HasPrefix(cleaned, "~") &&
!strings.HasPrefix(cleaned, "./") && !strings.HasPrefix(cleaned, "../") {
return ""
}
// Expand ~ to home dir
if strings.HasPrefix(cleaned, "~") {
home, err := os.UserHomeDir()
if err == nil {
cleaned = filepath.Join(home, cleaned[1:])
}
}
abs, err := filepath.Abs(cleaned)
if err != nil {
return ""
}
info, err := os.Stat(abs)
if err != nil {
return ""
}
if info.IsDir() {
return ""
}
return abs
}
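The path-shape test at the top of `detectFileDrop` can be checked without touching the filesystem. A minimal sketch (hypothetical `looksLikePath` helper covering only the prefix check, not the `os.Stat` step):

```go
package main

import (
	"fmt"
	"strings"
)

// looksLikePath reports whether the quote-trimmed input is explicitly
// path-like, mirroring the prefix test in detectFileDrop above.
func looksLikePath(text string) bool {
	cleaned := strings.Trim(text, "'\"")
	if cleaned == "" {
		return false
	}
	// Terminals typically quote dragged paths, hence the Trim above.
	for _, p := range []string{"/", "~", "./", "../"} {
		if strings.HasPrefix(cleaned, p) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(looksLikePath("'/tmp/report.pdf'")) // true
	fmt.Println(looksLikePath("hello world"))       // false
}
```

Requiring an explicit path prefix avoids misclassifying ordinary messages that merely mention a filename.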
// viewMenu renders the slash command menu.
func (m inputModel) viewMenu(width int) string {
if !m.menuVisible || len(m.menuItems) == 0 {
return ""
}
var lines []string
for i, item := range m.menuItems {
prefix := " "
if i == m.menuIndex {
prefix = "> "
}
line := prefix + item.command + " " + statusMsgStyle.Render(item.description)
lines = append(lines, line)
}
return strings.Join(lines, "\n")
}
// viewInput renders the input line with prompt and optional file badges.
func (m inputModel) viewInput() string {
var parts []string
if len(m.attachedFiles) > 0 {
badges := strings.Join(m.attachedFiles, "] [")
parts = append(parts, statusMsgStyle.Render("Attached: ["+badges+"]"))
}
parts = append(parts, inputPrompt+m.textInput.View())
return strings.Join(parts, "\n")
}

View File

@@ -1,36 +0,0 @@
package tui
import (
"github.com/onyx-dot-app/onyx/cli/internal/models"
)
// InitDoneMsg signals that async initialization is complete.
type InitDoneMsg struct {
Agents []models.AgentSummary
Err error
}
// SessionsLoadedMsg carries loaded chat sessions.
type SessionsLoadedMsg struct {
Sessions []models.ChatSessionDetails
Err error
}
// SessionResumedMsg carries a loaded session detail.
type SessionResumedMsg struct {
Detail *models.ChatSessionDetailResponse
Err error
}
// FileUploadedMsg carries an uploaded file descriptor.
type FileUploadedMsg struct {
Descriptor *models.FileDescriptorPayload
FileName string
Err error
}
// AgentsLoadedMsg carries freshly fetched agents from the API.
type AgentsLoadedMsg struct {
Agents []models.AgentSummary
Err error
}

View File

@@ -1,79 +0,0 @@
package tui
import (
"strings"
"github.com/charmbracelet/lipgloss"
)
const onyxLogo = ` ██████╗ ███╗ ██╗██╗ ██╗██╗ ██╗
██╔═══██╗████╗ ██║╚██╗ ██╔╝╚██╗██╔╝
██║ ██║██╔██╗ ██║ ╚████╔╝ ╚███╔╝
██║ ██║██║╚██╗██║ ╚██╔╝ ██╔██╗
╚██████╔╝██║ ╚████║ ██║ ██╔╝ ██╗
╚═════╝ ╚═╝ ╚═══╝ ╚═╝ ╚═╝ ╚═╝`
const tagline = "Your terminal interface for Onyx"
const splashHint = "Type a message to begin · /help for commands"
// renderSplash renders the splash screen centered for the given dimensions.
func renderSplash(width, height int) string {
// Render the logo as a single block (don't center individual lines)
logo := splashStyle.Render(onyxLogo)
// Center tagline and hint relative to the logo block width
logoWidth := lipgloss.Width(logo)
tag := lipgloss.NewStyle().Width(logoWidth).Align(lipgloss.Center).Render(
taglineStyle.Render(tagline),
)
hint := lipgloss.NewStyle().Width(logoWidth).Align(lipgloss.Center).Render(
hintStyle.Render(splashHint),
)
block := lipgloss.JoinVertical(lipgloss.Left, logo, "", tag, hint)
return lipgloss.Place(width, height, lipgloss.Center, lipgloss.Center, block)
}
// RenderSplashOnboarding renders splash for the terminal onboarding screen.
func RenderSplashOnboarding(width, height int) string {
// Render the logo as a styled block, then center it as a unit
styledLogo := splashStyle.Render(onyxLogo)
logoWidth := lipgloss.Width(styledLogo)
logoLines := strings.Split(styledLogo, "\n")
logoHeight := len(logoLines)
contentHeight := logoHeight + 2 // logo + blank + tagline
topPad := (height - contentHeight) / 2
if topPad < 1 {
topPad = 1
}
// Center the entire logo block horizontally
blockPad := (width - logoWidth) / 2
if blockPad < 0 {
blockPad = 0
}
var b strings.Builder
for i := 0; i < topPad; i++ {
b.WriteByte('\n')
}
for _, line := range logoLines {
b.WriteString(strings.Repeat(" ", blockPad))
b.WriteString(line)
b.WriteByte('\n')
}
b.WriteByte('\n')
tagPad := (width - len(tagline)) / 2
if tagPad < 0 {
tagPad = 0
}
b.WriteString(strings.Repeat(" ", tagPad))
b.WriteString(taglineStyle.Render(tagline))
b.WriteByte('\n')
return b.String()
}

View File

@@ -1,60 +0,0 @@
package tui
import (
"strings"
"github.com/charmbracelet/lipgloss"
)
// statusBar manages the footer status display.
type statusBar struct {
agentName string
serverURL string
sessionID string
streaming bool
width int
}
func newStatusBar() statusBar {
return statusBar{
agentName: "Default",
}
}
func (s *statusBar) setAgent(name string) { s.agentName = name }
func (s *statusBar) setServer(url string) { s.serverURL = url }
func (s *statusBar) setSession(id string) {
if len(id) > 8 {
id = id[:8]
}
s.sessionID = id
}
func (s *statusBar) setStreaming(v bool) { s.streaming = v }
func (s *statusBar) setWidth(w int) { s.width = w }
func (s statusBar) view() string {
var leftParts []string
if s.serverURL != "" {
leftParts = append(leftParts, s.serverURL)
}
name := s.agentName
if name == "" {
name = "Default"
}
leftParts = append(leftParts, name)
left := statusBarStyle.Render(strings.Join(leftParts, " · "))
right := "Ctrl+D to quit"
if s.streaming {
right = "Esc to cancel"
}
rightRendered := statusBarStyle.Render(right)
// Fill space between left and right
gap := s.width - lipgloss.Width(left) - lipgloss.Width(rightRendered)
if gap < 1 {
gap = 1
}
return left + strings.Repeat(" ", gap) + rightRendered
}
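
The left/right layout in `view` above is a common single-line trick: measure both sides and pad the middle to the full width. A sketch without lipgloss styling (plain `len` widths, which is only safe for ASCII; the real code uses `lipgloss.Width` to handle ANSI codes and wide runes):

```go
package main

import (
	"fmt"
	"strings"
)

// joinEnds pads between left and right so the line fills width columns,
// keeping at least one space between them, as the status bar above does.
func joinEnds(left, right string, width int) string {
	gap := width - len(left) - len(right)
	if gap < 1 {
		gap = 1
	}
	return left + strings.Repeat(" ", gap) + right
}

func main() {
	line := joinEnds("server | Default", "Ctrl+D to quit", 40)
	fmt.Println(len(line)) // 40
}
```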

View File

@@ -1,29 +0,0 @@
package tui
import "github.com/charmbracelet/lipgloss"
var (
// Colors
accentColor = lipgloss.Color("#6c8ebf")
dimColor = lipgloss.Color("#555577")
errorColor = lipgloss.Color("#ff5555")
splashColor = lipgloss.Color("#7C6AEF")
separatorColor = lipgloss.Color("#333355")
citationColor = lipgloss.Color("#666688")
// Styles
userPrefixStyle = lipgloss.NewStyle().Foreground(dimColor)
agentDot = lipgloss.NewStyle().Foreground(accentColor).Bold(true).Render("◉")
infoStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#b0b0cc"))
dimInfoStyle = lipgloss.NewStyle().Foreground(dimColor)
statusMsgStyle = dimInfoStyle // used for slash menu descriptions, file badges
errorStyle = lipgloss.NewStyle().Foreground(errorColor).Bold(true)
warnStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#ffcc00"))
citationStyle = lipgloss.NewStyle().Foreground(citationColor)
statusBarStyle = lipgloss.NewStyle().Foreground(dimColor)
inputPrompt = lipgloss.NewStyle().Foreground(accentColor).Render(" ")
splashStyle = lipgloss.NewStyle().Foreground(splashColor).Bold(true)
taglineStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#A0A0A0"))
hintStyle = lipgloss.NewStyle().Foreground(dimColor)
)

View File

@@ -1,448 +0,0 @@
package tui
import (
"fmt"
"sort"
"strings"
"github.com/charmbracelet/glamour"
"github.com/charmbracelet/glamour/styles"
"github.com/charmbracelet/lipgloss"
)
// entryKind is the type of chat entry.
type entryKind int
const (
entryUser entryKind = iota
entryAgent
entryInfo
entryError
entryCitation
)
// chatEntry is a single rendered entry in the chat history.
type chatEntry struct {
kind entryKind
content string // raw content (for agent: the markdown source)
rendered string // pre-rendered output
citations []string // citation lines (for citation entries)
}
// pickerKind distinguishes what the picker is selecting.
type pickerKind int
const (
pickerSession pickerKind = iota
pickerAgent
)
// pickerItem is a selectable item in the picker.
type pickerItem struct {
id string
label string
}
// viewport manages the chat display.
type viewport struct {
entries []chatEntry
width int
streaming bool
streamBuf string
showSources bool
renderer *glamour.TermRenderer
pickerItems []pickerItem
pickerActive bool
pickerIndex int
pickerType pickerKind
scrollOffset int // lines scrolled up from bottom (0 = pinned to bottom)
}
// newMarkdownRenderer creates a Glamour renderer with zero left margin.
func newMarkdownRenderer(width int) *glamour.TermRenderer {
style := styles.DarkStyleConfig
zero := uint(0)
style.Document.Margin = &zero
r, _ := glamour.NewTermRenderer(
glamour.WithStyles(style),
glamour.WithWordWrap(width-4),
)
return r
}
func newViewport(width int) *viewport {
return &viewport{
width: width,
renderer: newMarkdownRenderer(width),
}
}
func (v *viewport) addSplash(height int) {
splash := renderSplash(v.width, height)
v.entries = append(v.entries, chatEntry{
kind: entryInfo,
rendered: splash,
})
}
func (v *viewport) setWidth(w int) {
v.width = w
v.renderer = newMarkdownRenderer(w)
for i := range v.entries {
if v.entries[i].kind == entryAgent && v.entries[i].content != "" {
v.entries[i].rendered = v.renderAgentContent(v.entries[i].content)
}
}
}
func (v *viewport) addUserMessage(msg string) {
rendered := "\n" + userPrefixStyle.Render(" ") + msg
v.entries = append(v.entries, chatEntry{
kind: entryUser,
content: msg,
rendered: rendered,
})
}
func (v *viewport) startAgent() {
v.streaming = true
v.streamBuf = ""
// Add a blank-line spacer entry before the agent message
v.entries = append(v.entries, chatEntry{kind: entryInfo, rendered: ""})
}
func (v *viewport) appendToken(token string) {
v.streamBuf += token
}
func (v *viewport) finishAgent() {
if v.streamBuf == "" {
v.streaming = false
// Remove the blank spacer entry added by startAgent()
if len(v.entries) > 0 && v.entries[len(v.entries)-1].kind == entryInfo && v.entries[len(v.entries)-1].rendered == "" {
v.entries = v.entries[:len(v.entries)-1]
}
return
}
rendered := v.renderAgentContent(v.streamBuf)
v.entries = append(v.entries, chatEntry{
kind: entryAgent,
content: v.streamBuf,
rendered: rendered,
})
v.streaming = false
v.streamBuf = ""
}
func (v *viewport) renderAgentContent(content string) string {
rendered := v.renderMarkdown(content)
rendered = strings.TrimLeft(rendered, "\n")
rendered = strings.TrimRight(rendered, "\n")
lines := strings.Split(rendered, "\n")
if len(lines) > 0 {
lines[0] = agentDot + " " + lines[0]
for i := 1; i < len(lines); i++ {
lines[i] = " " + lines[i]
}
}
return strings.Join(lines, "\n")
}
func (v *viewport) renderMarkdown(md string) string {
if v.renderer == nil {
return md
}
out, err := v.renderer.Render(md)
if err != nil {
return md
}
return out
}
func (v *viewport) addInfo(msg string) {
rendered := infoStyle.Render("● " + msg)
v.entries = append(v.entries, chatEntry{
kind: entryInfo,
content: msg,
rendered: rendered,
})
}
func (v *viewport) addWarning(msg string) {
rendered := warnStyle.Render("● " + msg)
v.entries = append(v.entries, chatEntry{
kind: entryError,
content: msg,
rendered: rendered,
})
}
func (v *viewport) addError(msg string) {
rendered := errorStyle.Render("● Error: ") + msg
v.entries = append(v.entries, chatEntry{
kind: entryError,
content: msg,
rendered: rendered,
})
}
func (v *viewport) addCitations(citations map[int]string) {
if len(citations) == 0 {
return
}
keys := make([]int, 0, len(citations))
for k := range citations {
keys = append(keys, k)
}
sort.Ints(keys)
var parts []string
for _, num := range keys {
parts = append(parts, fmt.Sprintf("[%d] %s", num, citations[num]))
}
text := fmt.Sprintf("Sources (%d): %s", len(citations), strings.Join(parts, " "))
var citLines []string
citLines = append(citLines, text)
v.entries = append(v.entries, chatEntry{
kind: entryCitation,
content: text,
rendered: citationStyle.Render("● "+text),
citations: citLines,
})
}
func (v *viewport) showPicker(kind pickerKind, items []pickerItem) {
v.pickerItems = items
v.pickerType = kind
v.pickerActive = true
v.pickerIndex = 0
}
func (v *viewport) scrollUp(n int, height int) {
v.scrollOffset += n
maxScroll := v.totalLines() - height
if maxScroll < 0 {
maxScroll = 0
}
if v.scrollOffset > maxScroll {
v.scrollOffset = maxScroll
}
}
func (v *viewport) scrollDown(n int) {
v.scrollOffset -= n
if v.scrollOffset < 0 {
v.scrollOffset = 0
}
}
func (v *viewport) clearAll() {
v.entries = nil
v.streaming = false
v.streamBuf = ""
v.pickerItems = nil
v.pickerActive = false
v.scrollOffset = 0
}
func (v *viewport) clearDisplay() {
v.entries = nil
v.scrollOffset = 0
v.streaming = false
v.streamBuf = ""
}
// pickerTitle returns a title for the current picker kind.
func (v *viewport) pickerTitle() string {
switch v.pickerType {
case pickerAgent:
return "Select Agent"
case pickerSession:
return "Resume Session"
default:
return "Select"
}
}
// renderPicker renders the picker as a bordered overlay.
func (v *viewport) renderPicker(width, height int) string {
title := v.pickerTitle()
// Determine picker dimensions
maxItems := len(v.pickerItems)
panelWidth := width - 4
if panelWidth < 30 {
panelWidth = 30
}
if panelWidth > 70 {
panelWidth = 70
}
innerWidth := panelWidth - 4 // border + padding
// Visible window of items (scroll if too many)
maxVisible := height - 6 // room for border, title, hint
if maxVisible < 3 {
maxVisible = 3
}
if maxVisible > maxItems {
maxVisible = maxItems
}
// Calculate scroll window around current index
startIdx := 0
if v.pickerIndex >= maxVisible {
startIdx = v.pickerIndex - maxVisible + 1
}
endIdx := startIdx + maxVisible
if endIdx > maxItems {
endIdx = maxItems
startIdx = endIdx - maxVisible
if startIdx < 0 {
startIdx = 0
}
}
var itemLines []string
for i := startIdx; i < endIdx; i++ {
item := v.pickerItems[i]
label := item.label
labelRunes := []rune(label)
if len(labelRunes) > innerWidth-4 {
label = string(labelRunes[:innerWidth-7]) + "..."
}
if i == v.pickerIndex {
line := lipgloss.NewStyle().Foreground(accentColor).Bold(true).Render("> " + label)
itemLines = append(itemLines, line)
} else {
itemLines = append(itemLines, " "+label)
}
}
hint := lipgloss.NewStyle().Foreground(dimColor).Render("↑↓ navigate • enter select • esc cancel")
body := strings.Join(itemLines, "\n") + "\n\n" + hint
panel := lipgloss.NewStyle().
Border(lipgloss.RoundedBorder()).
BorderForeground(accentColor).
Padding(1, 2).
Width(panelWidth).
Render(body)
titleRendered := lipgloss.NewStyle().
Foreground(accentColor).
Bold(true).
Render(" " + title + " ")
// Build top border manually to avoid ANSI-corrupted rune slicing.
// panelWidth+2 accounts for the left and right border characters.
borderColor := lipgloss.NewStyle().Foreground(accentColor)
titleWidth := lipgloss.Width(titleRendered)
rightDashes := panelWidth + 2 - 3 - titleWidth // total - "╭─" - "╮" - title
if rightDashes < 0 {
rightDashes = 0
}
topBorder := borderColor.Render("╭─") + titleRendered +
borderColor.Render(strings.Repeat("─", rightDashes)+"╮")
panelLines := strings.Split(panel, "\n")
if len(panelLines) > 0 {
panelLines[0] = topBorder
}
panel = strings.Join(panelLines, "\n")
// Center the panel in the viewport
return lipgloss.Place(width, height, lipgloss.Center, lipgloss.Center, panel)
}
// totalLines computes the total number of rendered content lines.
func (v *viewport) totalLines() int {
var lines []string
for _, e := range v.entries {
if e.kind == entryCitation && !v.showSources {
continue
}
lines = append(lines, e.rendered)
}
if v.streaming && v.streamBuf != "" {
bufLines := strings.Split(v.streamBuf, "\n")
if len(bufLines) > 0 {
bufLines[0] = agentDot + " " + bufLines[0]
for i := 1; i < len(bufLines); i++ {
bufLines[i] = " " + bufLines[i]
}
}
lines = append(lines, strings.Join(bufLines, "\n"))
} else if v.streaming {
lines = append(lines, agentDot+" ")
}
content := strings.Join(lines, "\n")
return len(strings.Split(content, "\n"))
}
// view renders the full viewport content.
func (v *viewport) view(height int) string {
// If picker is active, render it as an overlay
if v.pickerActive && len(v.pickerItems) > 0 {
return v.renderPicker(v.width, height)
}
var lines []string
for _, e := range v.entries {
if e.kind == entryCitation && !v.showSources {
continue
}
lines = append(lines, e.rendered)
}
// Streaming buffer (plain text, not markdown)
if v.streaming && v.streamBuf != "" {
bufLines := strings.Split(v.streamBuf, "\n")
if len(bufLines) > 0 {
bufLines[0] = agentDot + " " + bufLines[0]
for i := 1; i < len(bufLines); i++ {
bufLines[i] = " " + bufLines[i]
}
}
lines = append(lines, strings.Join(bufLines, "\n"))
} else if v.streaming {
lines = append(lines, agentDot+" ")
}
content := strings.Join(lines, "\n")
contentLines := strings.Split(content, "\n")
total := len(contentLines)
maxScroll := total - height
if maxScroll < 0 {
maxScroll = 0
}
scrollOffset := v.scrollOffset
if scrollOffset > maxScroll {
scrollOffset = maxScroll
}
if total <= height {
// Content fits — pad with empty lines at top to push content down
padding := make([]string, height-total)
for i := range padding {
padding[i] = ""
}
contentLines = append(padding, contentLines...)
} else {
// Show a window: end is (total - scrollOffset), start is (end - height)
end := total - scrollOffset
start := end - height
if start < 0 {
start = 0
}
contentLines = contentLines[start:end]
}
return strings.Join(contentLines, "\n")
}

View File

@@ -1,264 +0,0 @@
package tui
import (
"regexp"
"strings"
"testing"
)
// stripANSI removes ANSI escape sequences for test comparisons.
var ansiRegex = regexp.MustCompile(`\x1b\[[0-9;]*m`)
func stripANSI(s string) string {
return ansiRegex.ReplaceAllString(s, "")
}
func TestAddUserMessage(t *testing.T) {
v := newViewport(80)
v.addUserMessage("hello world")
if len(v.entries) != 1 {
t.Fatalf("expected 1 entry, got %d", len(v.entries))
}
e := v.entries[0]
if e.kind != entryUser {
t.Errorf("expected entryUser, got %d", e.kind)
}
if e.content != "hello world" {
t.Errorf("expected content 'hello world', got %q", e.content)
}
	plain := stripANSI(e.rendered)
	if !strings.Contains(plain, "hello world") {
		t.Errorf("expected rendered to contain message text, got %q", plain)
	}
}
func TestStartAndFinishAgent(t *testing.T) {
v := newViewport(80)
v.startAgent()
if !v.streaming {
t.Error("expected streaming to be true after startAgent")
}
if len(v.entries) != 1 {
t.Fatalf("expected 1 spacer entry, got %d", len(v.entries))
}
if v.entries[0].rendered != "" {
t.Errorf("expected empty spacer, got %q", v.entries[0].rendered)
}
v.appendToken("Hello ")
v.appendToken("world")
if v.streamBuf != "Hello world" {
t.Errorf("expected streamBuf 'Hello world', got %q", v.streamBuf)
}
v.finishAgent()
if v.streaming {
t.Error("expected streaming to be false after finishAgent")
}
if v.streamBuf != "" {
t.Errorf("expected empty streamBuf after finish, got %q", v.streamBuf)
}
if len(v.entries) != 2 {
t.Fatalf("expected 2 entries (spacer + agent), got %d", len(v.entries))
}
e := v.entries[1]
if e.kind != entryAgent {
t.Errorf("expected entryAgent, got %d", e.kind)
}
if e.content != "Hello world" {
t.Errorf("expected content 'Hello world', got %q", e.content)
}
plain := stripANSI(e.rendered)
if !strings.Contains(plain, "Hello world") {
t.Errorf("expected rendered to contain message text, got %q", plain)
}
}
func TestFinishAgentNoPadding(t *testing.T) {
v := newViewport(80)
v.startAgent()
v.appendToken("Test message")
v.finishAgent()
e := v.entries[1]
// First line should not start with plain spaces (ANSI codes are OK)
plain := stripANSI(e.rendered)
lines := strings.Split(plain, "\n")
if strings.HasPrefix(lines[0], " ") {
t.Errorf("first line should not start with spaces, got %q", lines[0])
}
}
func TestFinishAgentMultiline(t *testing.T) {
v := newViewport(80)
v.startAgent()
v.appendToken("Line one\n\nLine three")
v.finishAgent()
e := v.entries[1]
plain := stripANSI(e.rendered)
// Glamour may merge or reformat lines; just check content is present
if !strings.Contains(plain, "Line one") {
t.Errorf("expected 'Line one' in rendered, got %q", plain)
}
if !strings.Contains(plain, "Line three") {
t.Errorf("expected 'Line three' in rendered, got %q", plain)
}
}
func TestFinishAgentEmpty(t *testing.T) {
v := newViewport(80)
v.startAgent()
v.finishAgent()
if v.streaming {
t.Error("expected streaming to be false")
}
if len(v.entries) != 0 {
t.Errorf("expected 0 entries (spacer removed), got %d", len(v.entries))
}
}
func TestAddInfo(t *testing.T) {
v := newViewport(80)
v.addInfo("test info")
if len(v.entries) != 1 {
t.Fatalf("expected 1 entry, got %d", len(v.entries))
}
e := v.entries[0]
if e.kind != entryInfo {
t.Errorf("expected entryInfo, got %d", e.kind)
}
plain := stripANSI(e.rendered)
if strings.HasPrefix(plain, " ") {
t.Errorf("info should not have leading spaces, got %q", plain)
}
}
func TestAddError(t *testing.T) {
v := newViewport(80)
v.addError("something broke")
if len(v.entries) != 1 {
t.Fatalf("expected 1 entry, got %d", len(v.entries))
}
e := v.entries[0]
if e.kind != entryError {
t.Errorf("expected entryError, got %d", e.kind)
}
plain := stripANSI(e.rendered)
if !strings.Contains(plain, "something broke") {
t.Errorf("expected error message in rendered, got %q", plain)
}
}
func TestAddCitations(t *testing.T) {
v := newViewport(80)
v.addCitations(map[int]string{1: "doc-a", 2: "doc-b"})
if len(v.entries) != 1 {
t.Fatalf("expected 1 entry, got %d", len(v.entries))
}
e := v.entries[0]
if e.kind != entryCitation {
t.Errorf("expected entryCitation, got %d", e.kind)
}
plain := stripANSI(e.rendered)
if !strings.Contains(plain, "Sources (2)") {
t.Errorf("expected sources count in rendered, got %q", plain)
}
if strings.HasPrefix(plain, " ") {
t.Errorf("citation should not have leading spaces, got %q", plain)
}
}
func TestAddCitationsEmpty(t *testing.T) {
v := newViewport(80)
v.addCitations(map[int]string{})
if len(v.entries) != 0 {
t.Errorf("expected no entries for empty citations, got %d", len(v.entries))
}
}
func TestCitationVisibility(t *testing.T) {
v := newViewport(80)
v.addInfo("hello")
v.addCitations(map[int]string{1: "doc"})
v.showSources = false
view := v.view(20)
plain := stripANSI(view)
if strings.Contains(plain, "Sources") {
t.Error("expected citations hidden when showSources=false")
}
v.showSources = true
view = v.view(20)
plain = stripANSI(view)
if !strings.Contains(plain, "Sources") {
t.Error("expected citations visible when showSources=true")
}
}
func TestClearAll(t *testing.T) {
v := newViewport(80)
v.addUserMessage("test")
v.startAgent()
v.appendToken("response")
v.clearAll()
if len(v.entries) != 0 {
t.Errorf("expected no entries after clearAll, got %d", len(v.entries))
}
if v.streaming {
t.Error("expected streaming=false after clearAll")
}
if v.streamBuf != "" {
t.Errorf("expected empty streamBuf after clearAll, got %q", v.streamBuf)
}
}
func TestClearDisplay(t *testing.T) {
v := newViewport(80)
v.addUserMessage("test")
v.clearDisplay()
if len(v.entries) != 0 {
t.Errorf("expected no entries after clearDisplay, got %d", len(v.entries))
}
}
func TestViewPadsShortContent(t *testing.T) {
v := newViewport(80)
v.addInfo("hello")
view := v.view(10)
lines := strings.Split(view, "\n")
if len(lines) != 10 {
t.Errorf("expected 10 lines (padded), got %d", len(lines))
}
}
func TestViewTruncatesTallContent(t *testing.T) {
v := newViewport(80)
for i := 0; i < 20; i++ {
v.addInfo("line")
}
view := v.view(5)
lines := strings.Split(view, "\n")
if len(lines) != 5 {
t.Errorf("expected 5 lines (truncated), got %d", len(lines))
}
}

View File

@@ -1,29 +0,0 @@
// Package util provides shared utility functions.
package util
import (
"os/exec"
"runtime"
)
// OpenBrowser opens the given URL in the user's default browser.
// Returns true if the browser was launched successfully.
func OpenBrowser(url string) bool {
var cmd *exec.Cmd
switch runtime.GOOS {
case "darwin":
cmd = exec.Command("open", url)
case "linux":
cmd = exec.Command("xdg-open", url)
case "windows":
cmd = exec.Command("rundll32", "url.dll,FileProtocolHandler", url)
}
if cmd != nil {
if err := cmd.Start(); err == nil {
// Reap the child process to avoid zombies.
go func() { _ = cmd.Wait() }()
return true
}
}
return false
}

View File

@@ -1,13 +0,0 @@
// Package util provides shared utilities for the Onyx CLI.
package util
import "github.com/charmbracelet/lipgloss"
// Shared text styles used across the CLI.
var (
BoldStyle = lipgloss.NewStyle().Bold(true)
DimStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#555577"))
GreenStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#00cc66")).Bold(true)
RedStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#ff5555")).Bold(true)
YellowStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#ffcc00"))
)

View File

@@ -1,23 +0,0 @@
package main
import (
"fmt"
"os"
"github.com/onyx-dot-app/onyx/cli/cmd"
)
var (
version = "dev"
commit = "none"
)
func main() {
cmd.Version = version
cmd.Commit = commit
if err := cmd.Execute(); err != nil {
fmt.Fprintf(os.Stderr, "Error: %v\n", err)
os.Exit(1)
}
}

View File

@@ -92,12 +92,6 @@ Add clear comments:
- Connector code (data → Onyx documents):
- Any in-memory structure that can grow without bound based on input must be periodically size-checked.
- If a connector is OOMing (often shows up as “missing celery tasks”), this is a top thing to check retroactively.
- Async and event loops:
- Never introduce new async/event loop Python code, and try to make existing
async code synchronous when possible if it makes sense.
- Writing async code without 100% understanding the code and having a
concrete reason to do so is likely to introduce bugs and not add any
meaningful performance gains.
---

View File

@@ -1868,13 +1868,13 @@
}
},
"node_modules/@typescript-eslint/typescript-estree/node_modules/minimatch": {
"version": "9.0.9",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-9.0.9.tgz",
"integrity": "sha512-OBwBN9AL4dqmETlpS2zasx+vTeWclWzkblfZk7KTA5j3jeOONz/tRCnZomUyvNg83wL5Zv9Ss6HMJXAgL8R2Yg==",
"version": "9.0.5",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-9.0.5.tgz",
"integrity": "sha512-G6T0ZX48xgozx7587koeX9Ys2NYy6Gmv//P89sEte9V9whIapMNF4idKxnW2QtCcLiTWlb/wfCabAtAFWhhBow==",
"dev": true,
"license": "ISC",
"dependencies": {
"brace-expansion": "^2.0.2"
"brace-expansion": "^2.0.1"
},
"engines": {
"node": ">=16 || 14 >=14.17"
@@ -5857,11 +5857,10 @@
}
},
"node_modules/minimatch": {
"version": "3.1.5",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.5.tgz",
"integrity": "sha512-VgjWUsnnT6n+NUk6eZq77zeFdpW2LWDzP6zFGrCbHXiYNul5Dzqk2HHQ5uFH2DNW5Xbp8+jVzaeNt94ssEEl4w==",
"version": "3.1.2",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.1.2.tgz",
"integrity": "sha512-J7p63hRiAjw1NDEww1W7i37+ByIrOWO5XQQAzZ3VOcL0PNybwpfmV/N05zFAzwQ9USyEcX6t3UO+K5aqBQOIHw==",
"dev": true,
"license": "ISC",
"dependencies": {
"brace-expansion": "^1.1.7"
},

View File

@@ -11,6 +11,7 @@ dependencies = [
"aioboto3==15.1.0",
"cohere==5.6.1",
"fastapi==0.133.1",
"google-cloud-aiplatform==1.121.0",
"google-genai==1.52.0",
"litellm==1.81.6",
"openai==2.14.0",
@@ -143,7 +144,7 @@ dev = [
"matplotlib==3.10.8",
"mypy-extensions==1.0.0",
"mypy==1.13.0",
"onyx-devtools==0.6.3",
"onyx-devtools==0.6.2",
"openapi-generator-cli==7.17.0",
"pandas-stubs~=2.3.3",
"pre-commit==3.2.2",

View File

@@ -6,7 +6,6 @@ import (
"os"
"os/exec"
"regexp"
"strconv"
"strings"
log "github.com/sirupsen/logrus"
@@ -34,15 +33,11 @@ func NewCherryPickCommand() *cobra.Command {
opts := &CherryPickOptions{}
cmd := &cobra.Command{
Use: "cherry-pick <commit-or-pr> [<commit-or-pr>...]",
Use: "cherry-pick <commit-sha> [<commit-sha>...]",
Aliases: []string{"cp"},
Short: "Cherry-pick one or more commits (or PRs) to a release branch",
Short: "Cherry-pick one or more commits to a release branch",
Long: `Cherry-pick one or more commits to a release branch and create a PR.
Arguments can be commit SHAs or GitHub PR numbers. A purely numeric argument
with fewer than 6 digits is treated as a PR number and resolved to its merge
commit automatically.
This command will:
1. Find the nearest stable version tag
2. Fetch the corresponding release branch(es)
@@ -59,8 +54,7 @@ If a cherry-pick hits a merge conflict, resolve it manually, then run:
Example usage:
$ ods cherry-pick foo123 bar456 --release 2.5 --release 2.6
$ ods cp foo123 --release 2.5
$ ods cp 1234 --release 2.5 # cherry-pick merge commit of PR #1234`,
$ ods cp foo123 --release 2.5`,
Args: func(cmd *cobra.Command, args []string) error {
cont, _ := cmd.Flags().GetBool("continue")
if cont {
@@ -96,12 +90,11 @@ Example usage:
func runCherryPick(cmd *cobra.Command, args []string, opts *CherryPickOptions) {
git.CheckGitHubCLI()
// Resolve any PR numbers (e.g. "1234") to their merge commit SHAs
commitSHAs, labels := resolveArgs(args)
commitSHAs := args
if len(commitSHAs) == 1 {
log.Debugf("Cherry-picking %s (%s)", labels[0], commitSHAs[0])
log.Debugf("Cherry-picking commit: %s", commitSHAs[0])
} else {
log.Debugf("Cherry-picking %d commits: %s", len(commitSHAs), strings.Join(labels, ", "))
log.Debugf("Cherry-picking %d commits: %s", len(commitSHAs), strings.Join(commitSHAs, ", "))
}
if opts.DryRun {
@@ -301,11 +294,6 @@ func runCherryPickContinue() {
log.Infof("Resuming cherry-pick (original branch: %s, releases: %v)", state.OriginalBranch, state.Releases)
// If a rebase is in progress (REBASE_HEAD exists), it must be resolved first
if git.IsRebaseInProgress() {
log.Fatal("A git rebase is in progress. Resolve it first:\n To continue: git rebase --continue\n To abort: git rebase --abort\nThen re-run: ods cherry-pick --continue")
}
// If git cherry-pick is still in progress (CHERRY_PICK_HEAD exists), continue it
if git.IsCherryPickInProgress() {
log.Info("Continuing in-progress cherry-pick...")
@@ -339,23 +327,6 @@ func cherryPickToRelease(commitSHAs, commitMessages []string, branchSuffix, vers
return "", fmt.Errorf("failed to checkout existing hotfix branch: %w", err)
}
// Only rebase when the branch has no unique commits (pure fast-forward).
// If unique commits exist (e.g. after --continue resolved a cherry-pick
// conflict), rebasing would re-apply them and risk the same conflicts.
remoteRef := fmt.Sprintf("origin/%s", releaseBranch)
uniqueCount, err := git.CountUniqueCommits(hotfixBranch, remoteRef)
if err != nil {
log.Warnf("Could not determine unique commits, skipping rebase: %v", err)
} else if uniqueCount == 0 {
log.Infof("Rebasing %s onto %s", hotfixBranch, releaseBranch)
if err := git.RunCommand("rebase", "--quiet", remoteRef); err != nil {
_ = git.RunCommand("rebase", "--abort")
return "", fmt.Errorf("failed to rebase hotfix branch onto %s (rebase aborted, re-run to retry): %w", releaseBranch, err)
}
} else {
log.Infof("Branch %s has %d unique commit(s), skipping rebase", hotfixBranch, uniqueCount)
}
// Check which commits need to be cherry-picked
commitsToCherry := []string{}
for _, sha := range commitSHAs {
@@ -393,6 +364,7 @@ func cherryPickToRelease(commitSHAs, commitMessages []string, branchSuffix, vers
return "", nil
}
// Push the hotfix branch
log.Infof("Pushing hotfix branch: %s", hotfixBranch)
pushArgs := []string{"push", "-u", "origin", hotfixBranch}
if noVerify {
@@ -460,40 +432,6 @@ func performCherryPick(commitSHAs []string) error {
return nil
}
// isPRNumber returns true if the argument looks like a GitHub PR number
// (purely numeric with fewer than 6 digits).
func isPRNumber(arg string) bool {
if len(arg) == 0 || len(arg) >= 6 {
return false
}
n, err := strconv.Atoi(arg)
return err == nil && n > 0
}
// resolveArgs resolves arguments that may be PR numbers into commit SHAs.
// Returns the resolved commit SHAs and a display-friendly label for logging
// (e.g. "PR #1234" instead of raw SHA).
func resolveArgs(args []string) (commitSHAs []string, labels []string) {
commitSHAs = make([]string, len(args))
labels = make([]string, len(args))
for i, arg := range args {
if isPRNumber(arg) {
log.Infof("Resolving PR #%s to merge commit...", arg)
sha, err := git.ResolvePRToMergeCommit(arg)
if err != nil {
log.Fatalf("Failed to resolve PR #%s: %v", arg, err)
}
log.Infof("PR #%s → %s", arg, sha)
commitSHAs[i] = sha
labels[i] = fmt.Sprintf("PR #%s", arg)
} else {
commitSHAs[i] = arg
labels[i] = arg
}
}
return commitSHAs, labels
}
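The help text in this hunk describes the PR-number heuristic that the commit removes: a purely numeric argument with fewer than 6 digits is treated as a GitHub PR number. As a standalone sketch (function name taken from the removed code, inputs are illustrative), the heuristic behaves like this — note the stated tradeoff that a short all-digit SHA prefix would be misclassified:

```go
package main

import (
	"fmt"
	"strconv"
)

// isPRNumber mirrors the removed heuristic: purely numeric,
// positive, and fewer than 6 digits means "PR number", not SHA.
func isPRNumber(arg string) bool {
	if len(arg) == 0 || len(arg) >= 6 {
		return false
	}
	n, err := strconv.Atoi(arg)
	return err == nil && n > 0
}

func main() {
	for _, arg := range []string{"1234", "abc123", "123456"} {
		fmt.Printf("%s -> %v\n", arg, isPRNumber(arg))
	}
}
```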
// normalizeVersion ensures the version has a 'v' prefix
func normalizeVersion(version string) string {
if !strings.HasPrefix(version, "v") {


@@ -1,144 +0,0 @@
package cmd
import (
"encoding/json"
"errors"
"fmt"
"os"
"os/exec"
"path/filepath"
"sort"
"strings"
log "github.com/sirupsen/logrus"
"github.com/spf13/cobra"
"github.com/onyx-dot-app/onyx/tools/ods/internal/paths"
)
type desktopPackageJSON struct {
Scripts map[string]string `json:"scripts"`
}
// NewDesktopCommand creates a command that runs npm scripts from the desktop directory.
func NewDesktopCommand() *cobra.Command {
cmd := &cobra.Command{
Use: "desktop <script> [args...]",
Short: "Run desktop/package.json npm scripts",
Long: desktopHelpDescription(),
Args: cobra.MinimumNArgs(1),
ValidArgsFunction: func(cmd *cobra.Command, args []string, toComplete string) ([]string, cobra.ShellCompDirective) {
if len(args) > 0 {
return nil, cobra.ShellCompDirectiveNoFileComp
}
return desktopScriptNames(), cobra.ShellCompDirectiveNoFileComp
},
Run: func(cmd *cobra.Command, args []string) {
runDesktopScript(args)
},
}
cmd.Flags().SetInterspersed(false)
return cmd
}
func runDesktopScript(args []string) {
desktopDir, err := desktopDir()
if err != nil {
log.Fatalf("Failed to find desktop directory: %v", err)
}
scriptName := args[0]
scriptArgs := args[1:]
if len(scriptArgs) > 0 && scriptArgs[0] == "--" {
scriptArgs = scriptArgs[1:]
}
npmArgs := []string{"run", scriptName}
if len(scriptArgs) > 0 {
// npm requires "--" to forward flags to the underlying script.
npmArgs = append(npmArgs, "--")
npmArgs = append(npmArgs, scriptArgs...)
}
log.Debugf("Running in %s: npm %v", desktopDir, npmArgs)
desktopCmd := exec.Command("npm", npmArgs...)
desktopCmd.Dir = desktopDir
desktopCmd.Stdout = os.Stdout
desktopCmd.Stderr = os.Stderr
desktopCmd.Stdin = os.Stdin
if err := desktopCmd.Run(); err != nil {
// For wrapped commands, preserve the child process's exit code and
// avoid duplicating already-printed stderr output.
var exitErr *exec.ExitError
if errors.As(err, &exitErr) {
if code := exitErr.ExitCode(); code != -1 {
os.Exit(code)
}
}
log.Fatalf("Failed to run npm: %v", err)
}
}
func desktopScriptNames() []string {
scripts, err := loadDesktopScripts()
if err != nil {
return nil
}
names := make([]string, 0, len(scripts))
for name := range scripts {
names = append(names, name)
}
sort.Strings(names)
return names
}
func desktopHelpDescription() string {
description := `Run npm scripts from desktop/package.json.
Examples:
ods desktop dev
ods desktop build
ods desktop build:dmg`
scripts := desktopScriptNames()
if len(scripts) == 0 {
return description + "\n\nAvailable scripts: (unable to load)"
}
return description + "\n\nAvailable scripts:\n " + strings.Join(scripts, "\n ")
}
func loadDesktopScripts() (map[string]string, error) {
desktopDir, err := desktopDir()
if err != nil {
return nil, err
}
packageJSONPath := filepath.Join(desktopDir, "package.json")
data, err := os.ReadFile(packageJSONPath)
if err != nil {
return nil, fmt.Errorf("failed to read %s: %w", packageJSONPath, err)
}
var pkg desktopPackageJSON
if err := json.Unmarshal(data, &pkg); err != nil {
return nil, fmt.Errorf("failed to parse %s: %w", packageJSONPath, err)
}
if pkg.Scripts == nil {
return nil, nil
}
return pkg.Scripts, nil
}
func desktopDir() (string, error) {
root, err := paths.GitRoot()
if err != nil {
return "", err
}
return filepath.Join(root, "desktop"), nil
}


@@ -50,7 +50,6 @@ func NewRootCommand() *cobra.Command {
cmd.AddCommand(NewPullCommand())
cmd.AddCommand(NewRunCICommand())
cmd.AddCommand(NewScreenshotDiffCommand())
cmd.AddCommand(NewDesktopCommand())
cmd.AddCommand(NewWebCommand())
cmd.AddCommand(NewWhoisCommand())


@@ -1,14 +1,14 @@
module github.com/onyx-dot-app/onyx/tools/ods
go 1.26.0
go 1.24.11
require (
github.com/sirupsen/logrus v1.9.3
github.com/spf13/cobra v1.10.1
github.com/spf13/pflag v1.0.9
)
require (
github.com/inconshreveable/mousetrap v1.1.0 // indirect
github.com/spf13/pflag v1.0.9 // indirect
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8 // indirect
)


@@ -6,7 +6,6 @@ import (
"os"
"os/exec"
"path/filepath"
"strconv"
"strings"
log "github.com/sirupsen/logrus"
@@ -174,26 +173,6 @@ func IsCherryPickInProgress() bool {
return cmd.Run() == nil
}
// CountUniqueCommits returns the number of commits on branch that are not on upstream.
func CountUniqueCommits(branch, upstream string) (int, error) {
cmd := exec.Command("git", "rev-list", "--count", fmt.Sprintf("%s..%s", upstream, branch))
output, err := cmd.Output()
if err != nil {
return 0, fmt.Errorf("git rev-list --count failed: %w", err)
}
count, err := strconv.Atoi(strings.TrimSpace(string(output)))
if err != nil {
return 0, fmt.Errorf("failed to parse commit count: %w", err)
}
return count, nil
}
// IsRebaseInProgress checks if a rebase is currently in progress
func IsRebaseInProgress() bool {
cmd := exec.Command("git", "rev-parse", "--verify", "--quiet", "REBASE_HEAD")
return cmd.Run() == nil
}
// HasStagedChanges checks if there are staged changes in the index
func HasStagedChanges() bool {
cmd := exec.Command("git", "diff", "--quiet", "--cached")
@@ -237,23 +216,6 @@ func IsCommitAppliedOnBranch(commitSHA, branchName string) bool {
return false
}
// ResolvePRToMergeCommit resolves a GitHub PR number to its merge commit SHA
func ResolvePRToMergeCommit(prNumber string) (string, error) {
cmd := exec.Command("gh", "pr", "view", prNumber, "--json", "mergeCommit", "--jq", ".mergeCommit.oid")
output, err := cmd.Output()
if err != nil {
if exitErr, ok := err.(*exec.ExitError); ok {
return "", fmt.Errorf("gh pr view failed: %w: %s", err, string(exitErr.Stderr))
}
return "", fmt.Errorf("gh pr view failed: %w", err)
}
sha := strings.TrimSpace(string(output))
if sha == "" || sha == "null" {
return "", fmt.Errorf("PR #%s has no merge commit (is it merged?)", prNumber)
}
return sha, nil
}
// RunCherryPickContinue runs git cherry-pick --continue --no-edit
func RunCherryPickContinue() error {
return RunCommandVerboseOnError("cherry-pick", "--continue", "--no-edit")


@@ -1,5 +1,5 @@
[build-system]
requires = ["hatchling", "go-bin~=1.26.0", "manygo"]
requires = ["hatchling", "go-bin~=1.24.11", "manygo"]
build-backend = "hatchling.build"
[project]

uv.lock (generated)

@@ -453,14 +453,14 @@ wheels = [
[[package]]
name = "authlib"
version = "1.6.7"
version = "1.6.6"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cryptography" },
]
sdist = { url = "https://files.pythonhosted.org/packages/49/dc/ed1681bf1339dd6ea1ce56136bad4baabc6f7ad466e375810702b0237047/authlib-1.6.7.tar.gz", hash = "sha256:dbf10100011d1e1b34048c9d120e83f13b35d69a826ae762b93d2fb5aafc337b", size = 164950, upload-time = "2026-02-06T14:04:14.171Z" }
sdist = { url = "https://files.pythonhosted.org/packages/bb/9b/b1661026ff24bc641b76b78c5222d614776b0c085bcfdac9bd15a1cb4b35/authlib-1.6.6.tar.gz", hash = "sha256:45770e8e056d0f283451d9996fbb59b70d45722b45d854d58f32878d0a40c38e", size = 164894, upload-time = "2025-12-12T08:01:41.464Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f8/00/3ed12264094ec91f534fae429945efbaa9f8c666f3aa7061cc3b2a26a0cd/authlib-1.6.7-py2.py3-none-any.whl", hash = "sha256:c637340d9a02789d2efa1d003a7437d10d3e565237bcb5fcbc6c134c7b95bab0", size = 244115, upload-time = "2026-02-06T14:04:12.141Z" },
{ url = "https://files.pythonhosted.org/packages/54/51/321e821856452f7386c4e9df866f196720b1ad0c5ea1623ea7399969ae3b/authlib-1.6.6-py2.py3-none-any.whl", hash = "sha256:7d9e9bc535c13974313a87f53e8430eb6ea3d1cf6ae4f6efcd793f2e949143fd", size = 244005, upload-time = "2025-12-12T08:01:40.209Z" },
]
[[package]]
@@ -756,20 +756,12 @@ sdist = { url = "https://files.pythonhosted.org/packages/92/88/b8527e1b00c1811db
wheels = [
{ url = "https://files.pythonhosted.org/packages/ec/90/543f556fcfcfa270713eef906b6352ab048e1e557afec12925c991dc93c2/caio-0.9.25-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:d6956d9e4a27021c8bd6c9677f3a59eb1d820cc32d0343cea7961a03b1371965", size = 36839, upload-time = "2025-12-26T15:21:40.267Z" },
{ url = "https://files.pythonhosted.org/packages/51/3b/36f3e8ec38dafe8de4831decd2e44c69303d2a3892d16ceda42afed44e1b/caio-0.9.25-cp311-cp311-manylinux2010_x86_64.manylinux2014_x86_64.manylinux_2_12_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:bf84bfa039f25ad91f4f52944452a5f6f405e8afab4d445450978cd6241d1478", size = 80255, upload-time = "2025-12-26T15:22:20.271Z" },
{ url = "https://files.pythonhosted.org/packages/df/ce/65e64867d928e6aff1b4f0e12dba0ef6d5bf412c240dc1df9d421ac10573/caio-0.9.25-cp311-cp311-manylinux_2_34_aarch64.whl", hash = "sha256:ae3d62587332bce600f861a8de6256b1014d6485cfd25d68c15caf1611dd1f7c", size = 80052, upload-time = "2026-03-04T22:08:20.402Z" },
{ url = "https://files.pythonhosted.org/packages/46/90/e278863c47e14ec58309aa2e38a45882fbe67b4cc29ec9bc8f65852d3e45/caio-0.9.25-cp311-cp311-manylinux_2_34_x86_64.whl", hash = "sha256:fc220b8533dcf0f238a6b1a4a937f92024c71e7b10b5a2dfc1c73604a25709bc", size = 78273, upload-time = "2026-03-04T22:08:21.368Z" },
{ url = "https://files.pythonhosted.org/packages/d3/25/79c98ebe12df31548ba4eaf44db11b7cad6b3e7b4203718335620939083c/caio-0.9.25-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:fb7ff95af4c31ad3f03179149aab61097a71fd85e05f89b4786de0359dffd044", size = 36983, upload-time = "2025-12-26T15:21:36.075Z" },
{ url = "https://files.pythonhosted.org/packages/a3/2b/21288691f16d479945968a0a4f2856818c1c5be56881d51d4dac9b255d26/caio-0.9.25-cp312-cp312-manylinux2010_x86_64.manylinux2014_x86_64.manylinux_2_12_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:97084e4e30dfa598449d874c4d8e0c8d5ea17d2f752ef5e48e150ff9d240cd64", size = 82012, upload-time = "2025-12-26T15:22:20.983Z" },
{ url = "https://files.pythonhosted.org/packages/03/c4/8a1b580875303500a9c12b9e0af58cb82e47f5bcf888c2457742a138273c/caio-0.9.25-cp312-cp312-manylinux_2_34_aarch64.whl", hash = "sha256:4fa69eba47e0f041b9d4f336e2ad40740681c43e686b18b191b6c5f4c5544bfb", size = 81502, upload-time = "2026-03-04T22:08:22.381Z" },
{ url = "https://files.pythonhosted.org/packages/d1/1c/0fe770b8ffc8362c48134d1592d653a81a3d8748d764bec33864db36319d/caio-0.9.25-cp312-cp312-manylinux_2_34_x86_64.whl", hash = "sha256:6bebf6f079f1341d19f7386db9b8b1f07e8cc15ae13bfdaff573371ba0575d69", size = 80200, upload-time = "2026-03-04T22:08:23.382Z" },
{ url = "https://files.pythonhosted.org/packages/31/57/5e6ff127e6f62c9f15d989560435c642144aa4210882f9494204bc892305/caio-0.9.25-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:d6c2a3411af97762a2b03840c3cec2f7f728921ff8adda53d7ea2315a8563451", size = 36979, upload-time = "2025-12-26T15:21:35.484Z" },
{ url = "https://files.pythonhosted.org/packages/a3/9f/f21af50e72117eb528c422d4276cbac11fb941b1b812b182e0a9c70d19c5/caio-0.9.25-cp313-cp313-manylinux2010_x86_64.manylinux2014_x86_64.manylinux_2_12_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:0998210a4d5cd5cb565b32ccfe4e53d67303f868a76f212e002a8554692870e6", size = 81900, upload-time = "2025-12-26T15:22:21.919Z" },
{ url = "https://files.pythonhosted.org/packages/9c/12/c39ae2a4037cb10ad5eb3578eb4d5f8c1a2575c62bba675f3406b7ef0824/caio-0.9.25-cp313-cp313-manylinux_2_34_aarch64.whl", hash = "sha256:1a177d4777141b96f175fe2c37a3d96dec7911ed9ad5f02bac38aaa1c936611f", size = 81523, upload-time = "2026-03-04T22:08:25.187Z" },
{ url = "https://files.pythonhosted.org/packages/22/59/f8f2e950eb4f1a5a3883e198dca514b9d475415cb6cd7b78b9213a0dd45a/caio-0.9.25-cp313-cp313-manylinux_2_34_x86_64.whl", hash = "sha256:9ed3cfb28c0e99fec5e208c934e5c157d0866aa9c32aa4dc5e9b6034af6286b7", size = 80243, upload-time = "2026-03-04T22:08:26.449Z" },
{ url = "https://files.pythonhosted.org/packages/69/ca/a08fdc7efdcc24e6a6131a93c85be1f204d41c58f474c42b0670af8c016b/caio-0.9.25-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:fab6078b9348e883c80a5e14b382e6ad6aabbc4429ca034e76e730cf464269db", size = 36978, upload-time = "2025-12-26T15:21:41.055Z" },
{ url = "https://files.pythonhosted.org/packages/5e/6c/d4d24f65e690213c097174d26eda6831f45f4734d9d036d81790a27e7b78/caio-0.9.25-cp314-cp314-manylinux2010_x86_64.manylinux2014_x86_64.manylinux_2_12_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:44a6b58e52d488c75cfaa5ecaa404b2b41cc965e6c417e03251e868ecd5b6d77", size = 81832, upload-time = "2025-12-26T15:22:22.757Z" },
{ url = "https://files.pythonhosted.org/packages/87/a4/e534cf7d2d0e8d880e25dd61e8d921ffcfe15bd696734589826f5a2df727/caio-0.9.25-cp314-cp314-manylinux_2_34_aarch64.whl", hash = "sha256:628a630eb7fb22381dd8e3c8ab7f59e854b9c806639811fc3f4310c6bd711d79", size = 81565, upload-time = "2026-03-04T22:08:27.483Z" },
{ url = "https://files.pythonhosted.org/packages/3f/ed/bf81aeac1d290017e5e5ac3e880fd56ee15e50a6d0353986799d1bc5cfd5/caio-0.9.25-cp314-cp314-manylinux_2_34_x86_64.whl", hash = "sha256:0ba16aa605ccb174665357fc729cf500679c2d94d5f1458a6f0d5ca48f2060a7", size = 80071, upload-time = "2026-03-04T22:08:28.751Z" },
{ url = "https://files.pythonhosted.org/packages/86/93/1f76c8d1bafe3b0614e06b2195784a3765bbf7b0a067661af9e2dd47fc33/caio-0.9.25-py3-none-any.whl", hash = "sha256:06c0bb02d6b929119b1cfbe1ca403c768b2013a369e2db46bfa2a5761cf82e40", size = 19087, upload-time = "2025-12-26T15:22:00.221Z" },
]
@@ -2107,6 +2099,12 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/ed/d4/90197b416cb61cefd316964fd9e7bd8324bcbafabf40eef14a9f20b81974/google_api_core-2.28.1-py3-none-any.whl", hash = "sha256:4021b0f8ceb77a6fb4de6fde4502cecab45062e66ff4f2895169e0b35bc9466c", size = 173706, upload-time = "2025-10-28T21:34:50.151Z" },
]
[package.optional-dependencies]
grpc = [
{ name = "grpcio" },
{ name = "grpcio-status" },
]
[[package]]
name = "google-api-python-client"
version = "2.86.0"
@@ -2125,16 +2123,16 @@ wheels = [
[[package]]
name = "google-auth"
version = "2.48.0"
version = "2.43.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cryptography" },
{ name = "cachetools" },
{ name = "pyasn1-modules" },
{ name = "rsa" },
]
sdist = { url = "https://files.pythonhosted.org/packages/0c/41/242044323fbd746615884b1c16639749e73665b718209946ebad7ba8a813/google_auth-2.48.0.tar.gz", hash = "sha256:4f7e706b0cd3208a3d940a19a822c37a476ddba5450156c3e6624a71f7c841ce", size = 326522, upload-time = "2026-01-26T19:22:47.157Z" }
sdist = { url = "https://files.pythonhosted.org/packages/ff/ef/66d14cf0e01b08d2d51ffc3c20410c4e134a1548fc246a6081eae585a4fe/google_auth-2.43.0.tar.gz", hash = "sha256:88228eee5fc21b62a1b5fe773ca15e67778cb07dc8363adcb4a8827b52d81483", size = 296359, upload-time = "2025-11-06T00:13:36.587Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/83/1d/d6466de3a5249d35e832a52834115ca9d1d0de6abc22065f049707516d47/google_auth-2.48.0-py3-none-any.whl", hash = "sha256:2e2a537873d449434252a9632c28bfc268b0adb1e53f9fb62afc5333a975903f", size = 236499, upload-time = "2026-01-26T19:22:45.099Z" },
{ url = "https://files.pythonhosted.org/packages/6f/d1/385110a9ae86d91cc14c5282c61fe9f4dc41c0b9f7d423c6ad77038c4448/google_auth-2.43.0-py2.py3-none-any.whl", hash = "sha256:af628ba6fa493f75c7e9dbe9373d148ca9f4399b5ea29976519e0a3848eddd16", size = 223114, upload-time = "2025-11-06T00:13:35.209Z" },
]
[[package]]
@@ -2164,6 +2162,122 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/4a/07/8d9a8186e6768b55dfffeb57c719bc03770cf8a970a074616ae6f9e26a57/google_auth_oauthlib-1.0.0-py2.py3-none-any.whl", hash = "sha256:95880ca704928c300f48194d1770cf5b1462835b6e49db61445a520f793fd5fb", size = 18926, upload-time = "2023-02-07T20:53:18.837Z" },
]
[[package]]
name = "google-cloud-aiplatform"
version = "1.121.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "docstring-parser" },
{ name = "google-api-core", extra = ["grpc"] },
{ name = "google-auth" },
{ name = "google-cloud-bigquery" },
{ name = "google-cloud-resource-manager" },
{ name = "google-cloud-storage" },
{ name = "google-genai" },
{ name = "packaging" },
{ name = "proto-plus" },
{ name = "protobuf" },
{ name = "pydantic" },
{ name = "shapely" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b1/86/d1bad9a342122f0f5913cd8b7758ab340aac3f579cffb800d294da605a7c/google_cloud_aiplatform-1.121.0.tar.gz", hash = "sha256:65710396238fa461dbea9b2af9ed23f95458d70d9684e75519c7c9c1601ff308", size = 9705200, upload-time = "2025-10-15T20:27:59.262Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/bd/f6/806b39f86f912133a3071ffa9ff99801a12868216069e26c83a48943116b/google_cloud_aiplatform-1.121.0-py2.py3-none-any.whl", hash = "sha256:1e7105dfd17963207e966550c9544264508efdfded29cf4924c5b86ff4a22efd", size = 8067568, upload-time = "2025-10-15T20:27:54.842Z" },
]
[[package]]
name = "google-cloud-bigquery"
version = "3.38.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "google-api-core", extra = ["grpc"] },
{ name = "google-auth" },
{ name = "google-cloud-core" },
{ name = "google-resumable-media" },
{ name = "packaging" },
{ name = "python-dateutil" },
{ name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/07/b2/a17e40afcf9487e3d17db5e36728ffe75c8d5671c46f419d7b6528a5728a/google_cloud_bigquery-3.38.0.tar.gz", hash = "sha256:8afcb7116f5eac849097a344eb8bfda78b7cfaae128e60e019193dd483873520", size = 503666, upload-time = "2025-09-17T20:33:33.47Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/39/3c/c8cada9ec282b29232ed9aed5a0b5cca6cf5367cb2ffa8ad0d2583d743f1/google_cloud_bigquery-3.38.0-py3-none-any.whl", hash = "sha256:e06e93ff7b245b239945ef59cb59616057598d369edac457ebf292bd61984da6", size = 259257, upload-time = "2025-09-17T20:33:31.404Z" },
]
[[package]]
name = "google-cloud-core"
version = "2.5.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "google-api-core" },
{ name = "google-auth" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a6/03/ef0bc99d0e0faf4fdbe67ac445e18cdaa74824fd93cd069e7bb6548cb52d/google_cloud_core-2.5.0.tar.gz", hash = "sha256:7c1b7ef5c92311717bd05301aa1a91ffbc565673d3b0b4163a52d8413a186963", size = 36027, upload-time = "2025-10-29T23:17:39.513Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/89/20/bfa472e327c8edee00f04beecc80baeddd2ab33ee0e86fd7654da49d45e9/google_cloud_core-2.5.0-py3-none-any.whl", hash = "sha256:67d977b41ae6c7211ee830c7912e41003ea8194bff15ae7d72fd6f51e57acabc", size = 29469, upload-time = "2025-10-29T23:17:38.548Z" },
]
[[package]]
name = "google-cloud-resource-manager"
version = "1.15.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "google-api-core", extra = ["grpc"] },
{ name = "google-auth" },
{ name = "grpc-google-iam-v1" },
{ name = "grpcio" },
{ name = "proto-plus" },
{ name = "protobuf" },
]
sdist = { url = "https://files.pythonhosted.org/packages/fc/19/b95d0e8814ce42522e434cdd85c0cb6236d874d9adf6685fc8e6d1fda9d1/google_cloud_resource_manager-1.15.0.tar.gz", hash = "sha256:3d0b78c3daa713f956d24e525b35e9e9a76d597c438837171304d431084cedaf", size = 449227, upload-time = "2025-10-20T14:57:01.108Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8c/93/5aef41a5f146ad4559dd7040ae5fa8e7ddcab4dfadbef6cb4b66d775e690/google_cloud_resource_manager-1.15.0-py3-none-any.whl", hash = "sha256:0ccde5db644b269ddfdf7b407a2c7b60bdbf459f8e666344a5285601d00c7f6d", size = 397151, upload-time = "2025-10-20T14:53:45.409Z" },
]
[[package]]
name = "google-cloud-storage"
version = "2.19.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "google-api-core" },
{ name = "google-auth" },
{ name = "google-cloud-core" },
{ name = "google-crc32c" },
{ name = "google-resumable-media" },
{ name = "requests" },
]
sdist = { url = "https://files.pythonhosted.org/packages/36/76/4d965702e96bb67976e755bed9828fa50306dca003dbee08b67f41dd265e/google_cloud_storage-2.19.0.tar.gz", hash = "sha256:cd05e9e7191ba6cb68934d8eb76054d9be4562aa89dbc4236feee4d7d51342b2", size = 5535488, upload-time = "2024-12-05T01:35:06.49Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d5/94/6db383d8ee1adf45dc6c73477152b82731fa4c4a46d9c1932cc8757e0fd4/google_cloud_storage-2.19.0-py2.py3-none-any.whl", hash = "sha256:aeb971b5c29cf8ab98445082cbfe7b161a1f48ed275822f59ed3f1524ea54fba", size = 131787, upload-time = "2024-12-05T01:35:04.736Z" },
]
[[package]]
name = "google-crc32c"
version = "1.7.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/19/ae/87802e6d9f9d69adfaedfcfd599266bf386a54d0be058b532d04c794f76d/google_crc32c-1.7.1.tar.gz", hash = "sha256:2bff2305f98846f3e825dbeec9ee406f89da7962accdb29356e4eadc251bd472", size = 14495, upload-time = "2025-03-26T14:29:13.32Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f7/94/220139ea87822b6fdfdab4fb9ba81b3fff7ea2c82e2af34adc726085bffc/google_crc32c-1.7.1-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:6fbab4b935989e2c3610371963ba1b86afb09537fd0c633049be82afe153ac06", size = 30468, upload-time = "2025-03-26T14:32:52.215Z" },
{ url = "https://files.pythonhosted.org/packages/94/97/789b23bdeeb9d15dc2904660463ad539d0318286d7633fe2760c10ed0c1c/google_crc32c-1.7.1-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:ed66cbe1ed9cbaaad9392b5259b3eba4a9e565420d734e6238813c428c3336c9", size = 30313, upload-time = "2025-03-26T14:57:38.758Z" },
{ url = "https://files.pythonhosted.org/packages/81/b8/976a2b843610c211e7ccb3e248996a61e87dbb2c09b1499847e295080aec/google_crc32c-1.7.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ee6547b657621b6cbed3562ea7826c3e11cab01cd33b74e1f677690652883e77", size = 33048, upload-time = "2025-03-26T14:41:30.679Z" },
{ url = "https://files.pythonhosted.org/packages/c9/16/a3842c2cf591093b111d4a5e2bfb478ac6692d02f1b386d2a33283a19dc9/google_crc32c-1.7.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d68e17bad8f7dd9a49181a1f5a8f4b251c6dbc8cc96fb79f1d321dfd57d66f53", size = 32669, upload-time = "2025-03-26T14:41:31.432Z" },
{ url = "https://files.pythonhosted.org/packages/04/17/ed9aba495916fcf5fe4ecb2267ceb851fc5f273c4e4625ae453350cfd564/google_crc32c-1.7.1-cp311-cp311-win_amd64.whl", hash = "sha256:6335de12921f06e1f774d0dd1fbea6bf610abe0887a1638f64d694013138be5d", size = 33476, upload-time = "2025-03-26T14:29:10.211Z" },
{ url = "https://files.pythonhosted.org/packages/dd/b7/787e2453cf8639c94b3d06c9d61f512234a82e1d12d13d18584bd3049904/google_crc32c-1.7.1-cp312-cp312-macosx_12_0_arm64.whl", hash = "sha256:2d73a68a653c57281401871dd4aeebbb6af3191dcac751a76ce430df4d403194", size = 30470, upload-time = "2025-03-26T14:34:31.655Z" },
{ url = "https://files.pythonhosted.org/packages/ed/b4/6042c2b0cbac3ec3a69bb4c49b28d2f517b7a0f4a0232603c42c58e22b44/google_crc32c-1.7.1-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:22beacf83baaf59f9d3ab2bbb4db0fb018da8e5aebdce07ef9f09fce8220285e", size = 30315, upload-time = "2025-03-26T15:01:54.634Z" },
{ url = "https://files.pythonhosted.org/packages/29/ad/01e7a61a5d059bc57b702d9ff6a18b2585ad97f720bd0a0dbe215df1ab0e/google_crc32c-1.7.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19eafa0e4af11b0a4eb3974483d55d2d77ad1911e6cf6f832e1574f6781fd337", size = 33180, upload-time = "2025-03-26T14:41:32.168Z" },
{ url = "https://files.pythonhosted.org/packages/3b/a5/7279055cf004561894ed3a7bfdf5bf90a53f28fadd01af7cd166e88ddf16/google_crc32c-1.7.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b6d86616faaea68101195c6bdc40c494e4d76f41e07a37ffdef270879c15fb65", size = 32794, upload-time = "2025-03-26T14:41:33.264Z" },
{ url = "https://files.pythonhosted.org/packages/0f/d6/77060dbd140c624e42ae3ece3df53b9d811000729a5c821b9fd671ceaac6/google_crc32c-1.7.1-cp312-cp312-win_amd64.whl", hash = "sha256:b7491bdc0c7564fcf48c0179d2048ab2f7c7ba36b84ccd3a3e1c3f7a72d3bba6", size = 33477, upload-time = "2025-03-26T14:29:10.94Z" },
{ url = "https://files.pythonhosted.org/packages/8b/72/b8d785e9184ba6297a8620c8a37cf6e39b81a8ca01bb0796d7cbb28b3386/google_crc32c-1.7.1-cp313-cp313-macosx_12_0_arm64.whl", hash = "sha256:df8b38bdaf1629d62d51be8bdd04888f37c451564c2042d36e5812da9eff3c35", size = 30467, upload-time = "2025-03-26T14:36:06.909Z" },
{ url = "https://files.pythonhosted.org/packages/34/25/5f18076968212067c4e8ea95bf3b69669f9fc698476e5f5eb97d5b37999f/google_crc32c-1.7.1-cp313-cp313-macosx_12_0_x86_64.whl", hash = "sha256:e42e20a83a29aa2709a0cf271c7f8aefaa23b7ab52e53b322585297bb94d4638", size = 30309, upload-time = "2025-03-26T15:06:15.318Z" },
{ url = "https://files.pythonhosted.org/packages/92/83/9228fe65bf70e93e419f38bdf6c5ca5083fc6d32886ee79b450ceefd1dbd/google_crc32c-1.7.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:905a385140bf492ac300026717af339790921f411c0dfd9aa5a9e69a08ed32eb", size = 33133, upload-time = "2025-03-26T14:41:34.388Z" },
{ url = "https://files.pythonhosted.org/packages/c3/ca/1ea2fd13ff9f8955b85e7956872fdb7050c4ace8a2306a6d177edb9cf7fe/google_crc32c-1.7.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b211ddaf20f7ebeec5c333448582c224a7c90a9d98826fbab82c0ddc11348e6", size = 32773, upload-time = "2025-03-26T14:41:35.19Z" },
{ url = "https://files.pythonhosted.org/packages/89/32/a22a281806e3ef21b72db16f948cad22ec68e4bdd384139291e00ff82fe2/google_crc32c-1.7.1-cp313-cp313-win_amd64.whl", hash = "sha256:0f99eaa09a9a7e642a61e06742856eec8b19fc0037832e03f941fe7cf0c8e4db", size = 33475, upload-time = "2025-03-26T14:29:11.771Z" },
{ url = "https://files.pythonhosted.org/packages/b8/c5/002975aff514e57fc084ba155697a049b3f9b52225ec3bc0f542871dd524/google_crc32c-1.7.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:32d1da0d74ec5634a05f53ef7df18fc646666a25efaaca9fc7dcfd4caf1d98c3", size = 33243, upload-time = "2025-03-26T14:41:35.975Z" },
{ url = "https://files.pythonhosted.org/packages/61/cb/c585282a03a0cea70fcaa1bf55d5d702d0f2351094d663ec3be1c6c67c52/google_crc32c-1.7.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e10554d4abc5238823112c2ad7e4560f96c7bf3820b202660373d769d9e6e4c9", size = 32870, upload-time = "2025-03-26T14:41:37.08Z" },
{ url = "https://files.pythonhosted.org/packages/16/1b/1693372bf423ada422f80fd88260dbfd140754adb15cbc4d7e9a68b1cb8e/google_crc32c-1.7.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:85fef7fae11494e747c9fd1359a527e5970fc9603c90764843caabd3a16a0a48", size = 28241, upload-time = "2025-03-26T14:41:45.898Z" },
{ url = "https://files.pythonhosted.org/packages/fd/3c/2a19a60a473de48717b4efb19398c3f914795b64a96cf3fbe82588044f78/google_crc32c-1.7.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6efb97eb4369d52593ad6f75e7e10d053cf00c48983f7a973105bc70b0ac4d82", size = 28048, upload-time = "2025-03-26T14:41:46.696Z" },
]
[[package]]
name = "google-genai"
version = "1.52.0"
@@ -2183,6 +2297,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/ec/66/03f663e7bca7abe9ccfebe6cb3fe7da9a118fd723a5abb278d6117e7990e/google_genai-1.52.0-py3-none-any.whl", hash = "sha256:c8352b9f065ae14b9322b949c7debab8562982f03bf71d44130cd2b798c20743", size = 261219, upload-time = "2025-11-21T02:18:54.515Z" },
]
[[package]]
name = "google-resumable-media"
version = "2.7.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "google-crc32c" },
]
sdist = { url = "https://files.pythonhosted.org/packages/58/5a/0efdc02665dca14e0837b62c8a1a93132c264bd02054a15abb2218afe0ae/google_resumable_media-2.7.2.tar.gz", hash = "sha256:5280aed4629f2b60b847b0d42f9857fd4935c11af266744df33d8074cae92fe0", size = 2163099, upload-time = "2024-08-07T22:20:38.555Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/82/35/b8d3baf8c46695858cb9d8835a53baa1eeb9906ddaf2f728a5f5b640fd1e/google_resumable_media-2.7.2-py2.py3-none-any.whl", hash = "sha256:3ce7551e9fe6d99e9a126101d2536612bb73486721951e9562fee0f90c6ababa", size = 81251, upload-time = "2024-08-07T22:20:36.409Z" },
]
[[package]]
name = "googleapis-common-protos"
version = "1.72.0"
@@ -2195,6 +2321,11 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/c4/ab/09169d5a4612a5f92490806649ac8d41e3ec9129c636754575b3553f4ea4/googleapis_common_protos-1.72.0-py3-none-any.whl", hash = "sha256:4299c5a82d5ae1a9702ada957347726b167f9f8d1fc352477702a1e851ff4038", size = 297515, upload-time = "2025-11-06T18:29:13.14Z" },
]
[package.optional-dependencies]
grpc = [
{ name = "grpcio" },
]
[[package]]
name = "greenlet"
version = "3.2.4"
@@ -2245,6 +2376,85 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e3/a5/6ddab2b4c112be95601c13428db1d8b6608a8b6039816f2ba09c346c08fc/greenlet-3.2.4-cp314-cp314-win_amd64.whl", hash = "sha256:e37ab26028f12dbb0ff65f29a8d3d44a765c61e729647bf2ddfbbed621726f01", size = 303425, upload-time = "2025-08-07T13:32:27.59Z" },
]
[[package]]
name = "grpc-google-iam-v1"
version = "0.14.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "googleapis-common-protos", extra = ["grpc"] },
{ name = "grpcio" },
{ name = "protobuf" },
]
sdist = { url = "https://files.pythonhosted.org/packages/76/1e/1011451679a983f2f5c6771a1682542ecb027776762ad031fd0d7129164b/grpc_google_iam_v1-0.14.3.tar.gz", hash = "sha256:879ac4ef33136c5491a6300e27575a9ec760f6cdf9a2518798c1b8977a5dc389", size = 23745, upload-time = "2025-10-15T21:14:53.318Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/4a/bd/330a1bbdb1afe0b96311249e699b6dc9cfc17916394fd4503ac5aca2514b/grpc_google_iam_v1-0.14.3-py3-none-any.whl", hash = "sha256:7a7f697e017a067206a3dfef44e4c634a34d3dee135fe7d7a4613fe3e59217e6", size = 32690, upload-time = "2025-10-15T21:14:51.72Z" },
]
[[package]]
name = "grpcio"
version = "1.76.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b6/e0/318c1ce3ae5a17894d5791e87aea147587c9e702f24122cc7a5c8bbaeeb1/grpcio-1.76.0.tar.gz", hash = "sha256:7be78388d6da1a25c0d5ec506523db58b18be22d9c37d8d3a32c08be4987bd73", size = 12785182, upload-time = "2025-10-21T16:23:12.106Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a0/00/8163a1beeb6971f66b4bbe6ac9457b97948beba8dd2fc8e1281dce7f79ec/grpcio-1.76.0-cp311-cp311-linux_armv7l.whl", hash = "sha256:2e1743fbd7f5fa713a1b0a8ac8ebabf0ec980b5d8809ec358d488e273b9cf02a", size = 5843567, upload-time = "2025-10-21T16:20:52.829Z" },
{ url = "https://files.pythonhosted.org/packages/10/c1/934202f5cf335e6d852530ce14ddb0fef21be612ba9ecbbcbd4d748ca32d/grpcio-1.76.0-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:a8c2cf1209497cf659a667d7dea88985e834c24b7c3b605e6254cbb5076d985c", size = 11848017, upload-time = "2025-10-21T16:20:56.705Z" },
{ url = "https://files.pythonhosted.org/packages/11/0b/8dec16b1863d74af6eb3543928600ec2195af49ca58b16334972f6775663/grpcio-1.76.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:08caea849a9d3c71a542827d6df9d5a69067b0a1efbea8a855633ff5d9571465", size = 6412027, upload-time = "2025-10-21T16:20:59.3Z" },
{ url = "https://files.pythonhosted.org/packages/d7/64/7b9e6e7ab910bea9d46f2c090380bab274a0b91fb0a2fe9b0cd399fffa12/grpcio-1.76.0-cp311-cp311-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:f0e34c2079d47ae9f6188211db9e777c619a21d4faba6977774e8fa43b085e48", size = 7075913, upload-time = "2025-10-21T16:21:01.645Z" },
{ url = "https://files.pythonhosted.org/packages/68/86/093c46e9546073cefa789bd76d44c5cb2abc824ca62af0c18be590ff13ba/grpcio-1.76.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8843114c0cfce61b40ad48df65abcfc00d4dba82eae8718fab5352390848c5da", size = 6615417, upload-time = "2025-10-21T16:21:03.844Z" },
{ url = "https://files.pythonhosted.org/packages/f7/b6/5709a3a68500a9c03da6fb71740dcdd5ef245e39266461a03f31a57036d8/grpcio-1.76.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8eddfb4d203a237da6f3cc8a540dad0517d274b5a1e9e636fd8d2c79b5c1d397", size = 7199683, upload-time = "2025-10-21T16:21:06.195Z" },
{ url = "https://files.pythonhosted.org/packages/91/d3/4b1f2bf16ed52ce0b508161df3a2d186e4935379a159a834cb4a7d687429/grpcio-1.76.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:32483fe2aab2c3794101c2a159070584e5db11d0aa091b2c0ea9c4fc43d0d749", size = 8163109, upload-time = "2025-10-21T16:21:08.498Z" },
{ url = "https://files.pythonhosted.org/packages/5c/61/d9043f95f5f4cf085ac5dd6137b469d41befb04bd80280952ffa2a4c3f12/grpcio-1.76.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:dcfe41187da8992c5f40aa8c5ec086fa3672834d2be57a32384c08d5a05b4c00", size = 7626676, upload-time = "2025-10-21T16:21:10.693Z" },
{ url = "https://files.pythonhosted.org/packages/36/95/fd9a5152ca02d8881e4dd419cdd790e11805979f499a2e5b96488b85cf27/grpcio-1.76.0-cp311-cp311-win32.whl", hash = "sha256:2107b0c024d1b35f4083f11245c0e23846ae64d02f40b2b226684840260ed054", size = 3997688, upload-time = "2025-10-21T16:21:12.746Z" },
{ url = "https://files.pythonhosted.org/packages/60/9c/5c359c8d4c9176cfa3c61ecd4efe5affe1f38d9bae81e81ac7186b4c9cc8/grpcio-1.76.0-cp311-cp311-win_amd64.whl", hash = "sha256:522175aba7af9113c48ec10cc471b9b9bd4f6ceb36aeb4544a8e2c80ed9d252d", size = 4709315, upload-time = "2025-10-21T16:21:15.26Z" },
{ url = "https://files.pythonhosted.org/packages/bf/05/8e29121994b8d959ffa0afd28996d452f291b48cfc0875619de0bde2c50c/grpcio-1.76.0-cp312-cp312-linux_armv7l.whl", hash = "sha256:81fd9652b37b36f16138611c7e884eb82e0cec137c40d3ef7c3f9b3ed00f6ed8", size = 5799718, upload-time = "2025-10-21T16:21:17.939Z" },
{ url = "https://files.pythonhosted.org/packages/d9/75/11d0e66b3cdf998c996489581bdad8900db79ebd83513e45c19548f1cba4/grpcio-1.76.0-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:04bbe1bfe3a68bbfd4e52402ab7d4eb59d72d02647ae2042204326cf4bbad280", size = 11825627, upload-time = "2025-10-21T16:21:20.466Z" },
{ url = "https://files.pythonhosted.org/packages/28/50/2f0aa0498bc188048f5d9504dcc5c2c24f2eb1a9337cd0fa09a61a2e75f0/grpcio-1.76.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d388087771c837cdb6515539f43b9d4bf0b0f23593a24054ac16f7a960be16f4", size = 6359167, upload-time = "2025-10-21T16:21:23.122Z" },
{ url = "https://files.pythonhosted.org/packages/66/e5/bbf0bb97d29ede1d59d6588af40018cfc345b17ce979b7b45424628dc8bb/grpcio-1.76.0-cp312-cp312-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:9f8f757bebaaea112c00dba718fc0d3260052ce714e25804a03f93f5d1c6cc11", size = 7044267, upload-time = "2025-10-21T16:21:25.995Z" },
{ url = "https://files.pythonhosted.org/packages/f5/86/f6ec2164f743d9609691115ae8ece098c76b894ebe4f7c94a655c6b03e98/grpcio-1.76.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:980a846182ce88c4f2f7e2c22c56aefd515daeb36149d1c897f83cf57999e0b6", size = 6573963, upload-time = "2025-10-21T16:21:28.631Z" },
{ url = "https://files.pythonhosted.org/packages/60/bc/8d9d0d8505feccfdf38a766d262c71e73639c165b311c9457208b56d92ae/grpcio-1.76.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:f92f88e6c033db65a5ae3d97905c8fea9c725b63e28d5a75cb73b49bda5024d8", size = 7164484, upload-time = "2025-10-21T16:21:30.837Z" },
{ url = "https://files.pythonhosted.org/packages/67/e6/5d6c2fc10b95edf6df9b8f19cf10a34263b7fd48493936fffd5085521292/grpcio-1.76.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:4baf3cbe2f0be3289eb68ac8ae771156971848bb8aaff60bad42005539431980", size = 8127777, upload-time = "2025-10-21T16:21:33.577Z" },
{ url = "https://files.pythonhosted.org/packages/3f/c8/dce8ff21c86abe025efe304d9e31fdb0deaaa3b502b6a78141080f206da0/grpcio-1.76.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:615ba64c208aaceb5ec83bfdce7728b80bfeb8be97562944836a7a0a9647d882", size = 7594014, upload-time = "2025-10-21T16:21:41.882Z" },
{ url = "https://files.pythonhosted.org/packages/e0/42/ad28191ebf983a5d0ecef90bab66baa5a6b18f2bfdef9d0a63b1973d9f75/grpcio-1.76.0-cp312-cp312-win32.whl", hash = "sha256:45d59a649a82df5718fd9527ce775fd66d1af35e6d31abdcdc906a49c6822958", size = 3984750, upload-time = "2025-10-21T16:21:44.006Z" },
{ url = "https://files.pythonhosted.org/packages/9e/00/7bd478cbb851c04a48baccaa49b75abaa8e4122f7d86da797500cccdd771/grpcio-1.76.0-cp312-cp312-win_amd64.whl", hash = "sha256:c088e7a90b6017307f423efbb9d1ba97a22aa2170876223f9709e9d1de0b5347", size = 4704003, upload-time = "2025-10-21T16:21:46.244Z" },
{ url = "https://files.pythonhosted.org/packages/fc/ed/71467ab770effc9e8cef5f2e7388beb2be26ed642d567697bb103a790c72/grpcio-1.76.0-cp313-cp313-linux_armv7l.whl", hash = "sha256:26ef06c73eb53267c2b319f43e6634c7556ea37672029241a056629af27c10e2", size = 5807716, upload-time = "2025-10-21T16:21:48.475Z" },
{ url = "https://files.pythonhosted.org/packages/2c/85/c6ed56f9817fab03fa8a111ca91469941fb514e3e3ce6d793cb8f1e1347b/grpcio-1.76.0-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:45e0111e73f43f735d70786557dc38141185072d7ff8dc1829d6a77ac1471468", size = 11821522, upload-time = "2025-10-21T16:21:51.142Z" },
{ url = "https://files.pythonhosted.org/packages/ac/31/2b8a235ab40c39cbc141ef647f8a6eb7b0028f023015a4842933bc0d6831/grpcio-1.76.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:83d57312a58dcfe2a3a0f9d1389b299438909a02db60e2f2ea2ae2d8034909d3", size = 6362558, upload-time = "2025-10-21T16:21:54.213Z" },
{ url = "https://files.pythonhosted.org/packages/bd/64/9784eab483358e08847498ee56faf8ff6ea8e0a4592568d9f68edc97e9e9/grpcio-1.76.0-cp313-cp313-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:3e2a27c89eb9ac3d81ec8835e12414d73536c6e620355d65102503064a4ed6eb", size = 7049990, upload-time = "2025-10-21T16:21:56.476Z" },
{ url = "https://files.pythonhosted.org/packages/2b/94/8c12319a6369434e7a184b987e8e9f3b49a114c489b8315f029e24de4837/grpcio-1.76.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:61f69297cba3950a524f61c7c8ee12e55c486cb5f7db47ff9dcee33da6f0d3ae", size = 6575387, upload-time = "2025-10-21T16:21:59.051Z" },
{ url = "https://files.pythonhosted.org/packages/15/0f/f12c32b03f731f4a6242f771f63039df182c8b8e2cf8075b245b409259d4/grpcio-1.76.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6a15c17af8839b6801d554263c546c69c4d7718ad4321e3166175b37eaacca77", size = 7166668, upload-time = "2025-10-21T16:22:02.049Z" },
{ url = "https://files.pythonhosted.org/packages/ff/2d/3ec9ce0c2b1d92dd59d1c3264aaec9f0f7c817d6e8ac683b97198a36ed5a/grpcio-1.76.0-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:25a18e9810fbc7e7f03ec2516addc116a957f8cbb8cbc95ccc80faa072743d03", size = 8124928, upload-time = "2025-10-21T16:22:04.984Z" },
{ url = "https://files.pythonhosted.org/packages/1a/74/fd3317be5672f4856bcdd1a9e7b5e17554692d3db9a3b273879dc02d657d/grpcio-1.76.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:931091142fd8cc14edccc0845a79248bc155425eee9a98b2db2ea4f00a235a42", size = 7589983, upload-time = "2025-10-21T16:22:07.881Z" },
{ url = "https://files.pythonhosted.org/packages/45/bb/ca038cf420f405971f19821c8c15bcbc875505f6ffadafe9ffd77871dc4c/grpcio-1.76.0-cp313-cp313-win32.whl", hash = "sha256:5e8571632780e08526f118f74170ad8d50fb0a48c23a746bef2a6ebade3abd6f", size = 3984727, upload-time = "2025-10-21T16:22:10.032Z" },
{ url = "https://files.pythonhosted.org/packages/41/80/84087dc56437ced7cdd4b13d7875e7439a52a261e3ab4e06488ba6173b0a/grpcio-1.76.0-cp313-cp313-win_amd64.whl", hash = "sha256:f9f7bd5faab55f47231ad8dba7787866b69f5e93bc306e3915606779bbfb4ba8", size = 4702799, upload-time = "2025-10-21T16:22:12.709Z" },
{ url = "https://files.pythonhosted.org/packages/b4/46/39adac80de49d678e6e073b70204091e76631e03e94928b9ea4ecf0f6e0e/grpcio-1.76.0-cp314-cp314-linux_armv7l.whl", hash = "sha256:ff8a59ea85a1f2191a0ffcc61298c571bc566332f82e5f5be1b83c9d8e668a62", size = 5808417, upload-time = "2025-10-21T16:22:15.02Z" },
{ url = "https://files.pythonhosted.org/packages/9c/f5/a4531f7fb8b4e2a60b94e39d5d924469b7a6988176b3422487be61fe2998/grpcio-1.76.0-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:06c3d6b076e7b593905d04fdba6a0525711b3466f43b3400266f04ff735de0cd", size = 11828219, upload-time = "2025-10-21T16:22:17.954Z" },
{ url = "https://files.pythonhosted.org/packages/4b/1c/de55d868ed7a8bd6acc6b1d6ddc4aa36d07a9f31d33c912c804adb1b971b/grpcio-1.76.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:fd5ef5932f6475c436c4a55e4336ebbe47bd3272be04964a03d316bbf4afbcbc", size = 6367826, upload-time = "2025-10-21T16:22:20.721Z" },
{ url = "https://files.pythonhosted.org/packages/59/64/99e44c02b5adb0ad13ab3adc89cb33cb54bfa90c74770f2607eea629b86f/grpcio-1.76.0-cp314-cp314-manylinux2014_i686.manylinux_2_17_i686.whl", hash = "sha256:b331680e46239e090f5b3cead313cc772f6caa7d0fc8de349337563125361a4a", size = 7049550, upload-time = "2025-10-21T16:22:23.637Z" },
{ url = "https://files.pythonhosted.org/packages/43/28/40a5be3f9a86949b83e7d6a2ad6011d993cbe9b6bd27bea881f61c7788b6/grpcio-1.76.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2229ae655ec4e8999599469559e97630185fdd53ae1e8997d147b7c9b2b72cba", size = 6575564, upload-time = "2025-10-21T16:22:26.016Z" },
{ url = "https://files.pythonhosted.org/packages/4b/a9/1be18e6055b64467440208a8559afac243c66a8b904213af6f392dc2212f/grpcio-1.76.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:490fa6d203992c47c7b9e4a9d39003a0c2bcc1c9aa3c058730884bbbb0ee9f09", size = 7176236, upload-time = "2025-10-21T16:22:28.362Z" },
{ url = "https://files.pythonhosted.org/packages/0f/55/dba05d3fcc151ce6e81327541d2cc8394f442f6b350fead67401661bf041/grpcio-1.76.0-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:479496325ce554792dba6548fae3df31a72cef7bad71ca2e12b0e58f9b336bfc", size = 8125795, upload-time = "2025-10-21T16:22:31.075Z" },
{ url = "https://files.pythonhosted.org/packages/4a/45/122df922d05655f63930cf42c9e3f72ba20aadb26c100ee105cad4ce4257/grpcio-1.76.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:1c9b93f79f48b03ada57ea24725d83a30284a012ec27eab2cf7e50a550cbbbcc", size = 7592214, upload-time = "2025-10-21T16:22:33.831Z" },
{ url = "https://files.pythonhosted.org/packages/4a/6e/0b899b7f6b66e5af39e377055fb4a6675c9ee28431df5708139df2e93233/grpcio-1.76.0-cp314-cp314-win32.whl", hash = "sha256:747fa73efa9b8b1488a95d0ba1039c8e2dca0f741612d80415b1e1c560febf4e", size = 4062961, upload-time = "2025-10-21T16:22:36.468Z" },
{ url = "https://files.pythonhosted.org/packages/19/41/0b430b01a2eb38ee887f88c1f07644a1df8e289353b78e82b37ef988fb64/grpcio-1.76.0-cp314-cp314-win_amd64.whl", hash = "sha256:922fa70ba549fce362d2e2871ab542082d66e2aaf0c19480ea453905b01f384e", size = 4834462, upload-time = "2025-10-21T16:22:39.772Z" },
]
[[package]]
name = "grpcio-status"
version = "1.76.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "googleapis-common-protos" },
{ name = "grpcio" },
{ name = "protobuf" },
]
sdist = { url = "https://files.pythonhosted.org/packages/3f/46/e9f19d5be65e8423f886813a2a9d0056ba94757b0c5007aa59aed1a961fa/grpcio_status-1.76.0.tar.gz", hash = "sha256:25fcbfec74c15d1a1cb5da3fab8ee9672852dc16a5a9eeb5baf7d7a9952943cd", size = 13679, upload-time = "2025-10-21T16:28:52.545Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8c/cc/27ba60ad5a5f2067963e6a858743500df408eb5855e98be778eaef8c9b02/grpcio_status-1.76.0-py3-none-any.whl", hash = "sha256:380568794055a8efbbd8871162df92012e0228a5f6dffaf57f2a00c534103b18", size = 14425, upload-time = "2025-10-21T16:28:40.853Z" },
]
[[package]]
name = "h11"
version = "0.16.0"
@@ -4207,6 +4417,7 @@ dependencies = [
{ name = "cohere" },
{ name = "discord-py" },
{ name = "fastapi" },
{ name = "google-cloud-aiplatform" },
{ name = "google-genai" },
{ name = "kubernetes" },
{ name = "litellm" },
@@ -4411,6 +4622,7 @@ requires-dist = [
{ name = "google-api-python-client", marker = "extra == 'backend'", specifier = "==2.86.0" },
{ name = "google-auth-httplib2", marker = "extra == 'backend'", specifier = "==0.1.0" },
{ name = "google-auth-oauthlib", marker = "extra == 'backend'", specifier = "==1.0.0" },
{ name = "google-cloud-aiplatform", specifier = "==1.121.0" },
{ name = "google-genai", specifier = "==1.52.0" },
{ name = "hatchling", marker = "extra == 'dev'", specifier = "==1.28.0" },
{ name = "httpcore", marker = "extra == 'backend'", specifier = "==1.0.9" },
@@ -4443,7 +4655,7 @@ requires-dist = [
{ name = "numpy", marker = "extra == 'model-server'", specifier = "==2.4.1" },
{ name = "oauthlib", marker = "extra == 'backend'", specifier = "==3.2.2" },
{ name = "office365-rest-python-client", marker = "extra == 'backend'", specifier = "==2.6.2" },
{ name = "onyx-devtools", marker = "extra == 'dev'", specifier = "==0.6.3" },
{ name = "onyx-devtools", marker = "extra == 'dev'", specifier = "==0.6.2" },
{ name = "openai", specifier = "==2.14.0" },
{ name = "openapi-generator-cli", marker = "extra == 'dev'", specifier = "==7.17.0" },
{ name = "openinference-instrumentation", marker = "extra == 'backend'", specifier = "==0.1.42" },
@@ -4548,20 +4760,20 @@ requires-dist = [{ name = "onyx", extras = ["backend", "dev", "ee"], editable =
[[package]]
name = "onyx-devtools"
version = "0.6.3"
version = "0.6.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "fastapi" },
{ name = "openapi-generator-cli" },
]
wheels = [
{ url = "https://files.pythonhosted.org/packages/84/e2/e7619722c3ccd18eb38100f776fb3dd6b4ae0fbbee09fca5af7c69a279b5/onyx_devtools-0.6.3-py3-none-any.whl", hash = "sha256:d3a5422945d9da12cafc185f64b39f6e727ee4cc92b37427deb7a38f9aad4966", size = 3945381, upload-time = "2026-03-05T20:39:25.896Z" },
{ url = "https://files.pythonhosted.org/packages/f2/09/513d2dabedc1e54ad4376830fc9b34a3d9c164bdbcdedfcdbb8b8154dc5a/onyx_devtools-0.6.3-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:efe300e9f3a2e7ae75f88a4f9e0a5c4c471478296cb1615b6a1f03d247582e13", size = 3978761, upload-time = "2026-03-05T20:39:28.822Z" },
{ url = "https://files.pythonhosted.org/packages/39/41/e757602a0de032d74ed01c7ee57f30e57728fb9cd4f922f50d2affda3889/onyx_devtools-0.6.3-py3-none-macosx_11_0_arm64.whl", hash = "sha256:594066eed3f917cfab5a8c7eac3d4a210df30259f2049f664787749709345e19", size = 3665378, upload-time = "2026-03-05T20:44:22.696Z" },
{ url = "https://files.pythonhosted.org/packages/33/1c/c93b65d0b32e202596a2647922a75c7011cb982f899ddfcfd171f792c58f/onyx_devtools-0.6.3-py3-none-manylinux_2_17_aarch64.whl", hash = "sha256:384ef66030b55c0fd68b3898782b5b4b868ff3de119569dfc8544e2ce534b98a", size = 3540890, upload-time = "2026-03-05T20:39:28.886Z" },
{ url = "https://files.pythonhosted.org/packages/f4/33/760eb656013f7f0cdff24570480d3dc4e52bbd8e6147ea1e8cf6fad7554f/onyx_devtools-0.6.3-py3-none-manylinux_2_17_x86_64.whl", hash = "sha256:82e218f3a49f64910c2c4c34d5dc12d1ea1520a27e0b0f6e4c0949ff9abaf0e1", size = 3945396, upload-time = "2026-03-05T20:39:34.323Z" },
{ url = "https://files.pythonhosted.org/packages/1a/eb/f54b3675c464df8a51194ff75afc97c2417659e3a209dc46948b47c28860/onyx_devtools-0.6.3-py3-none-win_amd64.whl", hash = "sha256:8af614ae7229290ef2417cb85270184a1e826ed9a3a34658da93851edb36df57", size = 4045936, upload-time = "2026-03-05T20:39:28.375Z" },
{ url = "https://files.pythonhosted.org/packages/04/b8/5bee38e748f3d4b8ec935766224db1bbc1214c91092e5822c080fccd9130/onyx_devtools-0.6.3-py3-none-win_arm64.whl", hash = "sha256:717589db4b42528d33ae96f8006ee6aad3555034dcfee724705b6576be6a6ec4", size = 3608268, upload-time = "2026-03-05T20:39:28.731Z" },
{ url = "https://files.pythonhosted.org/packages/cc/20/d9f6089616044b0fb6e097cbae82122de24f3acd97820be4868d5c28ee3f/onyx_devtools-0.6.2-py3-none-any.whl", hash = "sha256:e48d14695d39d62ec3247a4c76ea56604bc5fb635af84c4ff3e9628bcc67b4fb", size = 3785941, upload-time = "2026-02-25T22:33:43.585Z" },
{ url = "https://files.pythonhosted.org/packages/d6/f5/f754a717f6b011050eb52ef09895cfa2f048f567f4aa3d5e0f773657dea4/onyx_devtools-0.6.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:505f9910a04868ab62d99bb483dc37c9f4ad94fa80e6ac0e6a10b86351c31420", size = 3832182, upload-time = "2026-02-25T22:33:43.283Z" },
{ url = "https://files.pythonhosted.org/packages/6a/35/6e653398c62078e87ebb0d03dc944df6691d92ca427c92867309d2d803b7/onyx_devtools-0.6.2-py3-none-macosx_11_0_arm64.whl", hash = "sha256:edec98e3acc0fa22cf9102c2070409ea7bcf99d7ded72bd8cb184ece8171c36a", size = 3576948, upload-time = "2026-02-25T22:33:42.962Z" },
{ url = "https://files.pythonhosted.org/packages/3c/97/cff707c5c3d2acd714365b1023f0100676abc99816a29558319e8ef01d5f/onyx_devtools-0.6.2-py3-none-manylinux_2_17_aarch64.whl", hash = "sha256:97abab61216866cdccd8c0a7e27af328776083756ce4fb57c4bd723030449e3b", size = 3439359, upload-time = "2026-02-25T22:33:44.684Z" },
{ url = "https://files.pythonhosted.org/packages/fc/98/3b768d18e5599178834b966b447075626d224e048d6eb264d89d19abacb4/onyx_devtools-0.6.2-py3-none-manylinux_2_17_x86_64.whl", hash = "sha256:681b038ab6f1457409d14b2490782c7a8014fc0f0f1b9cd69bb2b7199f99aef1", size = 3785959, upload-time = "2026-02-25T22:33:44.342Z" },
{ url = "https://files.pythonhosted.org/packages/d6/38/9b047f9e61c14ccf22b8f386c7a57da3965f90737453f3a577a97da45cdf/onyx_devtools-0.6.2-py3-none-win_amd64.whl", hash = "sha256:a2063be6be104b50a7538cf0d26c7f7ab9159d53327dd6f3e91db05d793c95f3", size = 3878776, upload-time = "2026-02-25T22:33:45.229Z" },
{ url = "https://files.pythonhosted.org/packages/9d/0f/742f644bae84f5f8f7b500094a2f58da3ff8027fc739944622577e2e2850/onyx_devtools-0.6.2-py3-none-win_arm64.whl", hash = "sha256:00fb90a49a15c932b5cacf818b1b4918e5b5c574bde243dc1828b57690dd5046", size = 3501112, upload-time = "2026-02-25T22:33:41.512Z" },
]
[[package]]


@@ -99,7 +99,6 @@ export { default as SvgLineChartUp } from "@opal/icons/line-chart-up";
export { default as SvgLink } from "@opal/icons/link";
export { default as SvgLinkedDots } from "@opal/icons/linked-dots";
export { default as SvgLitellm } from "@opal/icons/litellm";
export { default as SvgLmStudio } from "@opal/icons/lm-studio";
export { default as SvgLoader } from "@opal/icons/loader";
export { default as SvgLock } from "@opal/icons/lock";
export { default as SvgLogOut } from "@opal/icons/log-out";


@@ -1,141 +0,0 @@
import React from "react";
import type { IconProps } from "@opal/types";
const SvgLmStudio = ({ size, ...props }: IconProps) => {
const gradientId = React.useId();
return (
<svg
width={size}
height={size}
viewBox="0 0 480 480"
fill="none"
xmlns="http://www.w3.org/2000/svg"
{...props}
>
<rect width={480} height={480} rx={96} fill={`url(#${gradientId})`} />
<rect
opacity={0.25}
x={128}
y={80}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.9}
x={64}
y={80}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.25}
x={208}
y={136}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.9}
x={144}
y={136}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.25}
x={160}
y={192}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.9}
x={96}
y={192}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.25}
x={104}
y={248}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.9}
x={40}
y={248}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.25}
x={160}
y={304}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.9}
x={96}
y={304}
width={208}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.25}
x={296}
y={360}
width={136}
height={40}
rx={20}
fill="white"
/>
<rect
opacity={0.9}
x={224}
y={360}
width={144}
height={40}
rx={20}
fill="white"
/>
<defs>
<linearGradient
id={gradientId}
x1={-206.055}
y1={215.087}
x2={224.119}
y2={658.689}
gradientUnits="userSpaceOnUse"
>
<stop stopColor="#6E7EF3" />
<stop offset={1} stopColor="#4F13BE" />
</linearGradient>
</defs>
</svg>
);
};
export default SvgLmStudio;

web/package-lock.json generated

@@ -48,7 +48,6 @@
"cmdk": "^1.0.0",
"cookies-next": "^5.1.0",
"date-fns": "^3.6.0",
"docx-preview": "^0.3.7",
"favicon-fetch": "^1.0.0",
"formik": "^2.2.9",
"highlight.js": "^11.11.1",
@@ -7959,15 +7958,6 @@
"node": ">=0.10.0"
}
},
"node_modules/docx-preview": {
"version": "0.3.7",
"resolved": "https://registry.npmjs.org/docx-preview/-/docx-preview-0.3.7.tgz",
"integrity": "sha512-Lav69CTA/IYZPJTsKH7oYeoZjyg96N0wEJMNslGJnZJ+dMUZK85Lt5ASC79yUlD48ecWjuv+rkcmFt6EVPV0Xg==",
"license": "Apache-2.0",
"dependencies": {
"jszip": ">=3.0.0"
}
},
"node_modules/dom-accessibility-api": {
"version": "0.6.3",
"dev": true,
@@ -13904,6 +13894,14 @@
],
"license": "MIT"
},
"node_modules/randombytes": {
"version": "2.1.0",
"license": "MIT",
"peer": true,
"dependencies": {
"safe-buffer": "^5.1.0"
}
},
"node_modules/react": {
"version": "19.2.4",
"resolved": "https://registry.npmjs.org/react/-/react-19.2.4.tgz",
@@ -14600,6 +14598,25 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/safe-buffer": {
"version": "5.2.1",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/feross"
},
{
"type": "patreon",
"url": "https://www.patreon.com/feross"
},
{
"type": "consulting",
"url": "https://feross.org/support"
}
],
"license": "MIT",
"peer": true
},
"node_modules/safe-push-apply": {
"version": "1.0.0",
"dev": true,
@@ -14679,6 +14696,14 @@
"node": ">=10"
}
},
"node_modules/serialize-javascript": {
"version": "6.0.2",
"license": "BSD-3-Clause",
"peer": true,
"dependencies": {
"randombytes": "^2.1.0"
}
},
"node_modules/set-function-length": {
"version": "1.2.2",
"dev": true,
@@ -15486,15 +15511,16 @@
}
},
"node_modules/terser-webpack-plugin": {
"version": "5.3.17",
"resolved": "https://registry.npmjs.org/terser-webpack-plugin/-/terser-webpack-plugin-5.3.17.tgz",
"integrity": "sha512-YR7PtUp6GMU91BgSJmlaX/rS2lGDbAF7D+Wtq7hRO+MiljNmodYvqslzCFiYVAgW+Qoaaia/QUIP4lGXufjdZw==",
"version": "5.3.16",
"resolved": "https://registry.npmjs.org/terser-webpack-plugin/-/terser-webpack-plugin-5.3.16.tgz",
"integrity": "sha512-h9oBFCWrq78NyWWVcSwZarJkZ01c2AyGrzs1crmHZO3QUg9D61Wu4NPjBy69n7JqylFF5y+CsUZYmYEIZ3mR+Q==",
"license": "MIT",
"peer": true,
"dependencies": {
"@jridgewell/trace-mapping": "^0.3.25",
"jest-worker": "^27.4.5",
"schema-utils": "^4.3.0",
"serialize-javascript": "^6.0.2",
"terser": "^5.31.1"
},
"engines": {


@@ -64,7 +64,6 @@
"cmdk": "^1.0.0",
"cookies-next": "^5.1.0",
"date-fns": "^3.6.0",
"docx-preview": "^0.3.7",
"favicon-fetch": "^1.0.0",
"formik": "^2.2.9",
"highlight.js": "^11.11.1",

Some files were not shown because too many files have changed in this diff