Compare commits

..

4 Commits

Author SHA1 Message Date
Raunak Bhagat
22903c9343 feat: move SidebarTab to Opal, add disabled styling for sidebar variants
- Add [data-disabled] CSS rules for sidebar-heavy and sidebar-light
  variants (text-02 foreground, transparent background, no opacity)
- Move SidebarTab from refresh-components to @opal/components with
  disabled prop, docstrings, stories, and README
- Old file re-exports from Opal for backwards compatibility
2026-04-02 10:25:23 -07:00
Raunak Bhagat
17b0d19faf docs: update Disabled section in AGENTS.md 2026-04-02 09:51:22 -07:00
Raunak Bhagat
3b6955468e docs: update Disabled and Interactive.Simple docstrings
Disabled is now a pure CSS wrapper with no React context. Remove
"use client" directive, update CLAUDE.md description, and simplify
Interactive.Simple's docstring.
2026-04-02 09:51:22 -07:00
Raunak Bhagat
9c091ecd45 refactor: remove Disabled context, add disabled prop to all Interactive primitives
- Add `disabled` prop to Interactive.Stateful and Interactive.Simple
- Remove useDisabled() from Interactive.Container (derives from aria-disabled)
- Add `disabled` prop to OpenButton and SelectButton
- Migrate callsites: SharedAppInputBar, AppInputBar, InputBar, LLMPopover
- Strip context, useDisabled, and DisabledContextValue from Disabled component
- Disabled is now a pure CSS wrapper with no React context
2026-04-02 09:51:22 -07:00
66 changed files with 754 additions and 2232 deletions


@@ -1 +0,0 @@
../../../cli/internal/embedded/SKILL.md


@@ -0,0 +1,186 @@
---
name: onyx-cli
description: Query the Onyx knowledge base using the onyx-cli command. Use when the user wants to search company documents, ask questions about internal knowledge, query connected data sources, or look up information stored in Onyx.
---
# Onyx CLI — Agent Tool
Onyx is an enterprise search and Gen-AI platform that connects to company documents, apps, and people. The `onyx-cli` CLI provides non-interactive commands to query the Onyx knowledge base and list available agents.
## Prerequisites
### 1. Check if installed
```bash
which onyx-cli
```
### 2. Install (if needed)
**Primary — pip:**
```bash
pip install onyx-cli
```
**From source (Go):**
```bash
cd cli && go build -o onyx-cli . && sudo mv onyx-cli /usr/local/bin/
```
### 3. Check if configured
```bash
onyx-cli validate-config
```
This checks that the config file exists and an API key is present, then tests the server connection via `/api/me`. Exit code 0 on success, non-zero with a descriptive error on failure.
If unconfigured, you have two options:
**Option A — Interactive setup (requires user input):**
```bash
onyx-cli configure
```
This prompts for the Onyx server URL and API key, tests the connection, and saves config.
**Option B — Environment variables (non-interactive, preferred for agents):**
```bash
export ONYX_SERVER_URL="https://your-onyx-server.com" # default: https://cloud.onyx.app
export ONYX_API_KEY="your-api-key"
```
Environment variables override the config file. If these are set, no config file is needed.
| Variable | Required | Description |
|----------|----------|-------------|
| `ONYX_SERVER_URL` | No | Onyx server base URL (default: `https://cloud.onyx.app`) |
| `ONYX_API_KEY` | Yes | API key for authentication |
| `ONYX_PERSONA_ID` | No | Default agent/persona ID |
If neither the config file nor environment variables are set, tell the user that `onyx-cli` needs to be configured and ask them to either:
- Run `onyx-cli configure` interactively, or
- Set `ONYX_SERVER_URL` and `ONYX_API_KEY` environment variables
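The precedence rules above (environment variables override the config file, with `https://cloud.onyx.app` as the fallback server) can be sketched in a few lines of Python. The config file path and JSON key names here are illustrative assumptions, not the CLI's actual on-disk layout:

```python
import json
import os
import pathlib

DEFAULT_SERVER_URL = "https://cloud.onyx.app"

def resolve_config(config_path: str = "~/.onyx/config.json") -> tuple[str, str]:
    """Resolve (server_url, api_key), env vars taking precedence.

    The config path and key names are assumptions for illustration.
    """
    file_cfg: dict = {}
    path = pathlib.Path(config_path).expanduser()
    if path.exists():
        file_cfg = json.loads(path.read_text())
    # Environment variables override the config file.
    server_url = (
        os.environ.get("ONYX_SERVER_URL")
        or file_cfg.get("server_url")
        or DEFAULT_SERVER_URL
    )
    api_key = os.environ.get("ONYX_API_KEY") or file_cfg.get("api_key")
    if not api_key:
        raise RuntimeError(
            "onyx-cli is not configured: set ONYX_API_KEY "
            "or run `onyx-cli configure`"
        )
    return server_url, api_key
```

If neither source yields an API key, the sketch raises, mirroring the guidance above to ask the user to configure the CLI.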
## Commands
### Validate configuration
```bash
onyx-cli validate-config
```
Checks that the config file exists and an API key is present, then tests the server connection. Use this before `ask` or `agents` to confirm the CLI is properly set up.
### List available agents
```bash
onyx-cli agents
```
Prints a table of agent IDs, names, and descriptions. Use `--json` for structured output:
```bash
onyx-cli agents --json
```
Use agent IDs with `ask --agent-id` to query a specific agent.
### Basic query (plain text output)
```bash
onyx-cli ask "What is our company's PTO policy?"
```
Streams the answer as plain text to stdout. Exit code 0 on success, non-zero on error.
### JSON output (structured events)
```bash
onyx-cli ask --json "What authentication methods do we support?"
```
Outputs parsed stream events as NDJSON (one JSON object per line). Key event types include message deltas, stream completion, errors, search start, and citations.
Each line is a JSON object with this envelope:
```json
{"type": "<event_type>", "event": { ... }}
```
| Event Type | Description |
|------------|-------------|
| `message_delta` | Content token — concatenate all `content` fields for the full answer |
| `stop` | Stream complete |
| `error` | Error with `error` message field |
| `search_tool_start` | Onyx started searching documents |
| `citation_info` | Source citation — see shape below |
`citation_info` event shape:
```json
{
"type": "citation_info",
"event": {
"citation_number": 1,
"document_id": "abc123def456",
"placement": {"turn_index": 0, "tab_index": 0, "sub_turn_index": null}
}
}
```
`placement` is metadata about where in the conversation the citation appeared and can be ignored for most use cases.
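The event stream above can be assembled into a final answer with a small parser. This is a minimal sketch against the documented envelope; it relies only on the event types and fields listed above:

```python
import json

def parse_ask_json_stream(lines):
    """Assemble the answer and citations from `onyx-cli ask --json` output.

    `lines` is an iterable of NDJSON lines using the documented
    {"type": ..., "event": {...}} envelope.
    """
    answer_parts: list[str] = []
    citations: list[dict] = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        obj = json.loads(line)
        etype, event = obj["type"], obj.get("event") or {}
        if etype == "message_delta":
            # Concatenate content tokens to build the full answer.
            answer_parts.append(event.get("content", ""))
        elif etype == "citation_info":
            citations.append(event)
        elif etype == "error":
            raise RuntimeError(event.get("error", "unknown error"))
        elif etype == "stop":
            break
    return "".join(answer_parts), citations
```

In practice you would feed this the stdout lines of an `onyx-cli ask --json ...` invocation; unknown event types are simply ignored.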
### Specify an agent
```bash
onyx-cli ask --agent-id 5 "Summarize our Q4 roadmap"
```
Uses a specific Onyx agent/persona instead of the default.
### All flags
| Flag | Type | Description |
|------|------|-------------|
| `--agent-id` | int | Agent ID to use (overrides default) |
| `--json` | bool | Output raw NDJSON events instead of plain text |
## Statelessness
Each `onyx-cli ask` call creates an independent chat session. There is no built-in way to chain context across multiple `ask` invocations — every call starts fresh. If you need multi-turn conversation with memory, use the interactive TUI (`onyx-cli` or `onyx-cli chat`) instead.
## When to Use
Use `onyx-cli ask` when:
- The user asks about company-specific information (policies, docs, processes)
- You need to search internal knowledge bases or connected data sources
- The user references Onyx, asks you to "search Onyx", or wants to query their documents
- You need context from company wikis, Confluence, Google Drive, Slack, or other connected sources
Do NOT use when:
- The question is about general programming knowledge (use your own knowledge)
- The user is asking about code in the current repository (use grep/read tools)
- The user hasn't mentioned Onyx and the question doesn't require internal company data
## Examples
```bash
# Simple question
onyx-cli ask "What are the steps to deploy to production?"
# Get structured output for parsing
onyx-cli ask --json "List all active API integrations"
# Use a specialized agent
onyx-cli ask --agent-id 3 "What were the action items from last week's standup?"
# Pipe the answer into another command
onyx-cli ask "What is the database schema for users?" | head -20
```


@@ -1,14 +1,20 @@
from datetime import datetime
from datetime import timezone
from uuid import UUID
from celery import shared_task
from celery import Task
from ee.onyx.background.celery_utils import should_perform_chat_ttl_check
from ee.onyx.background.task_name_builders import name_chat_ttl_task
from onyx.configs.app_configs import JOB_TIMEOUT
from onyx.configs.constants import OnyxCeleryTask
from onyx.db.chat import delete_chat_session
from onyx.db.chat import get_chat_sessions_older_than
from onyx.db.engine.sql_engine import get_session_with_current_tenant
from onyx.db.enums import TaskStatus
from onyx.db.tasks import mark_task_as_finished_with_id
from onyx.db.tasks import register_task
from onyx.server.settings.store import load_settings
from onyx.utils.logger import setup_logger
@@ -23,42 +29,59 @@ logger = setup_logger()
trail=False,
)
def perform_ttl_management_task(
self: Task, retention_limit_days: int, *, tenant_id: str # noqa: ARG001
self: Task, retention_limit_days: int, *, tenant_id: str
) -> None:
task_id = self.request.id
if not task_id:
raise RuntimeError("No task id defined for this task; cannot identify it")
start_time = datetime.now(tz=timezone.utc)
user_id: UUID | None = None
session_id: UUID | None = None
try:
with get_session_with_current_tenant() as db_session:
# we generally want to move off this, but keeping for now
register_task(
db_session=db_session,
task_name=name_chat_ttl_task(retention_limit_days, tenant_id),
task_id=task_id,
status=TaskStatus.STARTED,
start_time=start_time,
)
old_chat_sessions = get_chat_sessions_older_than(
retention_limit_days, db_session
)
for user_id, session_id in old_chat_sessions:
try:
with get_session_with_current_tenant() as db_session:
delete_chat_session(
user_id,
session_id,
db_session,
include_deleted=True,
hard_delete=True,
)
except Exception:
logger.exception(
"Failed to delete chat session "
f"user_id={user_id} session_id={session_id}, "
"continuing with remaining sessions"
# one session per delete so that we don't blow up if a deletion fails.
with get_session_with_current_tenant() as db_session:
delete_chat_session(
user_id,
session_id,
db_session,
include_deleted=True,
hard_delete=True,
)
with get_session_with_current_tenant() as db_session:
mark_task_as_finished_with_id(
db_session=db_session,
task_id=task_id,
success=True,
)
except Exception:
logger.exception(
f"delete_chat_session exceptioned. user_id={user_id} session_id={session_id}"
)
with get_session_with_current_tenant() as db_session:
mark_task_as_finished_with_id(
db_session=db_session,
task_id=task_id,
success=False,
)
raise


@@ -36,7 +36,6 @@ from onyx.configs.constants import OnyxRedisLocks
from onyx.db.engine.sql_engine import get_session_with_current_tenant
from onyx.db.opensearch_migration import build_sanitized_to_original_doc_id_mapping
from onyx.db.opensearch_migration import get_vespa_visit_state
from onyx.db.opensearch_migration import is_migration_completed
from onyx.db.opensearch_migration import (
mark_migration_completed_time_if_not_set_with_commit,
)
@@ -107,19 +106,14 @@ def migrate_chunks_from_vespa_to_opensearch_task(
acquired; effectively a no-op. True if the task completed
successfully. False if the task errored.
"""
# 1. Check if we should run the task.
# 1.a. If OpenSearch indexing is disabled, we don't run the task.
if not ENABLE_OPENSEARCH_INDEXING_FOR_ONYX:
task_logger.warning(
"OpenSearch migration is not enabled, skipping chunk migration task."
)
return None
task_logger.info("Starting chunk-level migration from Vespa to OpenSearch.")
task_start_time = time.monotonic()
# 1.b. Only one instance of this task may run per tenant at a time. If we
# fail to acquire the lock, we assume another task holds it and we exit.
r = get_redis_client()
lock: RedisLock = r.lock(
name=OnyxRedisLocks.OPENSEARCH_MIGRATION_BEAT_LOCK,
@@ -142,11 +136,10 @@ def migrate_chunks_from_vespa_to_opensearch_task(
f"Token: {lock.local.token}"
)
# 2. Prepare to migrate.
total_chunks_migrated_this_task = 0
total_chunks_errored_this_task = 0
try:
# 2.a. Double-check that tenant info is correct.
# Double check that tenant info is correct.
if tenant_id != get_current_tenant_id():
err_str = (
f"Tenant ID mismatch in the OpenSearch migration task: "
@@ -155,62 +148,16 @@ def migrate_chunks_from_vespa_to_opensearch_task(
task_logger.error(err_str)
return False
# Do as much as we can with a DB session in one spot to not hold a
# session during a migration batch.
with get_session_with_current_tenant() as db_session:
# 2.b. Immediately check to see if this tenant is done, to save
# having to do any other work. This function does not require a
# migration record to necessarily exist.
if is_migration_completed(db_session):
return True
# 2.c. Try to insert the OpenSearchTenantMigrationRecord table if it
# does not exist.
with (
get_session_with_current_tenant() as db_session,
get_vespa_http_client(
timeout=VESPA_MIGRATION_REQUEST_TIMEOUT_S
) as vespa_client,
):
try_insert_opensearch_tenant_migration_record_with_commit(db_session)
# 2.d. Get search settings.
search_settings = get_current_search_settings(db_session)
indexing_setting = IndexingSetting.from_db_model(search_settings)
# 2.e. Build sanitized to original doc ID mapping to check for
# conflicts in the event we sanitize a doc ID to an
# already-existing doc ID.
# We reconstruct this mapping for every task invocation because
# a document may have been added in the time between two tasks.
sanitized_doc_start_time = time.monotonic()
sanitized_to_original_doc_id_mapping = (
build_sanitized_to_original_doc_id_mapping(db_session)
)
task_logger.debug(
f"Built sanitized_to_original_doc_id_mapping with {len(sanitized_to_original_doc_id_mapping)} entries "
f"in {time.monotonic() - sanitized_doc_start_time:.3f} seconds."
)
# 2.f. Get the current migration state.
continuation_token_map, total_chunks_migrated = get_vespa_visit_state(
db_session
)
# 2.f.1. Double-check that the migration state does not imply
# completion. Really we should never have to enter this block as we
# would expect is_migration_completed to return True, but in the
# strange event that the migration is complete but the migration
# completed time was never stamped, we do so here.
if is_continuation_token_done_for_all_slices(continuation_token_map):
task_logger.info(
f"OpenSearch migration COMPLETED for tenant {tenant_id}. Total chunks migrated: {total_chunks_migrated}."
)
mark_migration_completed_time_if_not_set_with_commit(db_session)
return True
task_logger.debug(
f"Read the tenant migration record. Total chunks migrated: {total_chunks_migrated}. "
f"Continuation token map: {continuation_token_map}"
)
with get_vespa_http_client(
timeout=VESPA_MIGRATION_REQUEST_TIMEOUT_S
) as vespa_client:
# 2.g. Create the OpenSearch and Vespa document indexes.
tenant_state = TenantState(tenant_id=tenant_id, multitenant=MULTI_TENANT)
indexing_setting = IndexingSetting.from_db_model(search_settings)
opensearch_document_index = OpenSearchDocumentIndex(
tenant_state=tenant_state,
index_name=search_settings.index_name,
@@ -224,14 +171,22 @@ def migrate_chunks_from_vespa_to_opensearch_task(
httpx_client=vespa_client,
)
# 2.h. Get the approximate chunk count in Vespa as of this time to
# update the migration record.
sanitized_doc_start_time = time.monotonic()
# We reconstruct this mapping for every task invocation because a
# document may have been added in the time between two tasks.
sanitized_to_original_doc_id_mapping = (
build_sanitized_to_original_doc_id_mapping(db_session)
)
task_logger.debug(
f"Built sanitized_to_original_doc_id_mapping with {len(sanitized_to_original_doc_id_mapping)} entries "
f"in {time.monotonic() - sanitized_doc_start_time:.3f} seconds."
)
approx_chunk_count_in_vespa: int | None = None
get_chunk_count_start_time = time.monotonic()
try:
approx_chunk_count_in_vespa = vespa_document_index.get_chunk_count()
except Exception:
# This failure should not be blocking.
task_logger.exception(
"Error getting approximate chunk count in Vespa. Moving on..."
)
@@ -240,12 +195,25 @@ def migrate_chunks_from_vespa_to_opensearch_task(
f"approximate chunk count in Vespa. Got {approx_chunk_count_in_vespa}."
)
# 3. Do the actual migration in batches until we run out of time.
while (
time.monotonic() - task_start_time < MIGRATION_TASK_SOFT_TIME_LIMIT_S
and lock.owned()
):
# 3.a. Get the next batch of raw chunks from Vespa.
(
continuation_token_map,
total_chunks_migrated,
) = get_vespa_visit_state(db_session)
if is_continuation_token_done_for_all_slices(continuation_token_map):
task_logger.info(
f"OpenSearch migration COMPLETED for tenant {tenant_id}. Total chunks migrated: {total_chunks_migrated}."
)
mark_migration_completed_time_if_not_set_with_commit(db_session)
break
task_logger.debug(
f"Read the tenant migration record. Total chunks migrated: {total_chunks_migrated}. "
f"Continuation token map: {continuation_token_map}"
)
get_vespa_chunks_start_time = time.monotonic()
raw_vespa_chunks, next_continuation_token_map = (
vespa_document_index.get_all_raw_document_chunks_paginated(
@@ -258,7 +226,6 @@ def migrate_chunks_from_vespa_to_opensearch_task(
f"seconds. Next continuation token map: {next_continuation_token_map}"
)
# 3.b. Transform the raw chunks to OpenSearch chunks in memory.
opensearch_document_chunks, errored_chunks = (
transform_vespa_chunks_to_opensearch_chunks(
raw_vespa_chunks,
@@ -273,7 +240,6 @@ def migrate_chunks_from_vespa_to_opensearch_task(
"errored."
)
# 3.c. Index the OpenSearch chunks into OpenSearch.
index_opensearch_chunks_start_time = time.monotonic()
opensearch_document_index.index_raw_chunks(
chunks=opensearch_document_chunks
@@ -285,38 +251,12 @@ def migrate_chunks_from_vespa_to_opensearch_task(
total_chunks_migrated_this_task += len(opensearch_document_chunks)
total_chunks_errored_this_task += len(errored_chunks)
# Do as much as we can with a DB session in one spot to not hold a
# session during a migration batch.
with get_session_with_current_tenant() as db_session:
# 3.d. Update the migration state.
update_vespa_visit_progress_with_commit(
db_session,
continuation_token_map=next_continuation_token_map,
chunks_processed=len(opensearch_document_chunks),
chunks_errored=len(errored_chunks),
approx_chunk_count_in_vespa=approx_chunk_count_in_vespa,
)
# 3.e. Get the current migration state. Even though we
# technically have it in-memory since we just wrote it, we
# want to reference the DB as the source of truth at all
# times.
continuation_token_map, total_chunks_migrated = (
get_vespa_visit_state(db_session)
)
# 3.e.1. Check if the migration is done.
if is_continuation_token_done_for_all_slices(
continuation_token_map
):
task_logger.info(
f"OpenSearch migration COMPLETED for tenant {tenant_id}. Total chunks migrated: {total_chunks_migrated}."
)
mark_migration_completed_time_if_not_set_with_commit(db_session)
return True
task_logger.debug(
f"Read the tenant migration record. Total chunks migrated: {total_chunks_migrated}. "
f"Continuation token map: {continuation_token_map}"
update_vespa_visit_progress_with_commit(
db_session,
continuation_token_map=next_continuation_token_map,
chunks_processed=len(opensearch_document_chunks),
chunks_errored=len(errored_chunks),
approx_chunk_count_in_vespa=approx_chunk_count_in_vespa,
)
except Exception:
traceback.print_exc()


@@ -199,7 +199,7 @@ def delete_messages_and_files_from_chat_session(
for _, files in messages_with_files:
file_store = get_default_file_store()
for file_info in files or []:
file_store.delete_file(file_id=file_info.get("id"), error_on_missing=False)
file_store.delete_file(file_id=file_info.get("id"))
# Delete ChatMessage records - CASCADE constraints will automatically handle:
# - ChatMessage__StandardAnswer relationship records


@@ -324,15 +324,6 @@ def mark_migration_completed_time_if_not_set_with_commit(
db_session.commit()
def is_migration_completed(db_session: Session) -> bool:
"""Returns True if the migration is completed.
Can be run even if the migration record does not exist.
"""
record = db_session.query(OpenSearchTenantMigrationRecord).first()
return record is not None and record.migration_completed_at is not None
def build_sanitized_to_original_doc_id_mapping(
db_session: Session,
) -> dict[str, str]:


@@ -1,4 +1,3 @@
import hashlib
from datetime import datetime
from datetime import timezone
from typing import Any
@@ -21,13 +20,9 @@ from onyx.document_index.opensearch.constants import DEFAULT_MAX_CHUNK_SIZE
from onyx.document_index.opensearch.constants import EF_CONSTRUCTION
from onyx.document_index.opensearch.constants import EF_SEARCH
from onyx.document_index.opensearch.constants import M
from onyx.document_index.opensearch.string_filtering import DocumentIDTooLongError
from onyx.document_index.opensearch.string_filtering import (
filter_and_validate_document_id,
)
from onyx.document_index.opensearch.string_filtering import (
MAX_DOCUMENT_ID_ENCODED_LENGTH,
)
from onyx.utils.tenant import get_tenant_id_short_string
from shared_configs.configs import MULTI_TENANT
from shared_configs.contextvars import get_current_tenant_id
@@ -80,50 +75,17 @@ def get_opensearch_doc_chunk_id(
This will be the string used to identify the chunk in OpenSearch. Any direct
chunk queries should use this function.
If the document ID is too long, a hash of the ID is used instead.
"""
opensearch_doc_chunk_id_suffix: str = f"__{max_chunk_size}__{chunk_index}"
encoded_suffix_length: int = len(opensearch_doc_chunk_id_suffix.encode("utf-8"))
max_encoded_permissible_doc_id_length: int = (
MAX_DOCUMENT_ID_ENCODED_LENGTH - encoded_suffix_length
sanitized_document_id = filter_and_validate_document_id(document_id)
opensearch_doc_chunk_id = (
f"{sanitized_document_id}__{max_chunk_size}__{chunk_index}"
)
opensearch_doc_chunk_id_tenant_prefix: str = ""
if tenant_state.multitenant:
short_tenant_id: str = get_tenant_id_short_string(tenant_state.tenant_id)
# Use tenant ID because in multitenant mode each tenant has its own
# Documents table, so there is a very small chance that doc IDs are not
# actually unique across all tenants.
opensearch_doc_chunk_id_tenant_prefix = f"{short_tenant_id}__"
encoded_prefix_length: int = len(
opensearch_doc_chunk_id_tenant_prefix.encode("utf-8")
)
max_encoded_permissible_doc_id_length -= encoded_prefix_length
try:
sanitized_document_id: str = filter_and_validate_document_id(
document_id, max_encoded_length=max_encoded_permissible_doc_id_length
)
except DocumentIDTooLongError:
# If the document ID is too long, use a hash instead.
# We use blake2b because it is faster and equally secure as SHA256, and
# accepts digest_size which controls the number of bytes returned in the
# hash.
# digest_size is the size of the returned hash in bytes. Since we're
# decoding the hash bytes as a hex string, the digest_size should be
# half the max target size of the hash string.
# Subtract 1 because filter_and_validate_document_id compares with >=
# against max_encoded_length.
# 64 is the max digest_size blake2b returns.
digest_size: int = min((max_encoded_permissible_doc_id_length - 1) // 2, 64)
sanitized_document_id = hashlib.blake2b(
document_id.encode("utf-8"), digest_size=digest_size
).hexdigest()
opensearch_doc_chunk_id: str = (
f"{opensearch_doc_chunk_id_tenant_prefix}{sanitized_document_id}{opensearch_doc_chunk_id_suffix}"
)
short_tenant_id = get_tenant_id_short_string(tenant_state.tenant_id)
opensearch_doc_chunk_id = f"{short_tenant_id}__{opensearch_doc_chunk_id}"
# Do one more validation to ensure we haven't exceeded the max length.
opensearch_doc_chunk_id = filter_and_validate_document_id(opensearch_doc_chunk_id)
return opensearch_doc_chunk_id


@@ -1,15 +1,7 @@
import re
MAX_DOCUMENT_ID_ENCODED_LENGTH: int = 512
class DocumentIDTooLongError(ValueError):
"""Raised when a document ID is too long for OpenSearch after filtering."""
def filter_and_validate_document_id(
document_id: str, max_encoded_length: int = MAX_DOCUMENT_ID_ENCODED_LENGTH
) -> str:
def filter_and_validate_document_id(document_id: str) -> str:
"""
Filters and validates a document ID such that it can be used as an ID in
OpenSearch.
@@ -27,13 +19,9 @@ def filter_and_validate_document_id(
Args:
document_id: The document ID to filter and validate.
max_encoded_length: The maximum length of the document ID after
filtering in bytes. Compared with >= for extra resilience, so
encoded values of this length will fail.
Raises:
DocumentIDTooLongError: If the document ID is too long after filtering.
ValueError: If the document ID is empty after filtering.
ValueError: If the document ID is empty or too long after filtering.
Returns:
str: The filtered document ID.
@@ -41,8 +29,6 @@ def filter_and_validate_document_id(
filtered_document_id = re.sub(r"[^A-Za-z0-9_.\-~]", "", document_id)
if not filtered_document_id:
raise ValueError(f"Document ID {document_id} is empty after filtering.")
if len(filtered_document_id.encode("utf-8")) >= max_encoded_length:
raise DocumentIDTooLongError(
f"Document ID {document_id} is too long after filtering."
)
if len(filtered_document_id.encode("utf-8")) >= 512:
raise ValueError(f"Document ID {document_id} is too long after filtering.")
return filtered_document_id


@@ -136,14 +136,12 @@ class FileStore(ABC):
"""
@abstractmethod
def delete_file(self, file_id: str, error_on_missing: bool = True) -> None:
def delete_file(self, file_id: str) -> None:
"""
Delete a file by its ID.
Parameters:
- file_id: ID of file to delete
- error_on_missing: If False, silently return when the file record
does not exist instead of raising.
- file_name: Name of file to delete
"""
@abstractmethod
@@ -454,23 +452,12 @@ class S3BackedFileStore(FileStore):
logger.warning(f"Error getting file size for {file_id}: {e}")
return None
def delete_file(
self,
file_id: str,
error_on_missing: bool = True,
db_session: Session | None = None,
) -> None:
def delete_file(self, file_id: str, db_session: Session | None = None) -> None:
with get_session_with_current_tenant_if_none(db_session) as db_session:
try:
file_record = get_filerecord_by_file_id_optional(
file_record = get_filerecord_by_file_id(
file_id=file_id, db_session=db_session
)
if file_record is None:
if error_on_missing:
raise RuntimeError(
f"File by id {file_id} does not exist or was deleted"
)
return
if not file_record.bucket_name:
logger.error(
f"File record {file_id} with key {file_record.object_key} "


@@ -222,23 +222,12 @@ class PostgresBackedFileStore(FileStore):
logger.warning(f"Error getting file size for {file_id}: {e}")
return None
def delete_file(
self,
file_id: str,
error_on_missing: bool = True,
db_session: Session | None = None,
) -> None:
def delete_file(self, file_id: str, db_session: Session | None = None) -> None:
with get_session_with_current_tenant_if_none(db_session) as session:
try:
file_content = get_file_content_by_file_id_optional(
file_content = get_file_content_by_file_id(
file_id=file_id, db_session=session
)
if file_content is None:
if error_on_missing:
raise RuntimeError(
f"File content for file_id {file_id} does not exist or was deleted"
)
return
raw_conn = _get_raw_connection(session)
try:


@@ -3,8 +3,6 @@
from datetime import datetime
from typing import Any
import httpx
from onyx.configs.constants import DocumentSource
from onyx.mcp_server.api import mcp_server
from onyx.mcp_server.utils import get_http_client
@@ -17,21 +15,6 @@ from onyx.utils.variable_functionality import global_version
logger = setup_logger()
def _extract_error_detail(response: httpx.Response) -> str:
"""Extract a human-readable error message from a failed backend response.
The backend returns OnyxError responses as
``{"error_code": "...", "detail": "..."}``.
"""
try:
body = response.json()
if detail := body.get("detail"):
return str(detail)
except Exception:
pass
return f"Request failed with status {response.status_code}"
@mcp_server.tool()
async def search_indexed_documents(
query: str,
@@ -175,14 +158,7 @@ async def search_indexed_documents(
json=search_request,
headers=auth_headers,
)
if not response.is_success:
error_detail = _extract_error_detail(response)
return {
"documents": [],
"total_results": 0,
"query": query,
"error": error_detail,
}
response.raise_for_status()
result = response.json()
# Check for error in response
@@ -258,13 +234,7 @@ async def search_web(
json=request_payload,
headers={"Authorization": f"Bearer {access_token.token}"},
)
if not response.is_success:
error_detail = _extract_error_detail(response)
return {
"error": error_detail,
"results": [],
"query": query,
}
response.raise_for_status()
response_payload = response.json()
results = response_payload.get("results", [])
return {
@@ -310,12 +280,7 @@ async def open_urls(
json={"urls": urls},
headers={"Authorization": f"Bearer {access_token.token}"},
)
if not response.is_success:
error_detail = _extract_error_detail(response)
return {
"error": error_detail,
"results": [],
}
response.raise_for_status()
response_payload = response.json()
results = response_payload.get("results", [])
return {


@@ -1,5 +1,6 @@
from fastapi import APIRouter
from fastapi import Depends
from fastapi import HTTPException
from sqlalchemy.orm import Session
from onyx.auth.users import current_user
@@ -8,8 +9,6 @@ from onyx.db.engine.sql_engine import get_session
from onyx.db.models import User
from onyx.db.web_search import fetch_active_web_content_provider
from onyx.db.web_search import fetch_active_web_search_provider
from onyx.error_handling.error_codes import OnyxErrorCode
from onyx.error_handling.exceptions import OnyxError
from onyx.server.features.web_search.models import OpenUrlsToolRequest
from onyx.server.features.web_search.models import OpenUrlsToolResponse
from onyx.server.features.web_search.models import WebSearchToolRequest
@@ -62,10 +61,9 @@ def _get_active_search_provider(
) -> tuple[WebSearchProviderView, WebSearchProvider]:
provider_model = fetch_active_web_search_provider(db_session)
if provider_model is None:
raise OnyxError(
OnyxErrorCode.INVALID_INPUT,
"No web search provider configured. Please configure one in "
"Admin > Web Search settings.",
raise HTTPException(
status_code=400,
detail="No web search provider configured.",
)
provider_view = WebSearchProviderView(
@@ -78,10 +76,9 @@ def _get_active_search_provider(
)
if provider_model.api_key is None:
raise OnyxError(
OnyxErrorCode.INVALID_INPUT,
"Web search provider requires an API key. Please configure one in "
"Admin > Web Search settings.",
raise HTTPException(
status_code=400,
detail="Web search provider requires an API key.",
)
try:
@@ -91,7 +88,7 @@ def _get_active_search_provider(
config=provider_model.config or {},
)
except ValueError as exc:
raise OnyxError(OnyxErrorCode.INVALID_INPUT, str(exc)) from exc
raise HTTPException(status_code=400, detail=str(exc)) from exc
return provider_view, provider
@@ -113,9 +110,9 @@ def _get_active_content_provider(
if provider_model.api_key is None:
# TODO - this is not a great error, in fact, this key should not be nullable.
raise OnyxError(
OnyxErrorCode.INVALID_INPUT,
"Web content provider requires an API key.",
raise HTTPException(
status_code=400,
detail="Web content provider requires an API key.",
)
try:
@@ -128,12 +125,12 @@ def _get_active_content_provider(
config=config,
)
except ValueError as exc:
raise OnyxError(OnyxErrorCode.INVALID_INPUT, str(exc)) from exc
raise HTTPException(status_code=400, detail=str(exc)) from exc
if provider is None:
raise OnyxError(
OnyxErrorCode.INVALID_INPUT,
"Unable to initialize the configured web content provider.",
raise HTTPException(
status_code=400,
detail="Unable to initialize the configured web content provider.",
)
provider_view = WebContentProviderView(
@@ -157,13 +154,12 @@ def _run_web_search(
for query in request.queries:
try:
search_results = provider.search(query)
except OnyxError:
except HTTPException:
raise
except Exception as exc:
logger.exception("Web search provider failed for query '%s'", query)
raise OnyxError(
OnyxErrorCode.BAD_GATEWAY,
"Web search provider failed to execute query.",
raise HTTPException(
status_code=502, detail="Web search provider failed to execute query."
) from exc
filtered_results = filter_web_search_results_with_no_title_or_snippet(
@@ -196,13 +192,12 @@ def _open_urls(
docs = filter_web_contents_with_no_title_or_content(
list(provider.contents(urls))
)
except OnyxError:
except HTTPException:
raise
except Exception as exc:
logger.exception("Web content provider failed to fetch URLs")
raise OnyxError(
OnyxErrorCode.BAD_GATEWAY,
"Web content provider failed to fetch URLs.",
raise HTTPException(
status_code=502, detail="Web content provider failed to fetch URLs."
) from exc
results: list[LlmOpenUrlResult] = []


@@ -1,203 +0,0 @@
import pytest
from onyx.document_index.interfaces_new import TenantState
from onyx.document_index.opensearch.constants import DEFAULT_MAX_CHUNK_SIZE
from onyx.document_index.opensearch.schema import get_opensearch_doc_chunk_id
from onyx.document_index.opensearch.string_filtering import (
MAX_DOCUMENT_ID_ENCODED_LENGTH,
)
from shared_configs.configs import POSTGRES_DEFAULT_SCHEMA_STANDARD_VALUE
SINGLE_TENANT_STATE = TenantState(
tenant_id=POSTGRES_DEFAULT_SCHEMA_STANDARD_VALUE, multitenant=False
)
MULTI_TENANT_STATE = TenantState(
tenant_id="tenant_abcdef12-3456-7890-abcd-ef1234567890", multitenant=True
)
EXPECTED_SHORT_TENANT = "abcdef12"
class TestGetOpensearchDocChunkIdSingleTenant:
def test_basic(self) -> None:
result = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, "my-doc-id", chunk_index=0
)
assert result == f"my-doc-id__{DEFAULT_MAX_CHUNK_SIZE}__0"
def test_custom_chunk_size(self) -> None:
result = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, "doc1", chunk_index=3, max_chunk_size=1024
)
assert result == "doc1__1024__3"
def test_special_chars_are_stripped(self) -> None:
"""Tests characters not matching [A-Za-z0-9_.-~] are removed."""
result = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, "doc/with?special#chars&more%stuff", chunk_index=0
)
assert "/" not in result
assert "?" not in result
assert "#" not in result
assert result == f"docwithspecialcharsmorestuff__{DEFAULT_MAX_CHUNK_SIZE}__0"
def test_short_doc_id_not_hashed(self) -> None:
"""
Tests that a short doc ID should appear directly in the result, not as a
hash.
"""
doc_id = "short-id"
result = get_opensearch_doc_chunk_id(SINGLE_TENANT_STATE, doc_id, chunk_index=0)
assert "short-id" in result
def test_long_doc_id_is_hashed(self) -> None:
"""
Tests that a doc ID exceeding the max length should be replaced with a
blake2b hash.
"""
# Create a doc ID that will exceed max length after the suffix is
# appended.
doc_id = "a" * MAX_DOCUMENT_ID_ENCODED_LENGTH
result = get_opensearch_doc_chunk_id(SINGLE_TENANT_STATE, doc_id, chunk_index=0)
# The original doc ID should NOT appear in the result.
assert doc_id not in result
# The suffix should still be present.
assert f"__{DEFAULT_MAX_CHUNK_SIZE}__0" in result
def test_long_doc_id_hash_is_deterministic(self) -> None:
doc_id = "x" * MAX_DOCUMENT_ID_ENCODED_LENGTH
result1 = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, doc_id, chunk_index=5
)
result2 = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, doc_id, chunk_index=5
)
assert result1 == result2
def test_long_doc_id_different_inputs_produce_different_hashes(self) -> None:
doc_id_a = "a" * MAX_DOCUMENT_ID_ENCODED_LENGTH
doc_id_b = "b" * MAX_DOCUMENT_ID_ENCODED_LENGTH
result_a = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, doc_id_a, chunk_index=0
)
result_b = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, doc_id_b, chunk_index=0
)
assert result_a != result_b
def test_result_never_exceeds_max_length(self) -> None:
"""
Tests that the final result should always be under
MAX_DOCUMENT_ID_ENCODED_LENGTH bytes.
"""
doc_id = "z" * (MAX_DOCUMENT_ID_ENCODED_LENGTH * 2)
result = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, doc_id, chunk_index=999, max_chunk_size=99999
)
assert len(result.encode("utf-8")) < MAX_DOCUMENT_ID_ENCODED_LENGTH
def test_no_tenant_prefix_in_single_tenant(self) -> None:
result = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, "mydoc", chunk_index=0
)
assert not result.startswith(SINGLE_TENANT_STATE.tenant_id)
class TestGetOpensearchDocChunkIdMultiTenant:
def test_includes_tenant_prefix(self) -> None:
result = get_opensearch_doc_chunk_id(MULTI_TENANT_STATE, "mydoc", chunk_index=0)
assert result.startswith(f"{EXPECTED_SHORT_TENANT}__")
def test_format(self) -> None:
result = get_opensearch_doc_chunk_id(
MULTI_TENANT_STATE, "mydoc", chunk_index=2, max_chunk_size=256
)
assert result == f"{EXPECTED_SHORT_TENANT}__mydoc__256__2"
def test_long_doc_id_is_hashed_multitenant(self) -> None:
doc_id = "d" * MAX_DOCUMENT_ID_ENCODED_LENGTH
result = get_opensearch_doc_chunk_id(MULTI_TENANT_STATE, doc_id, chunk_index=0)
# Should still have tenant prefix.
assert result.startswith(f"{EXPECTED_SHORT_TENANT}__")
# The original doc ID should NOT appear in the result.
assert doc_id not in result
# The suffix should still be present.
assert f"__{DEFAULT_MAX_CHUNK_SIZE}__0" in result
def test_result_never_exceeds_max_length_multitenant(self) -> None:
doc_id = "q" * (MAX_DOCUMENT_ID_ENCODED_LENGTH * 2)
result = get_opensearch_doc_chunk_id(
MULTI_TENANT_STATE, doc_id, chunk_index=999, max_chunk_size=99999
)
assert len(result.encode("utf-8")) < MAX_DOCUMENT_ID_ENCODED_LENGTH
def test_different_tenants_produce_different_ids(self) -> None:
tenant_a = TenantState(
tenant_id="tenant_aaaaaaaa-0000-0000-0000-000000000000", multitenant=True
)
tenant_b = TenantState(
tenant_id="tenant_bbbbbbbb-0000-0000-0000-000000000000", multitenant=True
)
result_a = get_opensearch_doc_chunk_id(tenant_a, "same-doc", chunk_index=0)
result_b = get_opensearch_doc_chunk_id(tenant_b, "same-doc", chunk_index=0)
assert result_a != result_b
class TestGetOpensearchDocChunkIdEdgeCases:
def test_chunk_index_zero(self) -> None:
result = get_opensearch_doc_chunk_id(SINGLE_TENANT_STATE, "doc", chunk_index=0)
assert result.endswith("__0")
def test_large_chunk_index(self) -> None:
result = get_opensearch_doc_chunk_id(
SINGLE_TENANT_STATE, "doc", chunk_index=99999
)
assert result.endswith("__99999")
def test_doc_id_with_only_special_chars_raises(self) -> None:
"""
Tests that a doc ID that becomes empty after filtering should raise
ValueError.
"""
with pytest.raises(ValueError, match="empty after filtering"):
get_opensearch_doc_chunk_id(SINGLE_TENANT_STATE, "###???///", chunk_index=0)
def test_doc_id_at_boundary_length(self) -> None:
"""
Tests that a doc ID right at the boundary should not be hashed.
"""
suffix = f"__{DEFAULT_MAX_CHUNK_SIZE}__0"
suffix_len = len(suffix.encode("utf-8"))
# Max doc ID length that won't trigger hashing (must be <
# max_encoded_length).
max_doc_len = MAX_DOCUMENT_ID_ENCODED_LENGTH - suffix_len - 1
doc_id = "a" * max_doc_len
result = get_opensearch_doc_chunk_id(SINGLE_TENANT_STATE, doc_id, chunk_index=0)
assert doc_id in result
def test_doc_id_at_boundary_length_multitenant(self) -> None:
"""
Tests that a doc ID right at the boundary should not be hashed in
multitenant mode.
"""
suffix = f"__{DEFAULT_MAX_CHUNK_SIZE}__0"
suffix_len = len(suffix.encode("utf-8"))
prefix = f"{EXPECTED_SHORT_TENANT}__"
prefix_len = len(prefix.encode("utf-8"))
# Max doc ID length that won't trigger hashing (must be <
# max_encoded_length).
max_doc_len = MAX_DOCUMENT_ID_ENCODED_LENGTH - suffix_len - prefix_len - 1
doc_id = "a" * max_doc_len
result = get_opensearch_doc_chunk_id(MULTI_TENANT_STATE, doc_id, chunk_index=0)
assert doc_id in result
def test_doc_id_one_over_boundary_is_hashed(self) -> None:
"""
Tests that a doc ID one byte over the boundary should be hashed.
"""
suffix = f"__{DEFAULT_MAX_CHUNK_SIZE}__0"
suffix_len = len(suffix.encode("utf-8"))
# This length will trigger the >= check in filter_and_validate_document_id
doc_id = "a" * (MAX_DOCUMENT_ID_ENCODED_LENGTH - suffix_len)
result = get_opensearch_doc_chunk_id(SINGLE_TENANT_STATE, doc_id, chunk_index=0)
assert doc_id not in result
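The deleted tests above pin down the chunk-ID format: `{short_tenant}__{doc_id}__{max_chunk_size}__{chunk_index}`, with disallowed characters stripped and over-long document IDs replaced by a deterministic blake2b digest. A sketch of that contract, inferred only from the tests; the constant values and the function name `sketch_chunk_id` are assumptions (the real definitions live in `onyx.document_index.opensearch`):

```python
import hashlib
import re

# Placeholder constants -- the real values are defined in
# onyx.document_index.opensearch.constants / string_filtering.
MAX_DOCUMENT_ID_ENCODED_LENGTH = 512
DEFAULT_MAX_CHUNK_SIZE = 2048


def sketch_chunk_id(
    tenant_id: str,
    multitenant: bool,
    doc_id: str,
    chunk_index: int,
    max_chunk_size: int = DEFAULT_MAX_CHUNK_SIZE,
) -> str:
    """Hypothetical reconstruction of get_opensearch_doc_chunk_id,
    inferred from the deleted tests above."""
    # Drop characters outside the URL-safe set the tests describe.
    filtered = re.sub(r"[^A-Za-z0-9_.~-]", "", doc_id)
    if not filtered:
        raise ValueError("document ID is empty after filtering")

    prefix = ""
    if multitenant:
        # "tenant_abcdef12-3456-..." -> "abcdef12__"
        prefix = tenant_id.removeprefix("tenant_").split("-")[0] + "__"

    suffix = f"__{max_chunk_size}__{chunk_index}"
    budget = (
        MAX_DOCUMENT_ID_ENCODED_LENGTH
        - len(prefix.encode("utf-8"))
        - len(suffix.encode("utf-8"))
    )
    if len(filtered.encode("utf-8")) >= budget:
        # Over-long IDs are replaced by a deterministic digest, so the
        # final ID always fits under the encoded-length cap.
        filtered = hashlib.blake2b(
            filtered.encode("utf-8"), digest_size=16
        ).hexdigest()
    return prefix + filtered + suffix
```

This reproduces the boundary behavior the tests check: an ID exactly at `max_encoded_length - suffix_len` is hashed, one byte shorter is kept verbatim.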

View File

@@ -1,91 +0,0 @@
"""Tests for FileStore.delete_file error_on_missing behavior."""
from unittest.mock import MagicMock
from unittest.mock import patch
import pytest
_S3_MODULE = "onyx.file_store.file_store"
_PG_MODULE = "onyx.file_store.postgres_file_store"
def _mock_db_session() -> MagicMock:
session = MagicMock()
session.__enter__ = MagicMock(return_value=session)
session.__exit__ = MagicMock(return_value=False)
return session
# ── S3BackedFileStore ────────────────────────────────────────────────
@patch(f"{_S3_MODULE}.get_session_with_current_tenant_if_none")
@patch(f"{_S3_MODULE}.get_filerecord_by_file_id_optional", return_value=None)
def test_s3_delete_missing_file_raises_by_default(
_mock_get_record: MagicMock,
mock_ctx: MagicMock,
) -> None:
from onyx.file_store.file_store import S3BackedFileStore
mock_ctx.return_value = _mock_db_session()
store = S3BackedFileStore(bucket_name="b")
with pytest.raises(RuntimeError, match="does not exist"):
store.delete_file("nonexistent")
@patch(f"{_S3_MODULE}.get_session_with_current_tenant_if_none")
@patch(f"{_S3_MODULE}.get_filerecord_by_file_id_optional", return_value=None)
@patch(f"{_S3_MODULE}.delete_filerecord_by_file_id")
def test_s3_delete_missing_file_silent_when_error_on_missing_false(
mock_delete_record: MagicMock,
_mock_get_record: MagicMock,
mock_ctx: MagicMock,
) -> None:
from onyx.file_store.file_store import S3BackedFileStore
mock_ctx.return_value = _mock_db_session()
store = S3BackedFileStore(bucket_name="b")
store.delete_file("nonexistent", error_on_missing=False)
mock_delete_record.assert_not_called()
# ── PostgresBackedFileStore ──────────────────────────────────────────
@patch(f"{_PG_MODULE}.get_session_with_current_tenant_if_none")
@patch(f"{_PG_MODULE}.get_file_content_by_file_id_optional", return_value=None)
def test_pg_delete_missing_file_raises_by_default(
_mock_get_content: MagicMock,
mock_ctx: MagicMock,
) -> None:
from onyx.file_store.postgres_file_store import PostgresBackedFileStore
mock_ctx.return_value = _mock_db_session()
store = PostgresBackedFileStore()
with pytest.raises(RuntimeError, match="does not exist"):
store.delete_file("nonexistent")
@patch(f"{_PG_MODULE}.get_session_with_current_tenant_if_none")
@patch(f"{_PG_MODULE}.get_file_content_by_file_id_optional", return_value=None)
@patch(f"{_PG_MODULE}.delete_file_content_by_file_id")
@patch(f"{_PG_MODULE}.delete_filerecord_by_file_id")
def test_pg_delete_missing_file_silent_when_error_on_missing_false(
mock_delete_record: MagicMock,
mock_delete_content: MagicMock,
_mock_get_content: MagicMock,
mock_ctx: MagicMock,
) -> None:
from onyx.file_store.postgres_file_store import PostgresBackedFileStore
mock_ctx.return_value = _mock_db_session()
store = PostgresBackedFileStore()
store.delete_file("nonexistent", error_on_missing=False)
mock_delete_record.assert_not_called()
mock_delete_content.assert_not_called()
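The contract those deleted tests exercised is small: `delete_file` raises `RuntimeError` for a missing file by default, and becomes a silent no-op when `error_on_missing=False`. A toy in-memory sketch of that behavior (the class name is invented for illustration; the real stores are S3- and Postgres-backed):

```python
class InMemoryFileStore:
    """Illustrative stand-in for the delete_file(error_on_missing=...)
    contract; not the real S3/Postgres implementation."""

    def __init__(self) -> None:
        self._files: dict[str, bytes] = {}

    def delete_file(self, file_id: str, *, error_on_missing: bool = True) -> None:
        if file_id not in self._files:
            if error_on_missing:
                # Default: a missing file is a caller error.
                raise RuntimeError(f"file {file_id!r} does not exist")
            # error_on_missing=False: deleting a missing file is a no-op.
            return
        del self._files[file_id]
```

The opt-out flag makes cleanup paths idempotent: callers can delete "whatever might exist" without first checking for the record.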

View File

@@ -98,7 +98,6 @@ Useful hardening flags:
| `serve` | Serve the interactive chat TUI over SSH |
| `configure` | Configure server URL and API key |
| `validate-config` | Validate configuration and test connection |
| `install-skill` | Install the agent skill file into a project |
## Slash Commands (in TUI)

View File

@@ -7,7 +7,6 @@ import (
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/exitcodes"
"github.com/spf13/cobra"
)
@@ -17,23 +16,16 @@ func newAgentsCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "agents",
Short: "List available agents",
Long: `List all visible agents configured on the Onyx server.
By default, output is a human-readable table with ID, name, and description.
Use --json for machine-readable output.`,
Example: ` onyx-cli agents
onyx-cli agents --json
onyx-cli agents --json | jq '.[].name'`,
RunE: func(cmd *cobra.Command, args []string) error {
cfg := config.Load()
if !cfg.IsConfigured() {
return exitcodes.New(exitcodes.NotConfigured, "onyx CLI is not configured\n Run: onyx-cli configure")
return fmt.Errorf("onyx CLI is not configured — run 'onyx-cli configure' first")
}
client := api.NewClient(cfg)
agents, err := client.ListAgents(cmd.Context())
if err != nil {
return fmt.Errorf("failed to list agents: %w\n Check your connection with: onyx-cli validate-config", err)
return fmt.Errorf("failed to list agents: %w", err)
}
if agentsJSON {

View File

@@ -4,65 +4,33 @@ import (
"context"
"encoding/json"
"fmt"
"io"
"os"
"os/signal"
"strings"
"syscall"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/exitcodes"
"github.com/onyx-dot-app/onyx/cli/internal/models"
"github.com/onyx-dot-app/onyx/cli/internal/overflow"
"github.com/spf13/cobra"
"golang.org/x/term"
)
const defaultMaxOutputBytes = 4096
func newAskCmd() *cobra.Command {
var (
askAgentID int
askJSON bool
askQuiet bool
askPrompt string
maxOutput int
)
cmd := &cobra.Command{
Use: "ask [question]",
Short: "Ask a one-shot question (non-interactive)",
Long: `Send a one-shot question to an Onyx agent and print the response.
The question can be provided as a positional argument, via --prompt, or piped
through stdin. When stdin contains piped data, it is sent as context along
with the question from --prompt (or used as the question itself).
When stdout is not a TTY (e.g., called by a script or AI agent), output is
automatically truncated to --max-output bytes and the full response is saved
to a temp file. Set --max-output 0 to disable truncation.`,
Args: cobra.MaximumNArgs(1),
Example: ` onyx-cli ask "What connectors are available?"
onyx-cli ask --agent-id 3 "Summarize our Q4 revenue"
onyx-cli ask --json "List all users" | jq '.event.content'
cat error.log | onyx-cli ask --prompt "Find the root cause"
echo "what is onyx?" | onyx-cli ask`,
Args: cobra.ExactArgs(1),
RunE: func(cmd *cobra.Command, args []string) error {
cfg := config.Load()
if !cfg.IsConfigured() {
return exitcodes.New(exitcodes.NotConfigured, "onyx CLI is not configured\n Run: onyx-cli configure")
}
if askJSON && askQuiet {
return exitcodes.New(exitcodes.BadRequest, "--json and --quiet cannot be used together")
}
question, err := resolveQuestion(args, askPrompt)
if err != nil {
return err
return fmt.Errorf("onyx CLI is not configured — run 'onyx-cli configure' first")
}
question := args[0]
agentID := cfg.DefaultAgentID
if cmd.Flags().Changed("agent-id") {
agentID = askAgentID
@@ -82,23 +50,9 @@ to a temp file. Set --max-output 0 to disable truncation.`,
nil,
)
// Determine truncation threshold.
isTTY := term.IsTerminal(int(os.Stdout.Fd()))
truncateAt := 0 // 0 means no truncation
if cmd.Flags().Changed("max-output") {
truncateAt = maxOutput
} else if !isTTY {
truncateAt = defaultMaxOutputBytes
}
var sessionID string
var lastErr error
gotStop := false
// Overflow writer: tees to stdout and optionally to a temp file.
// In quiet mode, buffer everything and print once at the end.
ow := &overflow.Writer{Limit: truncateAt, Quiet: askQuiet}
for event := range ch {
if e, ok := event.(models.SessionCreatedEvent); ok {
sessionID = e.ChatSessionID
@@ -128,50 +82,22 @@ to a temp file. Set --max-output 0 to disable truncation.`,
switch e := event.(type) {
case models.MessageDeltaEvent:
ow.Write(e.Content)
case models.SearchStartEvent:
if isTTY && !askQuiet {
if e.IsInternetSearch {
fmt.Fprintf(os.Stderr, "\033[2mSearching the web...\033[0m\n")
} else {
fmt.Fprintf(os.Stderr, "\033[2mSearching documents...\033[0m\n")
}
}
case models.SearchQueriesEvent:
if isTTY && !askQuiet {
for _, q := range e.Queries {
fmt.Fprintf(os.Stderr, "\033[2m → %s\033[0m\n", q)
}
}
case models.SearchDocumentsEvent:
if isTTY && !askQuiet && len(e.Documents) > 0 {
fmt.Fprintf(os.Stderr, "\033[2mFound %d documents\033[0m\n", len(e.Documents))
}
case models.ReasoningStartEvent:
if isTTY && !askQuiet {
fmt.Fprintf(os.Stderr, "\033[2mThinking...\033[0m\n")
}
case models.ToolStartEvent:
if isTTY && !askQuiet && e.ToolName != "" {
fmt.Fprintf(os.Stderr, "\033[2mUsing %s...\033[0m\n", e.ToolName)
}
fmt.Print(e.Content)
case models.ErrorEvent:
ow.Finish()
return fmt.Errorf("%s", e.Error)
case models.StopEvent:
ow.Finish()
fmt.Println()
return nil
}
}
if !askJSON {
ow.Finish()
}
if ctx.Err() != nil {
if sessionID != "" {
client.StopChatSession(context.Background(), sessionID)
}
if !askJSON {
fmt.Println()
}
return nil
}
@@ -179,56 +105,20 @@ to a temp file. Set --max-output 0 to disable truncation.`,
return lastErr
}
if !gotStop {
if !askJSON {
fmt.Println()
}
return fmt.Errorf("stream ended unexpectedly")
}
if !askJSON {
fmt.Println()
}
return nil
},
}
cmd.Flags().IntVar(&askAgentID, "agent-id", 0, "Agent ID to use")
cmd.Flags().BoolVar(&askJSON, "json", false, "Output raw JSON events")
cmd.Flags().BoolVarP(&askQuiet, "quiet", "q", false, "Buffer output and print once at end (no streaming)")
cmd.Flags().StringVar(&askPrompt, "prompt", "", "Question text (use with piped stdin context)")
cmd.Flags().IntVar(&maxOutput, "max-output", defaultMaxOutputBytes,
"Max bytes to print before truncating (0 to disable, auto-enabled for non-TTY)")
// Suppress cobra's default error/usage on RunE errors
return cmd
}
// resolveQuestion builds the final question string from args, --prompt, and stdin.
func resolveQuestion(args []string, prompt string) (string, error) {
hasArg := len(args) > 0
hasPrompt := prompt != ""
hasStdin := !term.IsTerminal(int(os.Stdin.Fd()))
if hasArg && hasPrompt {
return "", exitcodes.New(exitcodes.BadRequest, "specify the question as an argument or --prompt, not both")
}
var stdinContent string
if hasStdin {
const maxStdinBytes = 10 * 1024 * 1024 // 10MB
data, err := io.ReadAll(io.LimitReader(os.Stdin, maxStdinBytes))
if err != nil {
return "", fmt.Errorf("failed to read stdin: %w", err)
}
stdinContent = strings.TrimSpace(string(data))
}
switch {
case hasArg && stdinContent != "":
// arg is the question, stdin is context
return args[0] + "\n\n" + stdinContent, nil
case hasArg:
return args[0], nil
case hasPrompt && stdinContent != "":
// --prompt is the question, stdin is context
return prompt + "\n\n" + stdinContent, nil
case hasPrompt:
return prompt, nil
case stdinContent != "":
return stdinContent, nil
default:
return "", exitcodes.New(exitcodes.BadRequest, "no question provided\n Usage: onyx-cli ask \"your question\"\n Or: echo \"context\" | onyx-cli ask --prompt \"your question\"")
}
}

View File

@@ -4,7 +4,6 @@ import (
tea "github.com/charmbracelet/bubbletea"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/onboarding"
"github.com/onyx-dot-app/onyx/cli/internal/starprompt"
"github.com/onyx-dot-app/onyx/cli/internal/tui"
"github.com/spf13/cobra"
)
@@ -13,11 +12,6 @@ func newChatCmd() *cobra.Command {
return &cobra.Command{
Use: "chat",
Short: "Launch the interactive chat TUI (default)",
Long: `Launch the interactive terminal UI for chatting with your Onyx agent.
This is the default command when no subcommand is specified. On first run,
an interactive setup wizard will guide you through configuration.`,
Example: ` onyx-cli chat
onyx-cli`,
RunE: func(cmd *cobra.Command, args []string) error {
cfg := config.Load()
@@ -30,8 +24,6 @@ an interactive setup wizard will guide you through configuration.`,
cfg = *result
}
starprompt.MaybePrompt()
m := tui.NewModel(cfg)
p := tea.NewProgram(m, tea.WithAltScreen(), tea.WithMouseCellMotion())
_, err := p.Run()

View File

@@ -1,126 +1,19 @@
package cmd
import (
"context"
"errors"
"fmt"
"io"
"os"
"strings"
"time"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/exitcodes"
"github.com/onyx-dot-app/onyx/cli/internal/onboarding"
"github.com/spf13/cobra"
"golang.org/x/term"
)
func newConfigureCmd() *cobra.Command {
var (
serverURL string
apiKey string
apiKeyStdin bool
dryRun bool
)
cmd := &cobra.Command{
return &cobra.Command{
Use: "configure",
Short: "Configure server URL and API key",
Long: `Set up the Onyx CLI with your server URL and API key.
When --server-url and --api-key are both provided, the configuration is saved
non-interactively (useful for scripts and AI agents). Otherwise, an interactive
setup wizard is launched.
If --api-key is omitted but stdin has piped data, the API key is read from
stdin automatically. You can also use --api-key-stdin to make this explicit.
This avoids leaking the key in shell history.
Use --dry-run to test the connection without saving the configuration.`,
Example: ` onyx-cli configure
onyx-cli configure --server-url https://my-onyx.com --api-key sk-...
echo "$ONYX_API_KEY" | onyx-cli configure --server-url https://my-onyx.com
echo "$ONYX_API_KEY" | onyx-cli configure --server-url https://my-onyx.com --api-key-stdin
onyx-cli configure --server-url https://my-onyx.com --api-key sk-... --dry-run`,
RunE: func(cmd *cobra.Command, args []string) error {
// Read API key from stdin if piped (implicit) or --api-key-stdin (explicit)
if apiKeyStdin && apiKey != "" {
return exitcodes.New(exitcodes.BadRequest, "--api-key and --api-key-stdin cannot be used together")
}
if (apiKey == "" && !term.IsTerminal(int(os.Stdin.Fd()))) || apiKeyStdin {
data, err := io.ReadAll(os.Stdin)
if err != nil {
return fmt.Errorf("failed to read API key from stdin: %w", err)
}
apiKey = strings.TrimSpace(string(data))
}
if serverURL != "" && apiKey != "" {
return configureNonInteractive(serverURL, apiKey, dryRun)
}
if dryRun {
return exitcodes.New(exitcodes.BadRequest, "--dry-run requires --server-url and --api-key")
}
if serverURL != "" || apiKey != "" {
return exitcodes.New(exitcodes.BadRequest, "both --server-url and --api-key are required for non-interactive setup\n Run 'onyx-cli configure' without flags for interactive setup")
}
cfg := config.Load()
onboarding.Run(&cfg)
return nil
},
}
cmd.Flags().StringVar(&serverURL, "server-url", "", "Onyx server URL (e.g., https://cloud.onyx.app)")
cmd.Flags().StringVar(&apiKey, "api-key", "", "API key for authentication (or pipe via stdin)")
cmd.Flags().BoolVar(&apiKeyStdin, "api-key-stdin", false, "Read API key from stdin (explicit; also happens automatically when stdin is piped)")
cmd.Flags().BoolVar(&dryRun, "dry-run", false, "Test connection without saving config (requires --server-url and --api-key)")
return cmd
}
func configureNonInteractive(serverURL, apiKey string, dryRun bool) error {
cfg := config.OnyxCliConfig{
ServerURL: serverURL,
APIKey: apiKey,
DefaultAgentID: 0,
}
// Preserve existing default agent ID from disk (not env overrides)
if existing := config.LoadFromDisk(); existing.DefaultAgentID != 0 {
cfg.DefaultAgentID = existing.DefaultAgentID
}
// Test connection
client := api.NewClient(cfg)
ctx, cancel := context.WithTimeout(context.Background(), 15*time.Second)
defer cancel()
if err := client.TestConnection(ctx); err != nil {
var authErr *api.AuthError
if errors.As(err, &authErr) {
return exitcodes.Newf(exitcodes.AuthFailure, "authentication failed: %v\n Check your API key", err)
}
return exitcodes.Newf(exitcodes.Unreachable, "connection failed: %v\n Check your server URL", err)
}
if dryRun {
fmt.Printf("Server: %s\n", serverURL)
fmt.Println("Status: connected and authenticated")
fmt.Println("Dry run: config was NOT saved")
return nil
}
if err := config.Save(cfg); err != nil {
return fmt.Errorf("could not save config: %w", err)
}
fmt.Printf("Config: %s\n", config.ConfigFilePath())
fmt.Printf("Server: %s\n", serverURL)
fmt.Println("Status: connected and authenticated")
return nil
}

View File

@@ -1,176 +0,0 @@
package cmd
import (
"fmt"
"os"
"path/filepath"
"github.com/onyx-dot-app/onyx/cli/internal/embedded"
"github.com/onyx-dot-app/onyx/cli/internal/fsutil"
"github.com/spf13/cobra"
)
// agentSkillDirs maps agent names to their skill directory paths (relative to
// the project or home root). "Universal" agents like Cursor and Codex read
// from .agents/skills directly, so they don't need their own entry here.
var agentSkillDirs = map[string]string{
"claude-code": filepath.Join(".claude", "skills"),
}
const (
canonicalDir = ".agents/skills"
skillName = "onyx-cli"
)
func newInstallSkillCmd() *cobra.Command {
var (
global bool
copyMode bool
agents []string
)
cmd := &cobra.Command{
Use: "install-skill",
Short: "Install the Onyx CLI agent skill file",
Long: `Install the bundled SKILL.md so that AI coding agents can discover and use
the Onyx CLI as a tool.
Files are written to the canonical .agents/skills/onyx-cli/ directory. For
agents that use their own skill directory (e.g. Claude Code uses .claude/skills/),
a symlink is created pointing back to the canonical copy.
By default the skill is installed at the project level (current directory).
Use --global to install under your home directory instead.
Use --copy to write independent copies instead of symlinks.
Use --agent to target specific agents (can be repeated).`,
Example: ` onyx-cli install-skill
onyx-cli install-skill --global
onyx-cli install-skill --agent claude-code
onyx-cli install-skill --copy`,
RunE: func(cmd *cobra.Command, args []string) error {
base, err := installBase(global)
if err != nil {
return err
}
// Write the canonical copy.
canonicalSkillDir := filepath.Join(base, canonicalDir, skillName)
dest := filepath.Join(canonicalSkillDir, "SKILL.md")
content := []byte(embedded.SkillMD)
status, err := fsutil.CompareFile(dest, content)
if err != nil {
return err
}
switch status {
case fsutil.StatusUpToDate:
_, _ = fmt.Fprintf(cmd.OutOrStdout(), "Up to date %s\n", dest)
case fsutil.StatusDiffers:
_, _ = fmt.Fprintf(cmd.ErrOrStderr(), "Warning: overwriting modified %s\n", dest)
if err := os.WriteFile(dest, content, 0o644); err != nil {
return fmt.Errorf("could not write skill file: %w", err)
}
_, _ = fmt.Fprintf(cmd.OutOrStdout(), "Installed %s\n", dest)
default: // statusMissing
if err := os.MkdirAll(canonicalSkillDir, 0o755); err != nil {
return fmt.Errorf("could not create directory: %w", err)
}
if err := os.WriteFile(dest, content, 0o644); err != nil {
return fmt.Errorf("could not write skill file: %w", err)
}
_, _ = fmt.Fprintf(cmd.OutOrStdout(), "Installed %s\n", dest)
}
// Determine which agents to link.
targets := agentSkillDirs
if len(agents) > 0 {
targets = make(map[string]string)
for _, a := range agents {
dir, ok := agentSkillDirs[a]
if !ok {
_, _ = fmt.Fprintf(cmd.ErrOrStderr(), "Unknown agent %q (skipped) — known agents:", a)
for name := range agentSkillDirs {
_, _ = fmt.Fprintf(cmd.ErrOrStderr(), " %s", name)
}
_, _ = fmt.Fprintln(cmd.ErrOrStderr())
continue
}
targets[a] = dir
}
}
// Create symlinks (or copies) from agent-specific dirs to canonical.
for name, skillsDir := range targets {
agentSkillDir := filepath.Join(base, skillsDir, skillName)
if copyMode {
copyDest := filepath.Join(agentSkillDir, "SKILL.md")
if err := fsutil.EnsureDirForCopy(agentSkillDir); err != nil {
return fmt.Errorf("could not prepare %s directory: %w", name, err)
}
if err := os.MkdirAll(agentSkillDir, 0o755); err != nil {
return fmt.Errorf("could not create %s directory: %w", name, err)
}
if err := os.WriteFile(copyDest, []byte(embedded.SkillMD), 0o644); err != nil {
return fmt.Errorf("could not write %s skill file: %w", name, err)
}
_, _ = fmt.Fprintf(cmd.OutOrStdout(), "Copied %s\n", copyDest)
continue
}
// Compute relative symlink target. Symlinks resolve relative to
// the parent directory of the link, not the link itself.
rel, err := filepath.Rel(filepath.Dir(agentSkillDir), canonicalSkillDir)
if err != nil {
return fmt.Errorf("could not compute relative path for %s: %w", name, err)
}
if err := os.MkdirAll(filepath.Dir(agentSkillDir), 0o755); err != nil {
return fmt.Errorf("could not create %s directory: %w", name, err)
}
// Remove existing symlink/dir before creating.
_ = os.Remove(agentSkillDir)
if err := os.Symlink(rel, agentSkillDir); err != nil {
// Fall back to copy if symlink fails (e.g. Windows without dev mode).
copyDest := filepath.Join(agentSkillDir, "SKILL.md")
if mkErr := os.MkdirAll(agentSkillDir, 0o755); mkErr != nil {
return fmt.Errorf("could not create %s directory: %w", name, mkErr)
}
if wErr := os.WriteFile(copyDest, []byte(embedded.SkillMD), 0o644); wErr != nil {
return fmt.Errorf("could not write %s skill file: %w", name, wErr)
}
_, _ = fmt.Fprintf(cmd.OutOrStdout(), "Copied %s (symlink failed)\n", copyDest)
continue
}
_, _ = fmt.Fprintf(cmd.OutOrStdout(), "Linked %s -> %s\n", agentSkillDir, rel)
}
return nil
},
}
cmd.Flags().BoolVarP(&global, "global", "g", false, "Install to home directory instead of project")
cmd.Flags().BoolVar(&copyMode, "copy", false, "Copy files instead of symlinking")
cmd.Flags().StringSliceVarP(&agents, "agent", "a", nil, "Target specific agents (e.g. claude-code)")
return cmd
}
func installBase(global bool) (string, error) {
if global {
home, err := os.UserHomeDir()
if err != nil {
return "", fmt.Errorf("could not determine home directory: %w", err)
}
return home, nil
}
cwd, err := os.Getwd()
if err != nil {
return "", fmt.Errorf("could not determine working directory: %w", err)
}
return cwd, nil
}

View File

@@ -97,7 +97,6 @@ func Execute() error {
rootCmd.AddCommand(newConfigureCmd())
rootCmd.AddCommand(newValidateConfigCmd())
rootCmd.AddCommand(newServeCmd())
rootCmd.AddCommand(newInstallSkillCmd())
// Default command is chat, but intercept --version first
rootCmd.RunE = func(cmd *cobra.Command, args []string) error {

View File

@@ -23,7 +23,6 @@ import (
"github.com/charmbracelet/wish/ratelimiter"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/exitcodes"
"github.com/onyx-dot-app/onyx/cli/internal/tui"
"github.com/spf13/cobra"
"golang.org/x/time/rate"
@@ -296,15 +295,15 @@ provided via the ONYX_API_KEY environment variable to skip the prompt:
The server URL is taken from the server operator's config. The server
auto-generates an Ed25519 host key on first run if the key file does not
already exist. The host key path can also be set via the ONYX_SSH_HOST_KEY
environment variable (the --host-key flag takes precedence).`,
Example: ` onyx-cli serve --port 2222
ssh localhost -p 2222
onyx-cli serve --host 0.0.0.0 --port 2222
onyx-cli serve --idle-timeout 30m --max-session-timeout 2h`,
environment variable (the --host-key flag takes precedence).
Example:
onyx-cli serve --port 2222
ssh localhost -p 2222`,
RunE: func(cmd *cobra.Command, args []string) error {
serverCfg := config.Load()
if serverCfg.ServerURL == "" {
return exitcodes.New(exitcodes.NotConfigured, "server URL is not configured\n Run: onyx-cli configure")
return fmt.Errorf("server URL is not configured; run 'onyx-cli configure' first")
}
if !cmd.Flags().Changed("host-key") {
if v := os.Getenv(config.EnvSSHHostKey); v != "" {

View File

@@ -2,13 +2,11 @@ package cmd
import (
"context"
"errors"
"fmt"
"time"
"github.com/onyx-dot-app/onyx/cli/internal/api"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"github.com/onyx-dot-app/onyx/cli/internal/exitcodes"
"github.com/onyx-dot-app/onyx/cli/internal/version"
log "github.com/sirupsen/logrus"
"github.com/spf13/cobra"
@@ -18,21 +16,17 @@ func newValidateConfigCmd() *cobra.Command {
return &cobra.Command{
Use: "validate-config",
Short: "Validate configuration and test server connection",
Long: `Check that the CLI is configured, the server is reachable, and the API key
is valid. Also reports the server version and warns if it is below the
minimum required.`,
Example: ` onyx-cli validate-config`,
RunE: func(cmd *cobra.Command, args []string) error {
// Check config file
if !config.ConfigExists() {
return exitcodes.Newf(exitcodes.NotConfigured, "config file not found at %s\n Run: onyx-cli configure", config.ConfigFilePath())
return fmt.Errorf("config file not found at %s\n Run 'onyx-cli configure' to set up", config.ConfigFilePath())
}
cfg := config.Load()
// Check API key
if !cfg.IsConfigured() {
return exitcodes.New(exitcodes.NotConfigured, "API key is missing\n Run: onyx-cli configure")
return fmt.Errorf("API key is missing\n Run 'onyx-cli configure' to set up")
}
_, _ = fmt.Fprintf(cmd.OutOrStdout(), "Config: %s\n", config.ConfigFilePath())
@@ -41,11 +35,7 @@ minimum required.`,
// Test connection
client := api.NewClient(cfg)
if err := client.TestConnection(cmd.Context()); err != nil {
var authErr *api.AuthError
if errors.As(err, &authErr) {
return exitcodes.Newf(exitcodes.AuthFailure, "authentication failed: %v\n Reconfigure with: onyx-cli configure", err)
}
return exitcodes.Newf(exitcodes.Unreachable, "connection failed: %v\n Reconfigure with: onyx-cli configure", err)
return fmt.Errorf("connection failed: %w", err)
}
_, _ = fmt.Fprintln(cmd.OutOrStdout(), "Status: connected and authenticated")

View File

@@ -149,12 +149,12 @@ func (c *Client) TestConnection(ctx context.Context) error {
if resp2.StatusCode == 401 || resp2.StatusCode == 403 {
if isHTML || strings.Contains(respServer, "awselb") {
return &AuthError{Message: fmt.Sprintf("HTTP %d from a reverse proxy (not the Onyx backend).\n Check your deployment's ingress / proxy configuration", resp2.StatusCode)}
return fmt.Errorf("HTTP %d from a reverse proxy (not the Onyx backend).\n Check your deployment's ingress / proxy configuration", resp2.StatusCode)
}
if resp2.StatusCode == 401 {
return &AuthError{Message: fmt.Sprintf("invalid API key or token.\n %s", body)}
return fmt.Errorf("invalid API key or token.\n %s", body)
}
return &AuthError{Message: fmt.Sprintf("access denied — check that the API key is valid.\n %s", body)}
return fmt.Errorf("access denied — check that the API key is valid.\n %s", body)
}
detail := fmt.Sprintf("HTTP %d", resp2.StatusCode)


@@ -11,12 +11,3 @@ type OnyxAPIError struct {
func (e *OnyxAPIError) Error() string {
return fmt.Sprintf("HTTP %d: %s", e.StatusCode, e.Detail)
}
// AuthError is returned when authentication or authorization fails.
type AuthError struct {
Message string
}
func (e *AuthError) Error() string {
return e.Message
}


@@ -59,10 +59,8 @@ func ConfigExists() bool {
return err == nil
}
// LoadFromDisk reads config from the file only, without applying environment
// variable overrides. Use this when you need the persisted config values
// (e.g., to preserve them during a save operation).
func LoadFromDisk() OnyxCliConfig {
// Load reads config from file and applies environment variable overrides.
func Load() OnyxCliConfig {
cfg := DefaultConfig()
data, err := os.ReadFile(ConfigFilePath())
@@ -72,13 +70,6 @@ func LoadFromDisk() OnyxCliConfig {
}
}
return cfg
}
// Load reads config from file and applies environment variable overrides.
func Load() OnyxCliConfig {
cfg := LoadFromDisk()
// Environment overrides
if v := os.Getenv(EnvServerURL); v != "" {
cfg.ServerURL = v


@@ -1,187 +0,0 @@
---
name: onyx-cli
description: Query the Onyx knowledge base using the onyx-cli command. Use when the user wants to search company documents, ask questions about internal knowledge, query connected data sources, or look up information stored in Onyx.
---
# Onyx CLI — Agent Tool
Onyx is an enterprise search and Gen-AI platform that connects to company documents, apps, and people. The `onyx-cli` CLI provides non-interactive commands to query the Onyx knowledge base and list available agents.
## Prerequisites
### 1. Check if installed
```bash
which onyx-cli
```
### 2. Install (if needed)
**Primary — pip:**
```bash
pip install onyx-cli
```
**From source (Go):**
```bash
go build -o onyx-cli github.com/onyx-dot-app/onyx/cli && sudo mv onyx-cli /usr/local/bin/
```
### 3. Check if configured
```bash
onyx-cli validate-config
```
This checks that the config file exists and an API key is present, then tests the server connection via `/api/me`. Exit code 0 on success, non-zero with a descriptive error on failure.
If unconfigured, you have two options:
**Option A — Interactive setup (requires user input):**
```bash
onyx-cli configure
```
This prompts for the Onyx server URL and API key, tests the connection, and saves config.
**Option B — Environment variables (non-interactive, preferred for agents):**
```bash
export ONYX_SERVER_URL="https://your-onyx-server.com" # default: https://cloud.onyx.app
export ONYX_API_KEY="your-api-key"
```
Environment variables override the config file. If these are set, no config file is needed.
| Variable | Required | Description |
| ----------------- | -------- | -------------------------------------------------------- |
| `ONYX_SERVER_URL` | No | Onyx server base URL (default: `https://cloud.onyx.app`) |
| `ONYX_API_KEY` | Yes | API key for authentication |
| `ONYX_PERSONA_ID` | No | Default agent/persona ID |
If neither the config file nor environment variables are set, tell the user that `onyx-cli` needs to be configured and ask them to either:
- Run `onyx-cli configure` interactively, or
- Set `ONYX_SERVER_URL` and `ONYX_API_KEY` environment variables
## Commands
### Validate configuration
```bash
onyx-cli validate-config
```
Checks that the config file exists and an API key is present, then tests the server connection. Use this before `ask` or `agents` to confirm the CLI is properly set up.
### List available agents
```bash
onyx-cli agents
```
Prints a table of agent IDs, names, and descriptions. Use `--json` for structured output:
```bash
onyx-cli agents --json
```
Use agent IDs with `ask --agent-id` to query a specific agent.
### Basic query (plain text output)
```bash
onyx-cli ask "What is our company's PTO policy?"
```
Streams the answer as plain text to stdout. Exit code 0 on success, non-zero on error.
### JSON output (structured events)
```bash
onyx-cli ask --json "What authentication methods do we support?"
```
Outputs newline-delimited JSON (NDJSON): one parsed stream-event object per line. Key event types include message deltas, stop, errors, search start, and citation payloads.
Each line is a JSON object with this envelope:
```json
{"type": "<event_type>", "event": { ... }}
```
| Event Type | Description |
| ------------------- | -------------------------------------------------------------------- |
| `message_delta` | Content token — concatenate all `content` fields for the full answer |
| `stop` | Stream complete |
| `error` | Error with `error` message field |
| `search_tool_start` | Onyx started searching documents |
| `citation_info` | Source citation — see shape below |
`citation_info` event shape:
```json
{
"type": "citation_info",
"event": {
"citation_number": 1,
"document_id": "abc123def456",
"placement": { "turn_index": 0, "tab_index": 0, "sub_turn_index": null }
}
}
```
`placement` is metadata about where in the conversation the citation appeared and can be ignored for most use cases.
### Specify an agent
```bash
onyx-cli ask --agent-id 5 "Summarize our Q4 roadmap"
```
Uses a specific Onyx agent/persona instead of the default.
### All flags
| Flag | Type | Description |
| ------------ | ---- | ---------------------------------------------- |
| `--agent-id` | int | Agent ID to use (overrides default) |
| `--json` | bool | Output raw NDJSON events instead of plain text |
## Statelessness
Each `onyx-cli ask` call creates an independent chat session. There is no built-in way to chain context across multiple `ask` invocations — every call starts fresh. If you need multi-turn conversation with memory, use the interactive TUI (`onyx-cli` or `onyx-cli chat`) instead.
## When to Use
Use `onyx-cli ask` when:
- The user asks about company-specific information (policies, docs, processes)
- You need to search internal knowledge bases or connected data sources
- The user references Onyx, asks you to "search Onyx", or wants to query their documents
- You need context from company wikis, Confluence, Google Drive, Slack, or other connected sources
Do NOT use when:
- The question is about general programming knowledge (use your own knowledge)
- The user is asking about code in the current repository (use grep/read tools)
- The user hasn't mentioned Onyx and the question doesn't require internal company data
## Examples
```bash
# Simple question
onyx-cli ask "What are the steps to deploy to production?"
# Get structured output for parsing
onyx-cli ask --json "List all active API integrations"
# Use a specialized agent
onyx-cli ask --agent-id 3 "What were the action items from last week's standup?"
# Pipe the answer into another command
onyx-cli ask "What is the database schema for users?" | head -20
```


@@ -1,7 +0,0 @@
// Package embedded holds files that are compiled into the onyx-cli binary.
package embedded
import _ "embed"
//go:embed SKILL.md
var SkillMD string


@@ -1,33 +0,0 @@
// Package exitcodes defines semantic exit codes for the Onyx CLI.
package exitcodes
import "fmt"
const (
Success = 0
General = 1
BadRequest = 2 // invalid args / command-line errors (convention)
NotConfigured = 3
AuthFailure = 4
Unreachable = 5
)
// ExitError wraps an error with a specific exit code.
type ExitError struct {
Code int
Err error
}
func (e *ExitError) Error() string {
return e.Err.Error()
}
// New creates an ExitError with the given code and message.
func New(code int, msg string) *ExitError {
return &ExitError{Code: code, Err: fmt.Errorf("%s", msg)}
}
// Newf creates an ExitError with a formatted message.
func Newf(code int, format string, args ...any) *ExitError {
return &ExitError{Code: code, Err: fmt.Errorf(format, args...)}
}


@@ -1,40 +0,0 @@
package exitcodes
import (
"errors"
"fmt"
"testing"
)
func TestExitError_Error(t *testing.T) {
e := New(NotConfigured, "not configured")
if e.Error() != "not configured" {
t.Fatalf("expected 'not configured', got %q", e.Error())
}
if e.Code != NotConfigured {
t.Fatalf("expected code %d, got %d", NotConfigured, e.Code)
}
}
func TestExitError_Newf(t *testing.T) {
e := Newf(Unreachable, "cannot reach %s", "server")
if e.Error() != "cannot reach server" {
t.Fatalf("expected 'cannot reach server', got %q", e.Error())
}
if e.Code != Unreachable {
t.Fatalf("expected code %d, got %d", Unreachable, e.Code)
}
}
func TestExitError_ErrorsAs(t *testing.T) {
e := New(BadRequest, "bad input")
wrapped := fmt.Errorf("wrapper: %w", e)
var exitErr *ExitError
if !errors.As(wrapped, &exitErr) {
t.Fatal("errors.As should find ExitError")
}
if exitErr.Code != BadRequest {
t.Fatalf("expected code %d, got %d", BadRequest, exitErr.Code)
}
}


@@ -1,50 +0,0 @@
// Package fsutil provides filesystem helper functions.
package fsutil
import (
"bytes"
"errors"
"fmt"
"os"
)
// FileStatus describes how an on-disk file compares to expected content.
type FileStatus int
const (
StatusMissing FileStatus = iota
StatusUpToDate // file exists with identical content
StatusDiffers // file exists with different content
)
// CompareFile checks whether the file at path matches the expected content.
func CompareFile(path string, expected []byte) (FileStatus, error) {
existing, err := os.ReadFile(path)
if err != nil {
if errors.Is(err, os.ErrNotExist) {
return StatusMissing, nil
}
return 0, fmt.Errorf("could not read %s: %w", path, err)
}
if bytes.Equal(existing, expected) {
return StatusUpToDate, nil
}
return StatusDiffers, nil
}
// EnsureDirForCopy makes sure path is a real directory, not a symlink or
// regular file. If a symlink or file exists at path it is removed so the
// caller can create a directory with independent content.
func EnsureDirForCopy(path string) error {
info, err := os.Lstat(path)
if err == nil {
if info.Mode()&os.ModeSymlink != 0 || !info.IsDir() {
if err := os.Remove(path); err != nil {
return err
}
}
} else if !errors.Is(err, os.ErrNotExist) {
return err
}
return nil
}


@@ -1,116 +0,0 @@
package fsutil
import (
"os"
"path/filepath"
"testing"
)
// TestCompareFile verifies that CompareFile correctly distinguishes between a
// missing file, a file with matching content, and a file with different content.
func TestCompareFile(t *testing.T) {
tmpDir := t.TempDir()
path := filepath.Join(tmpDir, "skill.md")
expected := []byte("expected content")
status, err := CompareFile(path, expected)
if err != nil {
t.Fatalf("CompareFile on missing file failed: %v", err)
}
if status != StatusMissing {
t.Fatalf("expected StatusMissing, got %v", status)
}
if err := os.WriteFile(path, expected, 0o644); err != nil {
t.Fatalf("write expected file failed: %v", err)
}
status, err = CompareFile(path, expected)
if err != nil {
t.Fatalf("CompareFile on matching file failed: %v", err)
}
if status != StatusUpToDate {
t.Fatalf("expected StatusUpToDate, got %v", status)
}
if err := os.WriteFile(path, []byte("different content"), 0o644); err != nil {
t.Fatalf("write different file failed: %v", err)
}
status, err = CompareFile(path, expected)
if err != nil {
t.Fatalf("CompareFile on different file failed: %v", err)
}
if status != StatusDiffers {
t.Fatalf("expected StatusDiffers, got %v", status)
}
}
// TestEnsureDirForCopy verifies that EnsureDirForCopy clears symlinks and
// regular files so --copy can write a real directory, while leaving existing
// directories and missing paths untouched.
func TestEnsureDirForCopy(t *testing.T) {
t.Run("removes symlink", func(t *testing.T) {
tmpDir := t.TempDir()
targetDir := filepath.Join(tmpDir, "target")
linkPath := filepath.Join(tmpDir, "link")
if err := os.MkdirAll(targetDir, 0o755); err != nil {
t.Fatalf("mkdir target failed: %v", err)
}
if err := os.Symlink(targetDir, linkPath); err != nil {
t.Fatalf("create symlink failed: %v", err)
}
if err := EnsureDirForCopy(linkPath); err != nil {
t.Fatalf("EnsureDirForCopy failed: %v", err)
}
if _, err := os.Lstat(linkPath); !os.IsNotExist(err) {
t.Fatalf("expected symlink path to be removed, got err=%v", err)
}
})
t.Run("removes regular file", func(t *testing.T) {
tmpDir := t.TempDir()
filePath := filepath.Join(tmpDir, "onyx-cli")
if err := os.WriteFile(filePath, []byte("x"), 0o644); err != nil {
t.Fatalf("write file failed: %v", err)
}
if err := EnsureDirForCopy(filePath); err != nil {
t.Fatalf("EnsureDirForCopy failed: %v", err)
}
if _, err := os.Lstat(filePath); !os.IsNotExist(err) {
t.Fatalf("expected file path to be removed, got err=%v", err)
}
})
t.Run("keeps existing directory", func(t *testing.T) {
tmpDir := t.TempDir()
dirPath := filepath.Join(tmpDir, "onyx-cli")
if err := os.MkdirAll(dirPath, 0o755); err != nil {
t.Fatalf("mkdir failed: %v", err)
}
if err := EnsureDirForCopy(dirPath); err != nil {
t.Fatalf("EnsureDirForCopy failed: %v", err)
}
info, err := os.Lstat(dirPath)
if err != nil {
t.Fatalf("lstat directory failed: %v", err)
}
if !info.IsDir() {
t.Fatalf("expected directory to remain, got mode %v", info.Mode())
}
})
t.Run("missing path is no-op", func(t *testing.T) {
tmpDir := t.TempDir()
missingPath := filepath.Join(tmpDir, "does-not-exist")
if err := EnsureDirForCopy(missingPath); err != nil {
t.Fatalf("EnsureDirForCopy failed: %v", err)
}
})
}


@@ -1,121 +0,0 @@
// Package overflow provides a streaming writer that auto-truncates output
// for non-TTY callers (e.g., AI agents, scripts). Full content is saved to
// a temp file on disk; only the first N bytes are printed to stdout.
package overflow
import (
"fmt"
"os"
"strings"
log "github.com/sirupsen/logrus"
)
// Writer handles streaming output with optional truncation.
// When Limit > 0, it streams to a temp file on disk (not memory) and stops
// writing to stdout after Limit bytes. When Limit == 0, it writes directly
// to stdout. In Quiet mode, it buffers in memory and prints once at the end.
type Writer struct {
Limit int
Quiet bool
written int
totalBytes int
truncated bool
buf strings.Builder // used only in quiet mode
tmpFile *os.File // used only in truncation mode (Limit > 0)
}
// Write sends a chunk of content through the writer.
func (w *Writer) Write(s string) {
w.totalBytes += len(s)
// Quiet mode: buffer in memory, print nothing
if w.Quiet {
w.buf.WriteString(s)
return
}
if w.Limit <= 0 {
fmt.Print(s)
return
}
// Truncation mode: stream all content to temp file on disk
if w.tmpFile == nil {
f, err := os.CreateTemp("", "onyx-ask-*.txt")
if err != nil {
// Fall back to no-truncation if we can't create the file
fmt.Fprintf(os.Stderr, "warning: could not create temp file: %v\n", err)
w.Limit = 0
fmt.Print(s)
return
}
w.tmpFile = f
}
if _, err := w.tmpFile.WriteString(s); err != nil {
// Disk write failed — abandon truncation, stream directly to stdout
fmt.Fprintf(os.Stderr, "warning: temp file write failed: %v\n", err)
w.closeTmpFile(true)
w.Limit = 0
w.truncated = false
fmt.Print(s)
return
}
if w.truncated {
return
}
remaining := w.Limit - w.written
if len(s) <= remaining {
fmt.Print(s)
w.written += len(s)
} else {
if remaining > 0 {
fmt.Print(s[:remaining])
w.written += remaining
}
w.truncated = true
}
}
// Finish flushes remaining output. Call once after all Write calls are done.
func (w *Writer) Finish() {
// Quiet mode: print buffered content at once
if w.Quiet {
fmt.Println(w.buf.String())
return
}
if !w.truncated {
w.closeTmpFile(true) // clean up unused temp file
fmt.Println()
return
}
// Close the temp file so it's readable
tmpPath := w.tmpFile.Name()
w.closeTmpFile(false) // close but keep the file
fmt.Printf("\n\n--- response truncated (%d bytes total) ---\n", w.totalBytes)
fmt.Printf("Full response: %s\n", tmpPath)
fmt.Printf("Explore:\n")
fmt.Printf(" cat %s | grep \"<pattern>\"\n", tmpPath)
fmt.Printf(" cat %s | tail -50\n", tmpPath)
}
// closeTmpFile closes and optionally removes the temp file.
func (w *Writer) closeTmpFile(remove bool) {
if w.tmpFile == nil {
return
}
if err := w.tmpFile.Close(); err != nil {
log.Debugf("warning: failed to close temp file: %v", err)
}
if remove {
if err := os.Remove(w.tmpFile.Name()); err != nil {
log.Debugf("warning: failed to remove temp file: %v", err)
}
}
w.tmpFile = nil
}


@@ -1,95 +0,0 @@
package overflow
import (
"os"
"testing"
)
func TestWriter_NoLimit(t *testing.T) {
w := &Writer{Limit: 0}
w.Write("hello world")
if w.truncated {
t.Fatal("should not be truncated with limit 0")
}
if w.totalBytes != 11 {
t.Fatalf("expected 11 total bytes, got %d", w.totalBytes)
}
}
func TestWriter_UnderLimit(t *testing.T) {
w := &Writer{Limit: 100}
w.Write("hello")
w.Write(" world")
if w.truncated {
t.Fatal("should not be truncated when under limit")
}
if w.written != 11 {
t.Fatalf("expected 11 written bytes, got %d", w.written)
}
}
func TestWriter_OverLimit(t *testing.T) {
w := &Writer{Limit: 5}
w.Write("hello world") // 11 bytes, limit 5
if !w.truncated {
t.Fatal("should be truncated")
}
if w.written != 5 {
t.Fatalf("expected 5 written bytes, got %d", w.written)
}
if w.totalBytes != 11 {
t.Fatalf("expected 11 total bytes, got %d", w.totalBytes)
}
if w.tmpFile == nil {
t.Fatal("temp file should have been created")
}
_ = w.tmpFile.Close()
data, _ := os.ReadFile(w.tmpFile.Name())
_ = os.Remove(w.tmpFile.Name())
if string(data) != "hello world" {
t.Fatalf("temp file should contain full content, got %q", string(data))
}
}
func TestWriter_MultipleChunks(t *testing.T) {
w := &Writer{Limit: 10}
w.Write("hello") // 5 bytes
w.Write(" ") // 6 bytes
w.Write("world") // 11 bytes, crosses limit
w.Write("!") // 12 bytes, already truncated
if !w.truncated {
t.Fatal("should be truncated")
}
if w.written != 10 {
t.Fatalf("expected 10 written bytes, got %d", w.written)
}
if w.totalBytes != 12 {
t.Fatalf("expected 12 total bytes, got %d", w.totalBytes)
}
if w.tmpFile == nil {
t.Fatal("temp file should have been created")
}
_ = w.tmpFile.Close()
data, _ := os.ReadFile(w.tmpFile.Name())
_ = os.Remove(w.tmpFile.Name())
if string(data) != "hello world!" {
t.Fatalf("temp file should contain full content, got %q", string(data))
}
}
func TestWriter_QuietMode(t *testing.T) {
w := &Writer{Limit: 0, Quiet: true}
w.Write("hello")
w.Write(" world")
if w.written != 0 {
t.Fatalf("quiet mode should not write to stdout, got %d written", w.written)
}
if w.totalBytes != 11 {
t.Fatalf("expected 11 total bytes, got %d", w.totalBytes)
}
if w.buf.String() != "hello world" {
t.Fatalf("buffer should contain full content, got %q", w.buf.String())
}
}


@@ -1,83 +0,0 @@
// Package starprompt implements a one-time GitHub star prompt shown before the TUI.
// Skipped when stdin/stdout is not a TTY, when gh CLI is not installed,
// or when the user has already been prompted. State is stored in the
// config directory so it shows at most once per user.
package starprompt
import (
"bufio"
"fmt"
"os"
"os/exec"
"path/filepath"
"strings"
"time"
"github.com/onyx-dot-app/onyx/cli/internal/config"
"golang.org/x/term"
)
const repo = "onyx-dot-app/onyx"
func statePath() string {
return filepath.Join(config.ConfigDir(), ".star-prompted")
}
func hasBeenPrompted() bool {
_, err := os.Stat(statePath())
return err == nil
}
func markPrompted() {
_ = os.MkdirAll(config.ConfigDir(), 0o755)
f, err := os.Create(statePath())
if err == nil {
_ = f.Close()
}
}
func isGHInstalled() bool {
_, err := exec.LookPath("gh")
return err == nil
}
// MaybePrompt shows a one-time star prompt if conditions are met.
// It is safe to call unconditionally — it no-ops when not appropriate.
func MaybePrompt() {
if !term.IsTerminal(int(os.Stdin.Fd())) || !term.IsTerminal(int(os.Stdout.Fd())) {
return
}
if hasBeenPrompted() {
return
}
if !isGHInstalled() {
return
}
// Mark before asking so Ctrl+C won't cause a re-prompt.
markPrompted()
fmt.Print("Enjoying Onyx? Star the repo on GitHub? [Y/n] ")
reader := bufio.NewReader(os.Stdin)
answer, _ := reader.ReadString('\n')
answer = strings.TrimSpace(strings.ToLower(answer))
if answer == "n" || answer == "no" {
return
}
cmd := exec.Command("gh", "api", "-X", "PUT", "/user/starred/"+repo)
cmd.Env = append(os.Environ(), "GH_PAGER=")
if devnull, err := os.Open(os.DevNull); err == nil {
defer func() { _ = devnull.Close() }()
cmd.Stdin = devnull
cmd.Stdout = devnull
cmd.Stderr = devnull
}
if err := cmd.Run(); err != nil {
fmt.Println("Star us at: https://github.com/" + repo)
} else {
fmt.Println("Thanks for the star!")
time.Sleep(500 * time.Millisecond)
}
}


@@ -1,12 +1,10 @@
package main
import (
"errors"
"fmt"
"os"
"github.com/onyx-dot-app/onyx/cli/cmd"
"github.com/onyx-dot-app/onyx/cli/internal/exitcodes"
)
var (
@@ -20,10 +18,6 @@ func main() {
if err := cmd.Execute(); err != nil {
fmt.Fprintf(os.Stderr, "Error: %v\n", err)
var exitErr *exitcodes.ExitError
if errors.As(err, &exitErr) {
os.Exit(exitErr.Code)
}
os.Exit(1)
}
}


@@ -1302,18 +1302,4 @@ echo ""
print_info "Refer to the README in the ${INSTALL_ROOT} directory for more information."
echo ""
print_info "For help or issues, contact: founders@onyx.app"
echo ""
# --- GitHub star prompt (inspired by oh-my-codex) ---
# Only prompt in interactive mode and only if gh CLI is available.
# Uses the GitHub API directly (PUT /user/starred) like oh-my-codex.
if is_interactive && command -v gh &>/dev/null; then
prompt_yn_or_default "Enjoying Onyx? Star the repo on GitHub? [Y/n] " "Y"
if [[ ! "$REPLY" =~ ^[Nn] ]]; then
if GH_PAGER= gh api -X PUT /user/starred/onyx-dot-app/onyx < /dev/null >/dev/null 2>&1; then
print_success "Thanks for the star!"
else
print_info "Star us at: https://github.com/onyx-dot-app/onyx"
fi
fi
fi
echo ""


@@ -16,7 +16,7 @@ div.relative
```
- **`sidebar-heavy`** (default) — muted when unselected (text-03/text-02), bold when selected (text-04/text-03)
- **`sidebar-light`** — uniformly muted across all states (text-02/text-02)
- **`sidebar-light`** (via `lowlight`) — uniformly muted across all states (text-02/text-02)
- **Disabled** — both variants use text-02 foreground, transparent background, no hover/active states
- **Navigation** uses an absolutely positioned `<Link>` overlay rather than `href` on the Interactive element, so `rightChildren` can sit above it with `pointer-events-auto`.
@@ -24,10 +24,10 @@ div.relative
| Prop | Type | Default | Description |
|------|------|---------|-------------|
| `variant` | `"sidebar-heavy" \| "sidebar-light"` | `"sidebar-heavy"` | Sidebar color variant |
| `selected` | `boolean` | `false` | Active/selected state |
| `icon` | `IconFunctionComponent` | — | Left icon |
| `children` | `ReactNode` | — | Label text or custom content |
| `selected` | `boolean` | `false` | Active/selected state |
| `lowlight` | `boolean` | `false` | Uses muted `sidebar-light` variant |
| `disabled` | `boolean` | `false` | Disables the tab |
| `folded` | `boolean` | `false` | Collapses label, shows tooltip on hover |
| `nested` | `boolean` | `false` | Renders spacer instead of icon for indented items |
@@ -47,11 +47,6 @@ import { SvgSettings, SvgLock } from "@opal/icons";
Settings
</SidebarTab>
// Muted variant
<SidebarTab icon={SvgSettings} variant="sidebar-light">
Exit Admin Panel
</SidebarTab>
// Disabled enterprise-only tab
<SidebarTab icon={SvgLock} disabled>
Groups


@@ -1,13 +1,9 @@
import React from "react";
import type { Meta, StoryObj } from "@storybook/react";
import { SidebarTab } from "@opal/components/buttons/sidebar-tab/components";
import {
SvgSettings,
SvgUsers,
SvgLock,
SvgArrowUpCircle,
SvgTrash,
} from "@opal/icons";
import { SvgSettings, SvgUsers, SvgLock, SvgArrowUpCircle } from "@opal/icons";
import { Button } from "@opal/components";
import { SvgTrash } from "@opal/icons";
import * as TooltipPrimitive from "@radix-ui/react-tooltip";
const meta: Meta<typeof SidebarTab> = {
@@ -43,11 +39,11 @@ export const Selected: Story = {
},
};
export const Light: Story = {
export const Lowlight: Story = {
args: {
icon: SvgSettings,
children: "Settings",
variant: "sidebar-light",
lowlight: true,
},
};


@@ -3,7 +3,7 @@
import React from "react";
import type { ButtonType, IconFunctionComponent } from "@opal/types";
import type { Route } from "next";
import { Interactive, type InteractiveStatefulVariant } from "@opal/core";
import { Interactive } from "@opal/core";
import { ContentAction } from "@opal/layouts";
import { Text } from "@opal/components";
import Link from "next/link";
@@ -21,14 +21,8 @@ interface SidebarTabProps {
/** Marks this tab as the currently active/selected item. */
selected?: boolean;
/**
* Sidebar color variant.
* @default "sidebar-heavy"
*/
variant?: Extract<
InteractiveStatefulVariant,
"sidebar-light" | "sidebar-heavy"
>;
/** Uses the muted `sidebar-light` variant instead of `sidebar-heavy`. */
lowlight?: boolean;
/** Renders an empty spacer in place of the icon for nested items. */
nested?: boolean;
@@ -53,14 +47,14 @@ interface SidebarTabProps {
/**
* Sidebar navigation tab built on `Interactive.Stateful` > `Interactive.Container`.
*
* Uses `sidebar-heavy` (default) or `sidebar-light` (via `variant`) variants
* Uses `sidebar-heavy` (default) or `sidebar-light` (when `lowlight`) variants
* for color styling. Supports an overlay `Link` for client-side navigation,
* `rightChildren` for inline actions, and folded mode with an auto-tooltip.
*/
function SidebarTab({
folded,
selected,
variant = "sidebar-heavy",
lowlight,
nested,
disabled,
@@ -88,7 +82,7 @@ function SidebarTab({
const content = (
<div className="relative">
<Interactive.Stateful
variant={variant}
variant={lowlight ? "sidebar-light" : "sidebar-heavy"}
state={selected ? "selected" : "empty"}
disabled={disabled}
onClick={onClick}
@@ -127,7 +121,7 @@ function SidebarTab({
rightChildren={truncationSpacer}
/>
) : (
<div className="flex flex-row items-center gap-2 w-full">
<div className="flex flex-row items-center gap-2 flex-1">
{Icon && (
<div className="flex items-center justify-center p-0.5">
<Icon className="h-[1rem] w-[1rem] text-text-03" />
@@ -153,7 +147,7 @@ function SidebarTab({
side="right"
sideOffset={4}
>
{children}
<Text>{children}</Text>
</TooltipPrimitive.Content>
</TooltipPrimitive.Portal>
</TooltipPrimitive.Root>


@@ -586,10 +586,7 @@ export function Table<TData>(props: DataTableProps<TData>) {
// Data / Display cell
return (
<TableCell
key={cell.id}
data-column-id={cell.column.id}
>
<TableCell key={cell.id}>
{flexRender(
cell.column.columnDef.cell,
cell.getContext()


@@ -554,9 +554,9 @@
Sidebar-Heavy — Disabled (all states)
--------------------------------------------------------------------------- */
.interactive[data-interactive-variant="sidebar-heavy"][data-disabled] {
@apply bg-transparent opacity-50;
--interactive-foreground: var(--text-03);
--interactive-foreground-icon: var(--text-03);
@apply bg-transparent;
--interactive-foreground: var(--text-02);
--interactive-foreground-icon: var(--text-02);
}
/* ===========================================================================
@@ -619,7 +619,7 @@
Sidebar-Light — Disabled (all states)
--------------------------------------------------------------------------- */
.interactive[data-interactive-variant="sidebar-light"][data-disabled] {
@apply bg-transparent opacity-50;
--interactive-foreground: var(--text-03);
--interactive-foreground-icon: var(--text-03);
@apply bg-transparent;
--interactive-foreground: var(--text-02);
--interactive-foreground-icon: var(--text-02);
}


@@ -64,6 +64,7 @@ const BUSINESS_FEATURES: PlanFeature[] = [
{ icon: SvgKey, text: "Service Account API Keys" },
{ icon: SvgHardDrive, text: "Self-hosting (Optional)" },
{ icon: SvgPaintBrush, text: "Custom Theming" },
{ icon: SvgShareWebhook, text: "Hook Extensions" },
];
const ENTERPRISE_FEATURES: PlanFeature[] = [
@@ -71,7 +72,6 @@ const ENTERPRISE_FEATURES: PlanFeature[] = [
{ icon: SvgDashboard, text: "Full White-labeling" },
{ icon: SvgUserManage, text: "Custom Roles and Permissions" },
{ icon: SvgSliders, text: "Configurable Usage Limits" },
{ icon: SvgShareWebhook, text: "Hook Extensions" },
{ icon: SvgServer, text: "Custom Deployments" },
{ icon: SvgGlobe, text: "Region-Specific Data Processing" },
{ icon: SvgHeadsetMic, text: "Enterprise SLAs and Priority Support" },


@@ -3,7 +3,7 @@
import { usePathname } from "next/navigation";
import * as AppLayouts from "@/layouts/app-layouts";
import * as SettingsLayouts from "@/layouts/settings-layouts";
import { SidebarTab } from "@opal/components";
import SidebarTab from "@/refresh-components/buttons/SidebarTab";
import { SvgSliders } from "@opal/icons";
import { useUser } from "@/providers/UserProvider";
import { useAuthType } from "@/lib/hooks";


@@ -11,7 +11,7 @@ import {
} from "@/app/craft/hooks/useBuildSessionStore";
import { useUsageLimits } from "@/app/craft/hooks/useUsageLimits";
import { CRAFT_SEARCH_PARAM_NAMES } from "@/app/craft/services/searchParams";
import { SidebarTab } from "@opal/components";
import SidebarTab from "@/refresh-components/buttons/SidebarTab";
import Text from "@/refresh-components/texts/Text";
import SidebarWrapper from "@/sections/sidebar/SidebarWrapper";
import SidebarBody from "@/sections/sidebar/SidebarBody";
@@ -238,7 +238,9 @@ function BuildSessionButton({
<Text
as="p"
data-state={isActive ? "active" : "inactive"}
className="line-clamp-1 break-all text-left"
className={cn(
"sidebar-tab-text-defaulted line-clamp-1 break-all text-left"
)}
mainUiBody
>
<TypewriterText


@@ -467,10 +467,6 @@
/* Frost Overlay (for FrostedDiv component) - lighter in light mode */
--frost-overlay: var(--alpha-grey-00-10);
/* Scrollbar */
--scrollbar-track: transparent;
--scrollbar-thumb: var(--alpha-grey-100-20);
}
/* Dark Colors */
@@ -675,8 +671,4 @@
/* Frost Overlay (for FrostedDiv component) - darker in dark mode */
--frost-overlay: var(--alpha-grey-100-10);
/* Scrollbar */
--scrollbar-track: transparent;
--scrollbar-thumb: var(--alpha-grey-00-20);
}


@@ -0,0 +1,75 @@
/* Background classes */
.sidebar-tab-background-defaulted[data-state="active"] {
background-color: var(--background-tint-00);
}
.sidebar-tab-background-defaulted[data-state="inactive"] {
background-color: transparent;
}
.sidebar-tab-background-defaulted:hover {
background-color: var(--background-tint-03);
}
.sidebar-tab-background-lowlight[data-state="active"] {
background-color: var(--background-tint-00);
}
.sidebar-tab-background-lowlight[data-state="inactive"] {
background-color: transparent;
}
.sidebar-tab-background-lowlight:hover {
background-color: var(--background-tint-03);
}
.sidebar-tab-background-focused {
border: 2px solid var(--background-tint-04);
background-color: var(--background-neutral-00);
}
/* Text classes */
.sidebar-tab-text-defaulted[data-state="active"] {
color: var(--text-04);
}
.sidebar-tab-text-defaulted[data-state="inactive"] {
color: var(--text-03);
}
.group\/SidebarTab:hover .sidebar-tab-text-defaulted {
color: var(--text-04);
}
.sidebar-tab-text-lowlight[data-state="active"] {
color: var(--text-03);
}
.sidebar-tab-text-lowlight[data-state="inactive"] {
color: var(--text-02);
}
.group\/SidebarTab:hover .sidebar-tab-text-lowlight {
color: var(--text-03);
}
.sidebar-tab-text-focused {
color: var(--text-03);
}
/* Icon classes */
.sidebar-tab-icon-defaulted[data-state="active"] {
stroke: var(--text-04);
}
.sidebar-tab-icon-defaulted[data-state="inactive"] {
stroke: var(--text-03);
}
.group\/SidebarTab:hover .sidebar-tab-icon-defaulted {
stroke: var(--text-04);
}
.sidebar-tab-icon-lowlight[data-state="active"] {
stroke: var(--text-03);
}
.sidebar-tab-icon-lowlight[data-state="inactive"] {
stroke: var(--text-02);
}
.group\/SidebarTab:hover .sidebar-tab-icon-lowlight {
stroke: var(--text-03);
}
.sidebar-tab-icon-focused {
stroke: var(--text-02);
}
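
The new sidebar-tab.css rules key off a `data-state` attribute rather than component-level styles. A minimal sketch of how a tab component would derive that attribute from its `selected` prop (the names here are illustrative, not the actual Opal source):

```typescript
// Hypothetical helper: map a tab's `selected` prop onto the
// data-state value that the sidebar-tab.css selectors match on.
type TabState = "active" | "inactive";

function dataState(selected: boolean): TabState {
  return selected ? "active" : "inactive";
}

// Usage (sketch): <button data-state={dataState(selected)}
//   className="sidebar-tab-background-defaulted" />
```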

View File

@@ -1,3 +1,6 @@
@import url("https://fonts.googleapis.com/css2?family=Hanken+Grotesk:wght@400;500;600;700&display=swap");
@import url("https://fonts.googleapis.com/css2?family=DM+Mono:wght@400&display=swap");
@import "css/attachment-button.css";
@import "css/button.css";
@import "css/card.css";
@@ -9,6 +12,7 @@
@import "css/inputs.css";
@import "css/knowledge-table.css";
@import "css/line-item.css";
@import "css/sidebar-tab.css";
@import "css/sizes.css";
@import "css/square-button.css";
@import "css/switch.css";
@@ -127,8 +131,17 @@
}
@layer utilities {
/* Hide scrollbar for Chrome, Safari and Opera */
.no-scrollbar::-webkit-scrollbar {
display: none;
}
/* Hide scrollbar for IE, Edge and Firefox */
.no-scrollbar {
-ms-overflow-style: none;
/* IE and Edge */
scrollbar-width: none;
/* Firefox */
}
/* SHADOWS */
@@ -353,9 +366,27 @@
/* SCROLL BAR */
.default-scrollbar::-webkit-scrollbar {
width: 6px;
}
.default-scrollbar::-webkit-scrollbar-track {
background: #f1f1f1;
}
.default-scrollbar::-webkit-scrollbar-thumb {
background: #888;
border-radius: 4px;
}
.default-scrollbar::-webkit-scrollbar-thumb:hover {
background: #555;
}
.default-scrollbar {
scrollbar-width: thin;
scrollbar-color: #888 transparent;
overflow: overlay;
overflow-y: scroll;
overflow-x: hidden;
}
@@ -365,21 +396,78 @@
height: 100%;
}
.inputscroll {
scrollbar-width: none;
}
.inputscroll::-webkit-scrollbar-track {
background: #e5e7eb;
}
/* Ensure native scrollbars are visible */
@layer base {
* {
scrollbar-width: auto;
}
::-webkit-scrollbar {
width: 0px;
/* Vertical scrollbar width */
height: 8px;
/* Horizontal scrollbar height */
}
::-webkit-scrollbar-track {
background: transparent;
/* background: theme("colors.scrollbar.track"); */
/* Track background color */
}
/* Style the scrollbar handle */
::-webkit-scrollbar-thumb {
background: transparent;
/* background: theme("colors.scrollbar.thumb"); */
/* Handle color */
border-radius: 10px;
}
/* Handle on hover */
::-webkit-scrollbar-thumb:hover {
background: transparent;
/* background: theme("colors.scrollbar.thumb-hover"); */
/* Handle color on hover */
}
.dark-scrollbar::-webkit-scrollbar-thumb {
background: transparent;
/* background: theme("colors.scrollbar.dark.thumb"); */
/* Handle color */
border-radius: 10px;
}
.dark-scrollbar::-webkit-scrollbar-thumb:hover {
background: transparent;
/* background: theme("colors.scrollbar.dark.thumb-hover"); */
/* Handle color on hover */
}
/* TEXTAREA */
textarea::-webkit-scrollbar {
width: 8px;
}
textarea::-webkit-scrollbar-track {
background: var(--scrollbar-track);
border-radius: 4px;
}
textarea::-webkit-scrollbar-thumb {
background: var(--scrollbar-thumb);
border-radius: 4px;
}
textarea::-webkit-scrollbar-thumb:hover {
background: var(--scrollbar-thumb-hover);
}
textarea {
resize: vertical;
}
/* For Firefox */
textarea {
scrollbar-width: thin;
scrollbar-color: var(--scrollbar-thumb) var(--scrollbar-track);
}

View File

@@ -9,12 +9,11 @@ import { PHProvider } from "./providers";
import { Suspense } from "react";
import PostHogPageView from "./PostHogPageView";
import Script from "next/script";
import { DM_Mono, Hanken_Grotesk } from "next/font/google";
import { Hanken_Grotesk } from "next/font/google";
import { WebVitals } from "./web-vitals";
import { ThemeProvider } from "next-themes";
import { TooltipProvider } from "@/components/ui/tooltip";
import StatsOverlayLoader from "@/components/dev/StatsOverlayLoader";
import { cn } from "@/lib/utils";
import AppHealthBanner from "@/sections/AppHealthBanner";
import CustomAnalyticsScript from "@/providers/CustomAnalyticsScript";
import ProductGatingWrapper from "@/providers/ProductGatingWrapper";
@@ -24,29 +23,6 @@ const hankenGrotesk = Hanken_Grotesk({
subsets: ["latin"],
variable: "--font-hanken-grotesk",
display: "swap",
fallback: [
"-apple-system",
"BlinkMacSystemFont",
"Segoe UI",
"Roboto",
"sans-serif",
],
});
const dmMono = DM_Mono({
weight: "400",
subsets: ["latin"],
variable: "--font-dm-mono",
display: "swap",
fallback: [
"SF Mono",
"Monaco",
"Cascadia Code",
"Roboto Mono",
"Consolas",
"Courier New",
"monospace",
],
});
export const metadata: Metadata = {
@@ -68,7 +44,7 @@ export default function RootLayout({
return (
<html
lang="en"
className={cn(hankenGrotesk.variable, dmMono.variable)}
className={`${hankenGrotesk.variable}`}
suppressHydrationWarning
>
<head>

View File

@@ -5,7 +5,6 @@ import { SvgDownload, SvgTextLines } from "@opal/icons";
import Modal from "@/refresh-components/Modal";
import SimpleLoader from "@/refresh-components/loaders/SimpleLoader";
import CopyIconButton from "@/refresh-components/buttons/CopyIconButton";
import { Hoverable } from "@opal/core";
import { useHookExecutionLogs } from "@/ee/hooks/useHookExecutionLogs";
import { formatDateTimeLog } from "@/lib/dateUtils";
import { downloadFile } from "@/lib/download";
@@ -41,40 +40,33 @@ function SectionHeader({ label }: { label: string }) {
);
}
function LogRow({ log, group }: { log: HookExecutionRecord; group: string }) {
function LogRow({ log }: { log: HookExecutionRecord }) {
return (
<Hoverable.Root group={group}>
<Section
flexDirection="row"
justifyContent="start"
alignItems="start"
gap={0.5}
height="fit"
className="py-2"
>
{/* 1. Timestamp */}
<span className="shrink-0 text-code-code">
<Text font="secondary-mono-label" color="inherit" nowrap>
{formatDateTimeLog(log.created_at)}
</Text>
</span>
{/* 2. Error message */}
<span className="flex-1 min-w-0 break-all whitespace-pre-wrap text-code-code">
<Text font="secondary-mono" color="inherit">
{log.error_message ?? "Unknown error"}
</Text>
</span>
{/* 3. Copy button */}
<Section width="fit" height="fit" alignItems="center">
<Hoverable.Item group={group} variant="opacity-on-hover">
<CopyIconButton
size="xs"
getCopyText={() => log.error_message ?? ""}
/>
</Hoverable.Item>
</Section>
<Section
flexDirection="row"
justifyContent="start"
alignItems="start"
gap={0.5}
height="fit"
className="py-2"
>
{/* 1. Timestamp */}
<span className="shrink-0 text-code-code">
<Text font="secondary-mono-label" color="inherit" nowrap>
{formatDateTimeLog(log.created_at)}
</Text>
</span>
{/* 2. Error message */}
<span className="flex-1 min-w-0 break-all whitespace-pre-wrap text-code-code">
<Text font="secondary-mono" color="inherit">
{log.error_message ?? "Unknown error"}
</Text>
</span>
{/* 3. Copy button */}
<Section width="fit" height="fit" alignItems="center">
<CopyIconButton size="xs" getCopyText={() => log.error_message ?? ""} />
</Section>
</Hoverable.Root>
</Section>
);
}
@@ -134,11 +126,7 @@ export default function HookLogsModal({ hook, spec }: HookLogsModalProps) {
<>
<SectionHeader label="Past Hour" />
{recentErrors.map((log, idx) => (
<LogRow
key={log.created_at + String(idx)}
log={log}
group={log.created_at + String(idx)}
/>
<LogRow key={log.created_at + String(idx)} log={log} />
))}
</>
)}
@@ -146,11 +134,7 @@ export default function HookLogsModal({ hook, spec }: HookLogsModalProps) {
<>
<SectionHeader label="Older" />
{olderErrors.map((log, idx) => (
<LogRow
key={log.created_at + String(idx)}
log={log}
group={log.created_at + String(idx)}
/>
<LogRow key={log.created_at + String(idx)} log={log} />
))}
</>
)}

View File

@@ -3,7 +3,7 @@
import { useEffect, useRef, useState } from "react";
import { useCreateModal } from "@/refresh-components/contexts/ModalContext";
import { noProp } from "@/lib/utils";
import { formatDateTimeLog } from "@/lib/dateUtils";
import { formatTimeOnly } from "@/lib/dateUtils";
import { Button, Text } from "@opal/components";
import { Content } from "@opal/layouts";
import LineItem from "@/refresh-components/buttons/LineItem";
@@ -18,7 +18,6 @@ import {
SvgXOctagon,
} from "@opal/icons";
import CopyIconButton from "@/refresh-components/buttons/CopyIconButton";
import { Hoverable } from "@opal/core";
import { useHookExecutionLogs } from "@/ee/hooks/useHookExecutionLogs";
import HookLogsModal from "@/ee/refresh-pages/admin/HooksPage/HookLogsModal";
import type {
@@ -27,52 +26,6 @@ import type {
} from "@/ee/refresh-pages/admin/HooksPage/interfaces";
import { cn } from "@opal/utils";
function ErrorLogRow({
log,
group,
}: {
log: { created_at: string; error_message: string | null };
group: string;
}) {
return (
<Hoverable.Root group={group}>
<Section
flexDirection="column"
justifyContent="start"
alignItems="start"
gap={0.25}
padding={0.25}
height="fit"
>
<Section
flexDirection="row"
justifyContent="between"
alignItems="center"
gap={0}
height="fit"
>
<span className="text-code-code">
<Text font="secondary-mono-label" color="inherit">
{formatDateTimeLog(log.created_at)}
</Text>
</span>
<Hoverable.Item group={group} variant="opacity-on-hover">
<CopyIconButton
size="xs"
getCopyText={() => log.error_message ?? ""}
/>
</Hoverable.Item>
</Section>
<span className="break-all">
<Text font="secondary-mono" color="text-03">
{log.error_message ?? "Unknown error"}
</Text>
</span>
</Section>
</Hoverable.Root>
);
}
interface HookStatusPopoverProps {
hook: HookResponse;
spec: HookPointMeta | undefined;
@@ -90,16 +43,9 @@ export default function HookStatusPopover({
const [clickOpened, setClickOpened] = useState(false);
const closeTimerRef = useRef<ReturnType<typeof setTimeout> | null>(null);
const { hasRecentErrors, recentErrors, olderErrors, isLoading, error } =
const { hasRecentErrors, recentErrors, isLoading, error } =
useHookExecutionLogs(hook.id);
const topErrors = [...recentErrors, ...olderErrors]
.sort(
(a, b) =>
new Date(b.created_at).getTime() - new Date(a.created_at).getTime()
)
.slice(0, 3);
useEffect(() => {
return () => {
if (closeTimerRef.current) clearTimeout(closeTimerRef.current);
@@ -179,12 +125,7 @@ export default function HookStatusPopover({
<Button
prominence="tertiary"
rightIcon={({ className, ...props }) =>
hook.is_reachable === false ? (
<SvgXOctagon
{...props}
className={cn("text-status-error-05", className)}
/>
) : hasRecentErrors ? (
hasRecentErrors ? (
<SvgAlertTriangle
{...props}
className={cn("text-status-warning-05", className)}
@@ -201,7 +142,7 @@ export default function HookStatusPopover({
onClick={noProp(handleTriggerClick)}
disabled={isBusy}
>
{hook.is_reachable === false ? "Connection Lost" : "Connected"}
Connected
</Button>
</Popover.Anchor>
@@ -216,15 +157,7 @@ export default function HookStatusPopover({
justifyContent="start"
alignItems="start"
height="fit"
width={
hook.is_reachable === false
? topErrors.length > 0
? 20
: 12.5
: hasRecentErrors
? 20
: 12.5
}
width={hasRecentErrors ? 20 : 12.5}
padding={0.125}
gap={0.25}
>
@@ -236,70 +169,13 @@ export default function HookStatusPopover({
<Text font="secondary-body" color="text-03">
Failed to load logs.
</Text>
) : hook.is_reachable === false ? (
<>
<div className="p-1">
<Content
sizePreset="secondary"
variant="section"
icon={(props) => (
<SvgXOctagon
{...props}
className="text-status-error-05"
/>
)}
title="Most Recent Errors"
/>
</div>
{topErrors.length > 0 ? (
<>
<Separator noPadding className="px-2" />
<Section
flexDirection="column"
justifyContent="start"
alignItems="start"
gap={0.25}
padding={0.25}
height="fit"
>
{topErrors.map((log, idx) => (
<ErrorLogRow
key={log.created_at + String(idx)}
log={log}
group={log.created_at + String(idx)}
/>
))}
</Section>
</>
) : (
<Separator noPadding className="px-2" />
)}
<LineItem
muted
icon={SvgMaximize2}
onClick={noProp(() => {
handleOpenChange(false);
logsModal.toggle(true);
})}
>
View More Lines
</LineItem>
</>
) : hasRecentErrors ? (
<>
<div className="p-1">
<Content
sizePreset="secondary"
variant="section"
icon={(props) => (
<SvgXOctagon
{...props}
className="text-status-error-05"
/>
)}
icon={SvgXOctagon}
title={
recentErrors.length <= 3
? `${recentErrors.length} ${
@@ -323,11 +199,38 @@ export default function HookStatusPopover({
height="fit"
>
{recentErrors.slice(0, 3).map((log, idx) => (
<ErrorLogRow
<Section
key={log.created_at + String(idx)}
log={log}
group={log.created_at + String(idx)}
/>
flexDirection="column"
justifyContent="start"
alignItems="start"
gap={0.25}
padding={0.25}
height="fit"
>
<Section
flexDirection="row"
justifyContent="between"
alignItems="center"
gap={0}
height="fit"
>
<span className="text-code-code">
<Text font="secondary-mono-label" color="inherit">
{formatTimeOnly(log.created_at)}
</Text>
</span>
<CopyIconButton
size="xs"
getCopyText={() => log.error_message ?? ""}
/>
</Section>
<span className="break-all">
<Text font="secondary-mono" color="text-03">
{log.error_message ?? "Unknown error"}
</Text>
</span>
</Section>
))}
</Section>

View File

@@ -41,7 +41,6 @@ import {
activateHook,
deactivateHook,
deleteHook,
getHook,
validateHook,
} from "@/ee/refresh-pages/admin/HooksPage/svc";
import type {
@@ -320,16 +319,9 @@ function ConnectedHookCard({
toast.error(
err instanceof Error ? err.message : "Failed to validate hook."
);
return;
} finally {
setIsBusy(false);
}
try {
const updated = await getHook(hook.id);
onToggled(updated);
} catch (err) {
console.error("Failed to refresh hook after validation:", err);
}
}
const HookIcon = getHookPointIcon(hook.hook_point);
@@ -358,11 +350,8 @@ function ConnectedHookCard({
variant="section"
icon={HookIcon}
title={
!hook.is_active || hook.is_reachable === false
? markdown(`~~${hook.name}~~`)
: hook.name
!hook.is_active ? markdown(`~~${hook.name}~~`) : hook.name
}
suffix={!hook.is_active ? "(Disconnected)" : undefined}
description={`Hook Point: ${
spec?.display_name ?? hook.hook_point
}`}

View File

@@ -87,14 +87,6 @@ export async function deactivateHook(id: number): Promise<HookResponse> {
return res.json();
}
export async function getHook(id: number): Promise<HookResponse> {
const res = await fetch(`/api/admin/hooks/${id}`);
if (!res.ok) {
throw await parseError(res, "Failed to fetch hook");
}
return res.json();
}
export async function validateHook(id: number): Promise<HookValidateResponse> {
const res = await fetch(`/api/admin/hooks/${id}/validate`, {
method: "POST",

View File

@@ -0,0 +1,89 @@
import type { Meta, StoryObj } from "@storybook/react";
import SidebarTab from "./SidebarTab";
import {
SvgDashboard,
SvgSettings,
SvgUser,
SvgFolder,
SvgSearch,
} from "@opal/icons";
import * as TooltipPrimitive from "@radix-ui/react-tooltip";
const meta: Meta<typeof SidebarTab> = {
title: "refresh-components/buttons/SidebarTab",
component: SidebarTab,
tags: ["autodocs"],
decorators: [
(Story) => (
<TooltipPrimitive.Provider>
<div
style={{
width: 260,
background: "var(--background-neutral-01)",
padding: 8,
borderRadius: 12,
}}
>
<Story />
</div>
</TooltipPrimitive.Provider>
),
],
};
export default meta;
type Story = StoryObj<typeof SidebarTab>;
export const Default: Story = {
args: {
icon: SvgDashboard,
children: "Home",
},
};
export const Selected: Story = {
args: {
icon: SvgDashboard,
children: "Home",
selected: true,
},
};
export const Lowlight: Story = {
args: {
icon: SvgFolder,
children: "Archived",
lowlight: true,
},
};
export const Nested: Story = {
args: {
children: "Sub-item",
nested: true,
},
};
export const Folded: Story = {
args: {
icon: SvgDashboard,
children: "Home",
folded: true,
},
};
export const SidebarNavigation: Story = {
render: () => (
<div style={{ display: "flex", flexDirection: "column", gap: 2 }}>
<SidebarTab icon={SvgDashboard} selected>
Home
</SidebarTab>
<SidebarTab icon={SvgSearch}>Search</SidebarTab>
<SidebarTab icon={SvgFolder}>Documents</SidebarTab>
<SidebarTab icon={SvgUser}>Profile</SidebarTab>
<SidebarTab icon={SvgSettings}>Settings</SidebarTab>
<SidebarTab nested>Sub-item A</SidebarTab>
<SidebarTab nested>Sub-item B</SidebarTab>
</div>
),
};

View File

@@ -0,0 +1,4 @@
export {
SidebarTab as default,
type SidebarTabProps,
} from "@opal/components/buttons/sidebar-tab/components";

View File

@@ -11,10 +11,10 @@ import { useUser } from "@/providers/UserProvider";
import { UserRole } from "@/lib/types";
import { usePaidEnterpriseFeaturesEnabled } from "@/components/settings/usePaidEnterpriseFeaturesEnabled";
import { CombinedSettings } from "@/interfaces/settings";
import { SidebarTab } from "@opal/components";
import SidebarTab from "@/refresh-components/buttons/SidebarTab";
import SidebarBody from "@/sections/sidebar/SidebarBody";
import InputTypeIn from "@/refresh-components/inputs/InputTypeIn";
import Separator from "@/refresh-components/Separator";
import { Disabled } from "@opal/core";
import { SvgArrowUpCircle, SvgUserManage, SvgX } from "@opal/icons";
import {
useBillingInformation,
@@ -141,12 +141,15 @@ function buildItems(
if (!isCurator) {
if (hasSubscription) {
add(SECTIONS.ORGANIZATION, ADMIN_ROUTES.BILLING);
} else {
items.push({
section: SECTIONS.ORGANIZATION,
name: "Upgrade Plan",
icon: SvgArrowUpCircle,
link: ADMIN_ROUTES.BILLING.path,
});
}
addDisabled(
SECTIONS.ORGANIZATION,
ADMIN_ROUTES.TOKEN_RATE_LIMITS,
!enableEnterprise
);
add(SECTIONS.ORGANIZATION, ADMIN_ROUTES.TOKEN_RATE_LIMITS);
addDisabled(SECTIONS.ORGANIZATION, ADMIN_ROUTES.THEME, !enableEnterprise);
}
@@ -162,16 +165,6 @@ function buildItems(
}
}
// 8. Upgrade Plan (admin only, no subscription)
if (!isCurator && !hasSubscription) {
items.push({
section: SECTIONS.UNLABELED,
name: "Upgrade Plan",
icon: SvgArrowUpCircle,
link: ADMIN_ROUTES.BILLING.path,
});
}
return items;
}
@@ -231,10 +224,7 @@ export default function AdminSidebar({ enableCloudSS }: AdminSidebarProps) {
const { query, setQuery, filtered } = useFilter(allItems, itemExtractor);
const enabled = filtered.filter((item) => !item.disabled);
const disabled = filtered.filter((item) => item.disabled);
const enabledGroups = groupBySection(enabled);
const disabledGroups = groupBySection(disabled);
const groups = groupBySection(filtered);
return (
<SidebarWrapper>
@@ -245,7 +235,7 @@ export default function AdminSidebar({ enableCloudSS }: AdminSidebarProps) {
<SidebarTab
icon={({ className }) => <SvgX className={className} size={16} />}
href="/app"
variant="sidebar-light"
lowlight
>
Exit Admin Panel
</SidebarTab>
@@ -296,16 +286,26 @@ export default function AdminSidebar({ enableCloudSS }: AdminSidebarProps) {
</Section>
}
>
{enabledGroups.map((group, groupIndex) => {
const tabs = group.items.map(({ link, icon, name }) => (
<SidebarTab
key={link}
icon={icon}
href={link}
selected={pathname.startsWith(link)}
>
{name}
</SidebarTab>
{groups.map((group, groupIndex) => {
const tabs = group.items.map(({ link, icon, name, disabled }) => (
<Disabled key={link} disabled={disabled}>
{/*
# NOTE (@raunakab)
We intentionally add a `div` intermediary here.
Without it, the default disabled styling provided by the `Disabled` component (which we want here) would be overridden by the custom disabled styling provided by the `SidebarTab`.
Therefore, to avoid that override, we add a layer of indirection.
*/}
<div>
<SidebarTab
lowlight={disabled}
icon={icon}
href={disabled ? undefined : link}
selected={!disabled && pathname.startsWith(link)}
>
{name}
</SidebarTab>
</div>
</Disabled>
));
if (!group.section) {
@@ -318,22 +318,6 @@ export default function AdminSidebar({ enableCloudSS }: AdminSidebarProps) {
</SidebarSection>
);
})}
{disabledGroups.length > 0 && <Separator noPadding className="px-2" />}
{disabledGroups.map((group, groupIndex) => (
<SidebarSection
key={`disabled-${groupIndex}`}
title={group.section}
disabled
>
{group.items.map(({ link, icon, name }) => (
<SidebarTab key={link} disabled icon={icon}>
{name}
</SidebarTab>
))}
</SidebarSection>
))}
</SidebarBody>
</SidebarWrapper>
);
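
The prop mapping inside the new `Disabled` wrapper follows directly from the diff above: a disabled item renders lowlight, loses its `href`, and can never be selected. A self-contained sketch of that mapping (the helper name and interface are illustrative):

```typescript
// Sketch of the SidebarTab props computed per admin-sidebar item,
// mirroring the lowlight/href/selected logic in the diff above.
interface TabProps {
  lowlight: boolean;
  href?: string;
  selected: boolean;
}

function tabPropsFor(
  link: string,
  disabled: boolean,
  pathname: string
): TabProps {
  return {
    lowlight: disabled,
    // Disabled tabs must not navigate anywhere.
    href: disabled ? undefined : link,
    // A disabled tab is never shown as the active route.
    selected: !disabled && pathname.startsWith(link),
  };
}
```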

View File

@@ -4,7 +4,7 @@ import React, { memo } from "react";
import { MinimalPersonaSnapshot } from "@/app/admin/agents/interfaces";
import { usePinnedAgents, useCurrentAgent } from "@/hooks/useAgents";
import { cn, noProp } from "@/lib/utils";
import { SidebarTab } from "@opal/components";
import SidebarTab from "@/refresh-components/buttons/SidebarTab";
import IconButton from "@/refresh-components/buttons/IconButton";
import { useSortable } from "@dnd-kit/sortable";
import { CSS } from "@dnd-kit/utilities";

View File

@@ -51,7 +51,7 @@ import {
LOCAL_STORAGE_KEYS,
} from "@/sections/sidebar/constants";
import { showErrorNotification, handleMoveOperation } from "./sidebarUtils";
import { SidebarTab } from "@opal/components";
import SidebarTab from "@/refresh-components/buttons/SidebarTab";
import { ChatSession } from "@/app/app/interfaces";
import SidebarBody from "@/sections/sidebar/SidebarBody";
import { useUser } from "@/providers/UserProvider";
@@ -562,7 +562,7 @@ const MemoizedAppSidebarInner = memo(
href="/app/agents"
folded={folded}
selected={activeSidebarTab.isMoreAgents()}
variant={folded ? "sidebar-heavy" : "sidebar-light"}
lowlight={!folded}
>
{visibleAgents.length === 0 ? "Explore Agents" : "More Agents"}
</SidebarTab>
@@ -577,7 +577,7 @@ const MemoizedAppSidebarInner = memo(
onClick={() => createProjectModal.toggle(true)}
selected={createProjectModal.isOpen}
folded={folded}
variant={folded ? "sidebar-heavy" : "sidebar-light"}
lowlight={!folded}
>
New Project
</SidebarTab>

View File

@@ -18,7 +18,7 @@ import { useProjectsContext } from "@/providers/ProjectsContext";
import MoveCustomAgentChatModal from "@/components/modals/MoveCustomAgentChatModal";
import { UNNAMED_CHAT } from "@/lib/constants";
import ShareChatSessionModal from "@/sections/modals/ShareChatSessionModal";
import { SidebarTab } from "@opal/components";
import SidebarTab from "@/refresh-components/buttons/SidebarTab";
import IconButton from "@/refresh-components/buttons/IconButton";
import { Button } from "@opal/components";
import InputTypeIn from "@/refresh-components/inputs/InputTypeIn";

View File

@@ -10,7 +10,7 @@ import ChatButton from "@/sections/sidebar/ChatButton";
import { useAppRouter } from "@/hooks/appNavigation";
import { cn, noProp } from "@/lib/utils";
import { DRAG_TYPES } from "./constants";
import { SidebarTab } from "@opal/components";
import SidebarTab from "@/refresh-components/buttons/SidebarTab";
import IconButton from "@/refresh-components/buttons/IconButton";
import Truncated from "@/refresh-components/texts/Truncated";
import { Button } from "@opal/components";

View File

@@ -2,42 +2,34 @@
import React from "react";
import Text from "@/refresh-components/texts/Text";
import { Disabled, Hoverable } from "@opal/core";
import { cn } from "@/lib/utils";
export interface SidebarSectionProps {
title: string;
children?: React.ReactNode;
action?: React.ReactNode;
disabled?: boolean;
className?: string;
}
export default function SidebarSection({
title,
children,
action,
disabled,
className,
}: SidebarSectionProps) {
return (
<Hoverable.Root group="sidebar-section">
{/* Title */}
{/* NOTE: mr-1.5 is intentionally used instead of padding to avoid the background color
from overlapping with scrollbars on Safari.
*/}
<Disabled disabled={disabled}>
<div className="pl-2 mr-1.5 py-1 sticky top-0 bg-background-tint-02 z-10 flex flex-row items-center justify-between min-h-[2rem]">
<div className="p-0.5 w-full flex flex-col justify-center">
<Text secondaryBody text02>
{title}
</Text>
<div className={cn("flex flex-col group/SidebarSection", className)}>
<div className="pl-2 pr-1.5 py-1 sticky top-[0rem] bg-background-tint-02 z-10 flex flex-row items-center justify-between min-h-[2rem]">
<Text as="p" secondaryBody text02>
{title}
</Text>
{action && (
<div className="flex-shrink-0 opacity-0 group-hover/SidebarSection:opacity-100 transition-opacity">
{action}
</div>
{action && (
<Hoverable.Item group="sidebar-section">{action}</Hoverable.Item>
)}
</div>
</Disabled>
{/* Contents */}
{children}
</Hoverable.Root>
)}
</div>
<div>{children}</div>
</div>
);
}

View File

@@ -1,6 +1,6 @@
import { ReactNode } from "react";
import type { IconProps } from "@opal/types";
import { SidebarTab } from "@opal/components";
import SidebarTab from "@/refresh-components/buttons/SidebarTab";
import SidebarWrapper from "@/sections/sidebar/SidebarWrapper";
export interface StepSidebarProps {

View File

@@ -11,7 +11,7 @@ import { useUser } from "@/providers/UserProvider";
import LineItem from "@/refresh-components/buttons/LineItem";
import Popover, { PopoverMenu } from "@/refresh-components/Popover";
import { usePathname, useRouter, useSearchParams } from "next/navigation";
import { SidebarTab } from "@opal/components";
import SidebarTab from "@/refresh-components/buttons/SidebarTab";
import NotificationsPopover from "@/sections/sidebar/NotificationsPopover";
import {
SvgBell,

View File

@@ -59,10 +59,7 @@ for (const theme of THEMES) {
await expectScreenshot(page, {
name: `admin-${theme}-${slug}`,
mask: [
'[data-testid="admin-date-range-selector-button"]',
'[data-column-id="updated_at"]',
],
mask: ['[data-testid="admin-date-range-selector-button"]'],
});
},
{ box: true }

View File

@@ -142,6 +142,8 @@ test.describe("Chat Search Command Menu", () => {
}
await expect(dialog.getByText("Sessions")).toBeVisible();
await expectScreenshot(page, { name: "command-menu-sessions-filter" });
});
test('"Projects" filter expands to show all 4 projects', async ({ page }) => {

View File

@@ -135,65 +135,6 @@ export async function waitForAnimations(page: Page): Promise<void> {
});
}
/**
* Wait for every **visible** `<img>` on the page to finish loading (or error).
*
* This prevents screenshot flakiness caused by images that have been added to
* the DOM but haven't been decoded yet — `networkidle` only guarantees that
* fewer than 2 connections are in flight, not that every image is painted.
*
* Only images that are actually visible and in (or near) the viewport are
* waited on. Hidden images (e.g. the `dark:hidden` / `hidden dark:block`
* alternates created by `createLogoIcon`) and offscreen lazy-loaded images
* are skipped so they don't force a needless timeout.
*
* Times out after `timeoutMs` (default 5 000 ms) so a single broken image
* doesn't block the entire test forever.
*/
export async function waitForImages(
page: Page,
timeoutMs: number = 5_000
): Promise<void> {
await page.evaluate(async (timeout) => {
const images = Array.from(document.querySelectorAll("img")).filter(
(img) => {
// Skip images hidden via CSS (display:none, visibility:hidden, etc.)
// This covers createLogoIcon's dark-mode alternates.
const style = getComputedStyle(img);
if (
style.display === "none" ||
style.visibility === "hidden" ||
style.opacity === "0"
) {
return false;
}
// Skip images that have no layout box (zero size or detached).
const rect = img.getBoundingClientRect();
if (rect.width === 0 && rect.height === 0) return false;
// Skip images far below the viewport (lazy-loaded, not yet needed).
if (rect.top > window.innerHeight * 2) return false;
return true;
}
);
await Promise.race([
Promise.allSettled(
images.map((img) => {
if (img.complete) return Promise.resolve();
return new Promise<void>((resolve) => {
img.addEventListener("load", () => resolve(), { once: true });
img.addEventListener("error", () => resolve(), { once: true });
});
})
),
new Promise<void>((resolve) => setTimeout(resolve, timeout)),
]);
}, timeoutMs);
}
/**
* Take a screenshot and optionally assert it matches the stored baseline.
*
@@ -247,10 +188,6 @@ export async function expectScreenshot(
page.locator(selector)
);
// Wait for images to finish loading / decoding so that logo icons
// and other <img> elements are fully painted before the screenshot.
await waitForImages(page);
// Wait for any in-flight CSS animations / transitions to settle so that
// screenshots are deterministic (e.g. slide-in card animations on the
// onboarding flow).
@@ -342,9 +279,6 @@ export async function expectElementScreenshot(
page.locator(selector)
);
// Wait for images to finish loading / decoding.
await waitForImages(page);
// Wait for any in-flight CSS animations / transitions to settle so that
// element screenshots are deterministic (same reasoning as expectScreenshot).
await waitForAnimations(page);

View File