diff --git a/docs/superpowers/plans/2026-04-17-log-filters-multiselect-infinite-scroll.md b/docs/superpowers/plans/2026-04-17-log-filters-multiselect-infinite-scroll.md
new file mode 100644
index 00000000..428286a4
--- /dev/null
+++ b/docs/superpowers/plans/2026-04-17-log-filters-multiselect-infinite-scroll.md
@@ -0,0 +1,1733 @@
# Streaming Views: Multi-Select Filters + Infinite Scroll — Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Convert Application Log + Agent Timeline on the Runtime page into server-side-filtered, cursor-paginated infinite-scroll streams with top-gated auto-refetch. Introduce shared primitives so future streaming views adopt the same pattern.

**Architecture:** Backend widens `source` to a list on `/logs` and adds cursor pagination on `/agents/events`. UI introduces a `useInfiniteStream` hook + `InfiniteScrollArea` component; `AgentHealth` and `AgentInstance` consume them for both log and timeline blocks. Bounded views (`LogTab`, `StartupLogPanel`) pick up the new `source` badge but keep their single-page hooks.

**Tech Stack:** Java 17 / Spring Boot 3.4 / ClickHouse 24 (backend); React 19 / Tanstack Query v5 / @cameleer/design-system 0.1.56 (frontend).

**Spec:** `docs/superpowers/specs/2026-04-17-log-filters-multiselect-infinite-scroll-design.md`

---

## File Map

Created:
- `cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventPage.java`
- `cameleer-server-app/src/main/java/com/cameleer/server/app/dto/AgentEventPageResponse.java`
- `ui/src/hooks/useInfiniteStream.ts`
- `ui/src/components/InfiniteScrollArea.tsx`
- `ui/src/components/InfiniteScrollArea.module.css`

Modified:
- `cameleer-server-core/src/main/java/com/cameleer/server/core/search/LogSearchRequest.java`
- `cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventRepository.java`
- `cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventService.java`
- `cameleer-server-app/src/main/java/com/cameleer/server/app/search/ClickHouseLogStore.java`
- `cameleer-server-app/src/main/java/com/cameleer/server/app/controller/LogQueryController.java`
- `cameleer-server-app/src/main/java/com/cameleer/server/app/controller/AgentEventsController.java`
- `cameleer-server-app/src/main/java/com/cameleer/server/app/storage/ClickHouseAgentEventRepository.java`
- `cameleer-server-app/src/test/java/com/cameleer/server/app/search/ClickHouseLogStoreIT.java`
- `cameleer-server-app/src/test/java/com/cameleer/server/app/storage/ClickHouseAgentEventRepositoryIT.java`
- `ui/src/api/queries/logs.ts`
- `ui/src/api/queries/agents.ts`
- `ui/src/pages/AgentHealth/AgentHealth.tsx`
- `ui/src/pages/AgentInstance/AgentInstance.tsx`
- `ui/src/components/ExecutionDiagram/tabs/LogTab.tsx`
- `ui/src/api/openapi.json` (regenerated)
- `ui/src/api/schema.d.ts` (regenerated)
- `.claude/rules/app-classes.md`
- `.claude/rules/ui.md`

---

## Task 1: Widen log `source` filter to a multi-value list

**Files:**
- Modify: `cameleer-server-core/src/main/java/com/cameleer/server/core/search/LogSearchRequest.java`
- Modify: `cameleer-server-app/src/main/java/com/cameleer/server/app/search/ClickHouseLogStore.java:149-152`
- Modify: `cameleer-server-app/src/main/java/com/cameleer/server/app/controller/LogQueryController.java:46-69`
- Modify: `cameleer-server-app/src/test/java/com/cameleer/server/app/search/ClickHouseLogStoreIT.java`

- [ ] **Step 1: Add failing IT tests for multi-value `source`**

Append the two tests below to `ClickHouseLogStoreIT.java` (keeping existing tests untouched):

```java
    @Test
    void search_bySources_singleValue_filtersCorrectly() {
        Instant now = Instant.parse("2026-03-31T12:00:00Z");
        // "source" column is populated by indexBatch via LogEntry.getSource(); default is "app" when null.
        // Force one row to "container" via a direct insert to avoid coupling to LogEntry constructor.
        store.indexBatch("agent-1", "my-app", List.of(
            entry(now, "INFO", "logger", "app msg", "t1", null, null)
        ));
        jdbc.update("INSERT INTO logs (tenant_id, environment, timestamp, application, instance_id, level, " +
            "logger_name, message, thread_name, stack_trace, exchange_id, mdc, source) VALUES " +
            "(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
            "default", "default", java.sql.Timestamp.from(now.plusSeconds(1)), "my-app", "agent-1",
            "INFO", "logger", "container msg", "t1", "", "", java.util.Map.of(), "container");

        LogSearchResponse result = store.search(new LogSearchRequest(
            null, null, "my-app", null, null, null, null,
            List.of("container"), null, null, null, 100, "desc"));

        assertThat(result.data()).hasSize(1);
        assertThat(result.data().get(0).message()).isEqualTo("container msg");
    }

    @Test
    void search_bySources_multiValue_joinsAsOr() {
        Instant now = Instant.parse("2026-03-31T12:00:00Z");
        store.indexBatch("agent-1", "my-app", List.of(
            entry(now, "INFO", "logger", "app msg", "t1", null, null)
        ));
        jdbc.update("INSERT INTO logs (tenant_id, environment, timestamp, application, instance_id, level, " +
            "logger_name, message, thread_name, 
stack_trace, exchange_id, mdc, source) VALUES " +
            "(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
            "default", "default", java.sql.Timestamp.from(now.plusSeconds(1)), "my-app", "agent-1",
            "INFO", "logger", "container msg", "t1", "", "", java.util.Map.of(), "container");
        jdbc.update("INSERT INTO logs (tenant_id, environment, timestamp, application, instance_id, level, " +
            "logger_name, message, thread_name, stack_trace, exchange_id, mdc, source) VALUES " +
            "(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
            "default", "default", java.sql.Timestamp.from(now.plusSeconds(2)), "my-app", "agent-1",
            "INFO", "logger", "agent msg", "t1", "", "", java.util.Map.of(), "agent");

        LogSearchResponse result = store.search(new LogSearchRequest(
            null, null, "my-app", null, null, null, null,
            List.of("app", "container"), null, null, null, 100, "desc"));

        assertThat(result.data()).hasSize(2);
        assertThat(result.data()).extracting(LogEntryResult::message)
            .containsExactlyInAnyOrder("app msg", "container msg");
    }
```

Also update the existing call-site helper `req(...)` and the inline `new LogSearchRequest(...)` calls: the 8th constructor argument must now be `List<String>` (was `String`). Replace every `null` passed at that position with `null` — the compact constructor will normalize it to an empty list (same behavior as today). Replace every string literal passed at that position with `List.of(literal)` — there are no such cases in the existing IT file, so only the type signature of the record matters.

- [ ] **Step 2: Run tests to confirm they fail to compile**

Run: `mvn -pl cameleer-server-app test-compile -am`
Expected: FAIL with `incompatible types: List<String> cannot be converted to String` on the new tests (because `LogSearchRequest.source` is still a `String`, while the tests pass a list at that position).
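The "same behavior as today" claim about null normalization can be made concrete with a toy record that mirrors the compact constructor this task introduces. The class and record below are illustrative only, not project code:

```java
import java.util.List;

public class NormalizationSketch {

    // Mirrors LogSearchRequest's compact constructor: clamp limit, default sort, null -> empty list.
    record Req(List<String> sources, int limit, String sort) {
        Req {
            if (limit <= 0) limit = 100;
            if (limit > 500) limit = 500;
            if (sort == null || !"asc".equalsIgnoreCase(sort)) sort = "desc";
            if (sources == null) sources = List.of();
        }
    }

    public static void main(String[] args) {
        Req defaulted = new Req(null, 0, null);
        System.out.println(defaulted.sources()); // []
        System.out.println(defaulted.limit());   // 100
        System.out.println(defaulted.sort());    // desc
        System.out.println(new Req(List.of("app"), 9999, "desc").limit()); // 500
    }
}
```

Passing `null` at the sources position therefore yields exactly today's "no source filter" semantics: an empty list that the store treats as "no `IN` clause".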
- [ ] **Step 3: Change `LogSearchRequest.source` (`String`) → `sources` (`List<String>`)**

Replace the whole body of `LogSearchRequest.java` with:

```java
package com.cameleer.server.core.search;

import java.time.Instant;
import java.util.List;

/**
 * Immutable search criteria for querying application logs.
 *
 * @param q           free-text search across message and stack trace
 * @param levels      log level filter (e.g. ["WARN","ERROR"]), OR-joined
 * @param application application ID filter (nullable = all apps)
 * @param instanceId  agent instance ID filter
 * @param exchangeId  Camel exchange ID filter
 * @param logger      logger name substring filter
 * @param environment optional environment filter (e.g. "dev", "staging", "prod")
 * @param sources     optional source filter (e.g. ["app","container","agent"]), OR-joined
 * @param from        inclusive start of time range
 * @param to          inclusive end of time range
 * @param cursor      ISO timestamp cursor for keyset pagination
 * @param limit       page size (1-500, default 100)
 * @param sort        sort direction: "asc" or "desc" (default "desc")
 */
public record LogSearchRequest(
    String q,
    List<String> levels,
    String application,
    String instanceId,
    String exchangeId,
    String logger,
    String environment,
    List<String> sources,
    Instant from,
    Instant to,
    String cursor,
    int limit,
    String sort
) {

    private static final int DEFAULT_LIMIT = 100;
    private static final int MAX_LIMIT = 500;

    public LogSearchRequest {
        if (limit <= 0) limit = DEFAULT_LIMIT;
        if (limit > MAX_LIMIT) limit = MAX_LIMIT;
        if (sort == null || !"asc".equalsIgnoreCase(sort)) sort = "desc";
        if (levels == null) levels = List.of();
        if (sources == null) sources = List.of();
    }
}
```

- [ ] **Step 4: Update `ClickHouseLogStore.search` to use `IN (…)` on the sources list**

In `ClickHouseLogStore.java`, replace lines 149–152:

```java
        if (request.source() != null && !request.source().isEmpty()) {
            baseConditions.add("source = ?");
            baseParams.add(request.source());
        }
```

with:

```java
        if (request.sources() != null && !request.sources().isEmpty()) {
            String placeholders = String.join(", ", Collections.nCopies(request.sources().size(), "?"));
            baseConditions.add("source IN (" + placeholders + ")");
            for (String s : request.sources()) {
                baseParams.add(s);
            }
        }
```

`Collections` is already imported.

- [ ] **Step 5: Update `LogQueryController.searchLogs` to parse comma-split `source`**

Replace the `LogSearchRequest request = new LogSearchRequest(...)` construction block (around lines 67–69) — and adjust the variable-declaration block just above it — so the final section of the method reads:

```java
        List<String> sources = List.of();
        if (source != null && !source.isEmpty()) {
            sources = Arrays.stream(source.split(","))
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .toList();
        }

        Instant fromInstant = from != null ? Instant.parse(from) : null;
        Instant toInstant = to != null ? Instant.parse(to) : null;

        LogSearchRequest request = new LogSearchRequest(
            searchText, levels, application, instanceId, exchangeId,
            logger, env.slug(), sources, fromInstant, toInstant, cursor, limit, sort);
```

Keep the `@RequestParam(required = false) String source` parameter signature — the HTTP contract stays `?source=a,b`.

- [ ] **Step 6: Run all backend tests**

Run: `mvn -pl cameleer-server-core,cameleer-server-app verify`
Expected: PASS. The two new tests and every existing `ClickHouseLogStoreIT` test should be green.
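Taken together, the comma-split parsing from Step 5 and the `IN`-clause construction from Step 4 can be sanity-checked in isolation. This standalone sketch (class and method names illustrative) reuses the same stream pipeline and `Collections.nCopies` placeholder trick:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class SourceFilterSketch {

    // Step 5: ?source=a,b,c -> trimmed, non-empty values (OR semantics downstream).
    static List<String> parseSources(String source) {
        return Arrays.stream(source.split(","))
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .toList();
    }

    // Step 4: one positional placeholder per selected source.
    static String inClause(List<String> sources) {
        String placeholders = String.join(", ", Collections.nCopies(sources.size(), "?"));
        return "source IN (" + placeholders + ")";
    }

    public static void main(String[] args) {
        List<String> sources = parseSources("app, container,,agent");
        System.out.println(sources);           // [app, container, agent]
        System.out.println(inClause(sources)); // source IN (?, ?, ?)
    }
}
```

Note that stray whitespace and empty segments in the query string are dropped before any placeholder is emitted, so `?source=app,` never produces a dangling `?`.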
- [ ] **Step 7: Commit**

```bash
git add cameleer-server-core/src/main/java/com/cameleer/server/core/search/LogSearchRequest.java \
    cameleer-server-app/src/main/java/com/cameleer/server/app/search/ClickHouseLogStore.java \
    cameleer-server-app/src/main/java/com/cameleer/server/app/controller/LogQueryController.java \
    cameleer-server-app/src/test/java/com/cameleer/server/app/search/ClickHouseLogStoreIT.java
git commit -m "feat(logs): widen source filter to multi-value OR list

Replaces LogSearchRequest.source (String) with sources (List<String>)
and emits 'source IN (...)' when non-empty. LogQueryController parses
?source=a,b,c the same way it parses ?level=a,b,c."
```

---

## Task 2: Cursor pagination for agent events — core interfaces

**Files:**
- Create: `cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventPage.java`
- Modify: `cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventRepository.java`
- Modify: `cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventService.java`

- [ ] **Step 1: Create `AgentEventPage` record**

Create `cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventPage.java`:

```java
package com.cameleer.server.core.agent;

import java.util.List;

/**
 * Cursor-paginated result page for agent event queries.
 *
 * @param data       events on this page, ordered newest-first
 * @param nextCursor opaque cursor to pass back for the next page (null when no more)
 * @param hasMore    whether more results exist beyond this page
 */
public record AgentEventPage(
    List<AgentEventRecord> data,
    String nextCursor,
    boolean hasMore
) {}
```

- [ ] **Step 2: Add `queryPage` to `AgentEventRepository`**

Replace the body of `AgentEventRepository.java` with:

```java
package com.cameleer.server.core.agent;

import java.time.Instant;
import java.util.List;

public interface AgentEventRepository {

    void insert(String instanceId, String applicationId, String environment, String eventType, String detail);

    List<AgentEventRecord> query(String applicationId, String instanceId, String environment, Instant from, Instant to, int limit);

    /**
     * Cursor-paginated query ordered by (timestamp DESC, instance_id ASC). The cursor
     * is an opaque base64 string produced by the implementation; pass {@code null} for
     * the first page.
     */
    AgentEventPage queryPage(String applicationId, String instanceId, String environment,
                             Instant from, Instant to, String cursor, int limit);
}
```

- [ ] **Step 3: Add `queryEventPage` to `AgentEventService`**

Append this method to `AgentEventService.java` (after the existing `queryEvents`):

```java
    public AgentEventPage queryEventPage(String applicationId, String instanceId, String environment,
                                         Instant from, Instant to, String cursor, int limit) {
        return repository.queryPage(applicationId, instanceId, environment, from, to, cursor, limit);
    }
```

- [ ] **Step 4: Verify the core module compiles**

Run: `mvn -pl cameleer-server-core test-compile`
Expected: PASS — the core module and its tests compile on their own. A full multi-module build would fail at this point because `ClickHouseAgentEventRepository` (in the app module) does not yet implement `queryPage`; that failure is expected and is fixed in Task 3.

- [ ] **Step 5: Commit**

```bash
git add cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventPage.java \
    cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventRepository.java \
    cameleer-server-core/src/main/java/com/cameleer/server/core/agent/AgentEventService.java
git commit -m "feat(events): add AgentEventPage + queryPage interface

Introduces cursor-paginated query on AgentEventRepository. The cursor
format is owned by the implementation. The existing non-paginated
query(...) is kept for internal consumers."
```

---

## Task 3: ClickHouse implementation of event cursor pagination

**Files:**
- Modify: `cameleer-server-app/src/main/java/com/cameleer/server/app/storage/ClickHouseAgentEventRepository.java`
- Modify: `cameleer-server-app/src/test/java/com/cameleer/server/app/storage/ClickHouseAgentEventRepositoryIT.java`

Cursor format: `base64url("{timestampIso}|{instanceId}")`. Sort order is `(timestamp DESC, instance_id ASC)`. Keyset predicate for the desc+asc tie-break:
`(timestamp < cursorTs) OR (timestamp = cursorTs AND instance_id > cursorInstance)`.
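The cursor round-trip and the keyset predicate above can be sketched against an in-memory list before touching ClickHouse. This is a toy model, not the repository code — `Row`, `page`, and the helper names are illustrative:

```java
import java.nio.charset.StandardCharsets;
import java.time.Instant;
import java.util.ArrayList;
import java.util.Base64;
import java.util.Comparator;
import java.util.List;

public class CursorPagingSketch {
    record Row(Instant ts, String instanceId) {}

    // base64url("{timestampIso}|{instanceId}"), as in the task preamble.
    static String encode(Instant ts, String instanceId) {
        String raw = ts.toString() + "|" + instanceId;
        return Base64.getUrlEncoder().withoutPadding()
                .encodeToString(raw.getBytes(StandardCharsets.UTF_8));
    }

    static String[] decode(String cursor) {
        String decoded = new String(Base64.getUrlDecoder().decode(cursor), StandardCharsets.UTF_8);
        int bar = decoded.indexOf('|');
        return new String[] { decoded.substring(0, bar), decoded.substring(bar + 1) };
    }

    // One page of rows strictly "after" the cursor in (ts DESC, instanceId ASC) order.
    static List<Row> page(List<Row> rows, String cursor, int limit) {
        Instant cursorTs = null;
        String cursorInstance = null;
        if (cursor != null) {
            String[] c = decode(cursor);
            cursorTs = Instant.parse(c[0]);
            cursorInstance = c[1];
        }
        List<Row> sorted = rows.stream()
                .sorted(Comparator.comparing((Row r) -> r.ts()).reversed()
                        .thenComparing(r -> r.instanceId()))
                .toList();
        List<Row> out = new ArrayList<>();
        for (Row r : sorted) {
            if (cursorTs != null) {
                // keyset predicate: (ts < cursorTs) OR (ts = cursorTs AND instanceId > cursorInstance)
                boolean afterCursor = r.ts().isBefore(cursorTs)
                        || (r.ts().equals(cursorTs) && r.instanceId().compareTo(cursorInstance) > 0);
                if (!afterCursor) continue;
            }
            out.add(r);
            if (out.size() == limit) break;
        }
        return out;
    }

    public static void main(String[] args) {
        Instant t = Instant.parse("2026-04-01T10:00:00Z");
        List<Row> rows = List.of(
                new Row(t, "agent-z"), new Row(t, "agent-a"), new Row(t.plusSeconds(1), "agent-m"));
        List<Row> p1 = page(rows, null, 2);
        // newest first, then ties ascending by instance id
        System.out.println(p1.get(0).instanceId() + " " + p1.get(1).instanceId()); // agent-m agent-a
        String next = encode(p1.get(1).ts(), p1.get(1).instanceId());
        System.out.println(page(rows, next, 2).get(0).instanceId());              // agent-z
    }
}
```

Because the tie-break column participates in both the sort and the predicate, re-requesting with the returned cursor never skips or repeats a row, even when many rows share one timestamp — the property the ITs below pin down.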
- [ ] **Step 1: Add failing pagination tests**

Append to `ClickHouseAgentEventRepositoryIT.java` (keep existing tests as-is):

```java
    @Test
    void queryPage_emptyTable_returnsEmptyPage() {
        com.cameleer.server.core.agent.AgentEventPage page =
            repo.queryPage(null, null, null, null, null, null, 10);
        assertThat(page.data()).isEmpty();
        assertThat(page.hasMore()).isFalse();
        assertThat(page.nextCursor()).isNull();
    }

    @Test
    void queryPage_boundary_noHasMoreWhenLimitEqualsRowCount() {
        Instant base = Instant.parse("2026-04-01T10:00:00Z");
        for (int i = 0; i < 3; i++) {
            insertAt("agent-1", "app-a", "TICK", "t" + i, base.plusSeconds(i));
        }
        com.cameleer.server.core.agent.AgentEventPage page =
            repo.queryPage(null, null, null, null, null, null, 3);
        assertThat(page.data()).hasSize(3);
        assertThat(page.hasMore()).isFalse();
        assertThat(page.nextCursor()).isNull();
    }

    @Test
    void queryPage_paginatesAcrossThreePages() {
        Instant base = Instant.parse("2026-04-01T10:00:00Z");
        for (int i = 0; i < 5; i++) {
            insertAt("agent-1", "app-a", "TICK", "t" + i, base.plusSeconds(i));
        }

        com.cameleer.server.core.agent.AgentEventPage p1 =
            repo.queryPage(null, null, null, null, null, null, 2);
        assertThat(p1.data()).hasSize(2);
        assertThat(p1.hasMore()).isTrue();
        assertThat(p1.nextCursor()).isNotBlank();
        assertThat(p1.data().get(0).detail()).isEqualTo("t4");
        assertThat(p1.data().get(1).detail()).isEqualTo("t3");

        com.cameleer.server.core.agent.AgentEventPage p2 =
            repo.queryPage(null, null, null, null, null, p1.nextCursor(), 2);
        assertThat(p2.data()).hasSize(2);
        assertThat(p2.hasMore()).isTrue();
        assertThat(p2.data().get(0).detail()).isEqualTo("t2");
        assertThat(p2.data().get(1).detail()).isEqualTo("t1");

        com.cameleer.server.core.agent.AgentEventPage p3 =
            repo.queryPage(null, null, null, null, null, p2.nextCursor(), 2);
        assertThat(p3.data()).hasSize(1);
        assertThat(p3.hasMore()).isFalse();
        assertThat(p3.nextCursor()).isNull();
        assertThat(p3.data().get(0).detail()).isEqualTo("t0");
    }

    @Test
    void queryPage_tiebreakByInstanceIdAsc_whenTimestampsEqual() {
        Instant ts = Instant.parse("2026-04-01T10:00:00Z");
        insertAt("agent-z", "app-a", "TICK", "z", ts);
        insertAt("agent-a", "app-a", "TICK", "a", ts);
        insertAt("agent-m", "app-a", "TICK", "m", ts);

        com.cameleer.server.core.agent.AgentEventPage p1 =
            repo.queryPage(null, null, null, null, null, null, 2);
        assertThat(p1.data()).hasSize(2);
        // (timestamp DESC, instance_id ASC): ties resolve to a, m, z
        assertThat(p1.data().get(0).instanceId()).isEqualTo("agent-a");
        assertThat(p1.data().get(1).instanceId()).isEqualTo("agent-m");
        assertThat(p1.hasMore()).isTrue();

        com.cameleer.server.core.agent.AgentEventPage p2 =
            repo.queryPage(null, null, null, null, null, p1.nextCursor(), 2);
        assertThat(p2.data()).hasSize(1);
        assertThat(p2.data().get(0).instanceId()).isEqualTo("agent-z");
        assertThat(p2.hasMore()).isFalse();
    }
```

- [ ] **Step 2: Run the new tests to confirm they fail to compile**

Run: `mvn -pl cameleer-server-app test-compile -am`
Expected: FAIL — `ClickHouseAgentEventRepository` does not declare `queryPage`.

- [ ] **Step 3: Implement `queryPage` in `ClickHouseAgentEventRepository`**

Replace the whole file body with:

```java
package com.cameleer.server.app.storage;

import com.cameleer.server.core.agent.AgentEventPage;
import com.cameleer.server.core.agent.AgentEventRecord;
import com.cameleer.server.core.agent.AgentEventRepository;
import org.springframework.jdbc.core.JdbcTemplate;

import java.nio.charset.StandardCharsets;
import java.sql.Timestamp;
import java.time.Instant;
import java.util.ArrayList;
import java.util.Base64;
import java.util.List;

/**
 * ClickHouse implementation of {@link AgentEventRepository}.
 *

 * The ClickHouse table has no {@code id} column (no BIGSERIAL equivalent),
 * so all returned {@link AgentEventRecord} instances have {@code id = 0}.
 */
public class ClickHouseAgentEventRepository implements AgentEventRepository {

    private static final String INSERT_SQL =
        "INSERT INTO agent_events (tenant_id, instance_id, application_id, environment, event_type, detail) VALUES (?, ?, ?, ?, ?, ?)";

    private static final String SELECT_BASE =
        "SELECT 0 AS id, instance_id, application_id, event_type, detail, timestamp FROM agent_events WHERE tenant_id = ?";

    private final String tenantId;
    private final JdbcTemplate jdbc;

    public ClickHouseAgentEventRepository(String tenantId, JdbcTemplate jdbc) {
        this.tenantId = tenantId;
        this.jdbc = jdbc;
    }

    @Override
    public void insert(String instanceId, String applicationId, String environment, String eventType, String detail) {
        jdbc.update(INSERT_SQL, tenantId, instanceId, applicationId,
            environment != null ? environment : "default", eventType, detail);
    }

    @Override
    public List<AgentEventRecord> query(String applicationId, String instanceId, String environment, Instant from, Instant to, int limit) {
        var sql = new StringBuilder(SELECT_BASE);
        var params = new ArrayList<Object>();
        params.add(tenantId);

        if (applicationId != null) { sql.append(" AND application_id = ?"); params.add(applicationId); }
        if (instanceId != null) { sql.append(" AND instance_id = ?"); params.add(instanceId); }
        if (environment != null) { sql.append(" AND environment = ?"); params.add(environment); }
        if (from != null) { sql.append(" AND timestamp >= ?"); params.add(Timestamp.from(from)); }
        if (to != null) { sql.append(" AND timestamp < ?"); params.add(Timestamp.from(to)); }
        sql.append(" ORDER BY timestamp DESC LIMIT ?");
        params.add(limit);

        return jdbc.query(sql.toString(), (rs, rowNum) -> new AgentEventRecord(
            rs.getLong("id"),
            rs.getString("instance_id"),
            rs.getString("application_id"),
            rs.getString("event_type"),
            rs.getString("detail"),
            rs.getTimestamp("timestamp").toInstant()
        ), params.toArray());
    }

    @Override
    public AgentEventPage queryPage(String applicationId, String instanceId, String environment,
                                    Instant from, Instant to, String cursor, int limit) {
        var sql = new StringBuilder(SELECT_BASE);
        var params = new ArrayList<Object>();
        params.add(tenantId);

        if (applicationId != null) { sql.append(" AND application_id = ?"); params.add(applicationId); }
        if (instanceId != null) { sql.append(" AND instance_id = ?"); params.add(instanceId); }
        if (environment != null) { sql.append(" AND environment = ?"); params.add(environment); }
        if (from != null) { sql.append(" AND timestamp >= ?"); params.add(Timestamp.from(from)); }
        if (to != null) { sql.append(" AND timestamp < ?"); params.add(Timestamp.from(to)); }

        if (cursor != null && !cursor.isEmpty()) {
            String decoded = new String(Base64.getUrlDecoder().decode(cursor), StandardCharsets.UTF_8);
            int bar = decoded.indexOf('|');
            if (bar <= 0) {
                throw new IllegalArgumentException("Malformed cursor");
            }
            Instant cursorTs = Instant.parse(decoded.substring(0, bar));
            String cursorInstance = decoded.substring(bar + 1);
            sql.append(" AND (timestamp < ? OR (timestamp = ? AND instance_id > ?))");
            params.add(Timestamp.from(cursorTs));
            params.add(Timestamp.from(cursorTs));
            params.add(cursorInstance);
        }

        sql.append(" ORDER BY timestamp DESC, instance_id ASC LIMIT ?");
        int fetchLimit = limit + 1;
        params.add(fetchLimit);

        List<AgentEventRecord> results = new ArrayList<>(jdbc.query(sql.toString(),
            (rs, rowNum) -> new AgentEventRecord(
                rs.getLong("id"),
                rs.getString("instance_id"),
                rs.getString("application_id"),
                rs.getString("event_type"),
                rs.getString("detail"),
                rs.getTimestamp("timestamp").toInstant()
            ), params.toArray()));

        boolean hasMore = results.size() > limit;
        if (hasMore) {
            results = new ArrayList<>(results.subList(0, limit));
        }

        String nextCursor = null;
        if (hasMore && !results.isEmpty()) {
            AgentEventRecord last = results.get(results.size() - 1);
            String raw = last.timestamp().toString() + "|" + last.instanceId();
            nextCursor = Base64.getUrlEncoder().withoutPadding()
                .encodeToString(raw.getBytes(StandardCharsets.UTF_8));
        }

        return new AgentEventPage(results, nextCursor, hasMore);
    }
}
```

- [ ] **Step 4: Run the new tests**

Run: `mvn -pl cameleer-server-app test -Dtest=ClickHouseAgentEventRepositoryIT`
Expected: PASS on all four new tests plus the existing ones.

- [ ] **Step 5: Commit**

```bash
git add cameleer-server-app/src/main/java/com/cameleer/server/app/storage/ClickHouseAgentEventRepository.java \
    cameleer-server-app/src/test/java/com/cameleer/server/app/storage/ClickHouseAgentEventRepositoryIT.java
git commit -m "feat(events): cursor-paginate agent events (ClickHouse impl)

Orders by (timestamp DESC, instance_id ASC). Cursor is
base64url('timestampIso|instanceId') with a tuple keyset predicate
for stable paging across ties."
+``` + +--- + +## Task 4: Cursor-paginated `/agents/events` endpoint + +**Files:** +- Create: `cameleer-server-app/src/main/java/com/cameleer/server/app/dto/AgentEventPageResponse.java` +- Modify: `cameleer-server-app/src/main/java/com/cameleer/server/app/controller/AgentEventsController.java` + +- [ ] **Step 1: Create `AgentEventPageResponse` DTO** + +Create `cameleer-server-app/src/main/java/com/cameleer/server/app/dto/AgentEventPageResponse.java`: + +```java +package com.cameleer.server.app.dto; + +import io.swagger.v3.oas.annotations.media.Schema; + +import java.util.List; + +@Schema(description = "Cursor-paginated agent event list") +public record AgentEventPageResponse( + List data, + String nextCursor, + boolean hasMore +) {} +``` + +- [ ] **Step 2: Rewrite `AgentEventsController.getEvents` to return a page** + +Replace the entire file body with: + +```java +package com.cameleer.server.app.controller; + +import com.cameleer.server.app.dto.AgentEventPageResponse; +import com.cameleer.server.app.dto.AgentEventResponse; +import com.cameleer.server.app.web.EnvPath; +import com.cameleer.server.core.agent.AgentEventPage; +import com.cameleer.server.core.agent.AgentEventService; +import com.cameleer.server.core.runtime.Environment; +import io.swagger.v3.oas.annotations.Operation; +import io.swagger.v3.oas.annotations.responses.ApiResponse; +import io.swagger.v3.oas.annotations.tags.Tag; +import org.springframework.http.ResponseEntity; +import org.springframework.web.bind.annotation.GetMapping; +import org.springframework.web.bind.annotation.RequestMapping; +import org.springframework.web.bind.annotation.RequestParam; +import org.springframework.web.bind.annotation.RestController; + +import java.time.Instant; + +@RestController +@RequestMapping("/api/v1/environments/{envSlug}/agents/events") +@Tag(name = "Agent Events", description = "Agent lifecycle event log (env-scoped)") +public class AgentEventsController { + + private final AgentEventService agentEventService; 

    public AgentEventsController(AgentEventService agentEventService) {
        this.agentEventService = agentEventService;
    }

    @GetMapping
    @Operation(summary = "Query agent events in this environment",
        description = "Cursor-paginated. Returns newest first. Pass nextCursor back as ?cursor= for the next page.")
    @ApiResponse(responseCode = "200", description = "Event page returned")
    public ResponseEntity<AgentEventPageResponse> getEvents(
            @EnvPath Environment env,
            @RequestParam(required = false) String appId,
            @RequestParam(required = false) String agentId,
            @RequestParam(required = false) String from,
            @RequestParam(required = false) String to,
            @RequestParam(required = false) String cursor,
            @RequestParam(defaultValue = "50") int limit) {

        Instant fromInstant = from != null ? Instant.parse(from) : null;
        Instant toInstant = to != null ? Instant.parse(to) : null;

        AgentEventPage page = agentEventService.queryEventPage(
            appId, agentId, env.slug(), fromInstant, toInstant, cursor, limit);

        var data = page.data().stream().map(AgentEventResponse::from).toList();

        return ResponseEntity.ok(new AgentEventPageResponse(data, page.nextCursor(), page.hasMore()));
    }
}
```

- [ ] **Step 3: Run full backend build**

Run: `mvn -pl cameleer-server-core,cameleer-server-app verify`
Expected: PASS.

- [ ] **Step 4: Commit**

```bash
git add cameleer-server-app/src/main/java/com/cameleer/server/app/dto/AgentEventPageResponse.java \
    cameleer-server-app/src/main/java/com/cameleer/server/app/controller/AgentEventsController.java
git commit -m "feat(events): cursor-paginated GET /agents/events

Returns {data, nextCursor, hasMore} instead of a bare list. Adds
?cursor= param; existing filters (appId, agentId, from, to, limit)
unchanged. Ordering is (timestamp DESC, instance_id ASC)."
+``` + +--- + +## Task 5: Regenerate OpenAPI schema and TypeScript types + +**Files:** +- Modify: `ui/src/api/openapi.json` +- Modify: `ui/src/api/schema.d.ts` + +- [ ] **Step 1: Start the server locally** + +Run: `mvn -pl cameleer-server-app spring-boot:run` (or equivalent launcher in your dev setup; the target is a server listening on `:8081`). + +Verify in another terminal: `curl -sf http://localhost:8081/api/v1/api-docs > /dev/null && echo OK` +Expected: `OK`. + +- [ ] **Step 2: Fetch openapi.json and regenerate `schema.d.ts`** + +Run: `curl -s http://localhost:8081/api/v1/api-docs -o ui/src/api/openapi.json && (cd ui && npm run generate-api)` +Expected: both `openapi.json` and `schema.d.ts` updated with the new `AgentEventPageResponse` schema. + +- [ ] **Step 3: Type-check the SPA** + +Run: `cd ui && npx tsc -p tsconfig.app.json --noEmit` +Expected: errors at existing consumers of the old agent-events shape in `ui/src/api/queries/agents.ts` (fixed in Task 10) and possibly in `AgentHealth.tsx`/`AgentInstance.tsx` (fixed in Tasks 11–12). **Do not fix them in this task.** We'll fix them once all the new UI code is in place. Note the list of errors for your own reference. + +- [ ] **Step 4: Commit (allow the UI to be temporarily type-broken between tasks)** + +```bash +git add ui/src/api/openapi.json ui/src/api/schema.d.ts +git commit -m "chore: regenerate openapi.json + schema.d.ts for cursor-paginated events + +Reflects multi-value ?source= on /logs and the new AgentEventPageResponse +shape on /agents/events. UI type errors from this change are fixed in +subsequent commits." 
+``` + +--- + +## Task 6: `useInfiniteStream` hook + +**Files:** +- Create: `ui/src/hooks/useInfiniteStream.ts` + +- [ ] **Step 1: Write the hook** + +Create `ui/src/hooks/useInfiniteStream.ts`: + +```ts +import { useInfiniteQuery, useQueryClient } from '@tanstack/react-query'; +import { useMemo } from 'react'; + +export interface StreamPage { + data: T[]; + nextCursor: string | null; + hasMore: boolean; +} + +export interface UseInfiniteStreamArgs { + queryKey: readonly unknown[]; + fetchPage: (cursor: string | undefined) => Promise>; + enabled?: boolean; + /** When true, the query auto-refetches every refetchMs ms. When false, polling pauses. */ + isAtTop: boolean; + refetchMs?: number; + staleTime?: number; +} + +export interface UseInfiniteStreamResult { + items: T[]; + fetchNextPage: () => void; + hasNextPage: boolean; + isFetchingNextPage: boolean; + isLoading: boolean; + refresh: () => void; +} + +/** + * Thin wrapper over tanstack useInfiniteQuery that: + * - flattens pages into a single items[] array (newest first) + * - gates auto-refetch on isAtTop (so a user scrolled down does not lose their viewport) + * - exposes refresh() that invalidates the query (reset to page 1 on next render) + */ +export function useInfiniteStream(args: UseInfiniteStreamArgs): UseInfiniteStreamResult { + const { queryKey, fetchPage, enabled = true, isAtTop, refetchMs = 15_000, staleTime = 300 } = args; + const queryClient = useQueryClient(); + + const query = useInfiniteQuery, Error>({ + queryKey: [...queryKey], + initialPageParam: undefined as string | undefined, + queryFn: ({ pageParam }) => fetchPage(pageParam as string | undefined), + getNextPageParam: (last) => (last.hasMore && last.nextCursor ? last.nextCursor : undefined), + enabled, + refetchInterval: isAtTop ? refetchMs : false, + staleTime, + placeholderData: (prev) => prev, + }); + + const items = useMemo( + () => (query.data?.pages ?? 
[]).flatMap((p) => p.data), + [query.data], + ); + + return { + items, + fetchNextPage: () => { + if (query.hasNextPage && !query.isFetchingNextPage) query.fetchNextPage(); + }, + hasNextPage: !!query.hasNextPage, + isFetchingNextPage: query.isFetchingNextPage, + isLoading: query.isLoading, + refresh: () => queryClient.invalidateQueries({ queryKey: [...queryKey] }), + }; +} +``` + +- [ ] **Step 2: Type-check** + +Run: `cd ui && npx tsc -p tsconfig.app.json --noEmit` +Expected: no new errors from this file (errors from Task 5 may still show — that's fine). + +- [ ] **Step 3: Commit** + +```bash +git add ui/src/hooks/useInfiniteStream.ts +git commit -m "feat(ui): add useInfiniteStream hook + +Wraps tanstack useInfiniteQuery with cursor flattening, top-gated +polling, and a refresh() invalidator. Used by log and agent-event +streaming views." +``` + +--- + +## Task 7: `InfiniteScrollArea` component + +**Files:** +- Create: `ui/src/components/InfiniteScrollArea.tsx` +- Create: `ui/src/components/InfiniteScrollArea.module.css` + +- [ ] **Step 1: Write the component** + +Create `ui/src/components/InfiniteScrollArea.tsx`: + +```tsx +import { useEffect, useRef, type ReactNode, type RefObject } from 'react'; +import styles from './InfiniteScrollArea.module.css'; + +export interface InfiniteScrollAreaProps { + onEndReached: () => void; + onTopVisibilityChange?: (atTop: boolean) => void; + isFetchingNextPage: boolean; + hasNextPage: boolean; + maxHeight?: number | string; + children: ReactNode; + /** Optional caller-owned scroll container ref (e.g. for scroll-to-top on refresh). */ + scrollRef?: RefObject; + className?: string; +} + +export function InfiniteScrollArea({ + onEndReached, + onTopVisibilityChange, + isFetchingNextPage, + hasNextPage, + maxHeight = 360, + children, + scrollRef, + className, +}: InfiniteScrollAreaProps) { + const internalRef = useRef(null); + const containerRef = scrollRef ?? 
internalRef;
+  const topSentinel = useRef<HTMLDivElement>(null);
+  const bottomSentinel = useRef<HTMLDivElement>(null);
+
+  useEffect(() => {
+    if (!onTopVisibilityChange) return;
+    const root = containerRef.current;
+    const target = topSentinel.current;
+    if (!root || !target) return;
+    const obs = new IntersectionObserver(
+      (entries) => {
+        for (const e of entries) onTopVisibilityChange(e.isIntersecting);
+      },
+      { root, threshold: 1.0 },
+    );
+    obs.observe(target);
+    return () => obs.disconnect();
+  }, [containerRef, onTopVisibilityChange]);
+
+  useEffect(() => {
+    if (!hasNextPage) return;
+    const root = containerRef.current;
+    const target = bottomSentinel.current;
+    if (!root || !target) return;
+    const obs = new IntersectionObserver(
+      (entries) => {
+        for (const e of entries) if (e.isIntersecting) onEndReached();
+      },
+      { root, rootMargin: '100px', threshold: 0 },
+    );
+    obs.observe(target);
+    return () => obs.disconnect();
+  }, [containerRef, onEndReached, hasNextPage]);
+
+  return (
+    <div
+      ref={containerRef}
+      className={className ? `${styles.scrollArea} ${className}` : styles.scrollArea}
+      style={{ maxHeight }}
+    >
+      <div ref={topSentinel} />
+      {children}
+      {isFetchingNextPage && <div className={styles.loadingMore}>Loading more…</div>}
+      <div ref={bottomSentinel} />
+    </div>
+  );
+}
+```
+
+---
+
+## Task 10: Refactor `AgentHealth` Application Log + Timeline
+
+**Files:**
+- Modify: `ui/src/pages/AgentHealth/AgentHealth.tsx`
+
+- [ ] **Step 9: Wrap the Timeline in `InfiniteScrollArea`**
+
+Find the Timeline render block:
+
+```tsx
+          {feedEvents.length > 0 ? (
+            /* …existing timeline component… */
+          ) : (
+            <div>No events in the selected time range.</div>
+          )}
+```
+
+Replace with:
+
+```tsx
+        <InfiniteScrollArea
+          onEndReached={eventStream.fetchNextPage}
+          onTopVisibilityChange={setIsTimelineAtTop}
+          isFetchingNextPage={eventStream.isFetchingNextPage}
+          hasNextPage={eventStream.hasNextPage}
+          scrollRef={timelineScrollRef}
+        >
+          {feedEvents.length > 0 ? (
+            /* …existing timeline component, maxItems removed… */
+          ) : (
+            <div>
+              {eventStream.isLoading ? 'Loading events…' : 'No events in the selected time range.'}
+            </div>
+          )}
+        </InfiniteScrollArea>
+``` + +- [ ] **Step 10: Update line-counters shown in headers** + +Two small updates: +1. "X entries" next to the log `SectionHeader` now shows `logStream.items.length` (not `logEntries.length`). +2. "X events" next to the Timeline header now shows `eventStream.items.length`. + +Find: +```tsx + {logEntries.length} entries +``` +Replace with: +```tsx + {logStream.items.length} entries +``` + +Find: +```tsx + {feedEvents.length} events +``` +Replace with: +```tsx + {eventStream.items.length} events +``` + +- [ ] **Step 11: Add `useRef` import** + +If `useRef` isn't already imported at the top of `AgentHealth.tsx`, add it to the React imports. Current import: `import { useState, useMemo, useCallback, useRef, useEffect } from 'react';` — already includes `useRef`. No change. + +- [ ] **Step 12: Type-check** + +Run: `cd ui && npx tsc -p tsconfig.app.json --noEmit` +Expected: no errors in `AgentHealth.tsx`. Errors may remain in `AgentInstance.tsx` until Task 11. + +- [ ] **Step 13: Smoke-test in the browser** + +With a running backend and `cd ui && npm run dev`: + +1. Open the Runtime page with an app that produces logs. Verify logs render. +2. Click App, Agent, Container one at a time — each selects/deselects, and the filter is server-side (inspect Network: `source=app,container` appears on the request). +3. Select multiple levels — confirm Network has `level=INFO,ERROR` (etc). +4. Scroll down in the log panel — verify "Loading more…" flashes and older entries append. +5. Wait 15s at top — logs refresh silently. Scroll down; wait 15s — logs do NOT refresh. +6. Click the Refresh button — list resets to page 1 and scrolls to top. +7. Repeat for the Timeline panel (infinite scroll + top-gated refresh). + +- [ ] **Step 14: Commit** + +```bash +git add ui/src/pages/AgentHealth/AgentHealth.tsx +git commit -m "feat(ui): AgentHealth — server-side multi-select filters + infinite scroll + +Application Log: source + level filters move server-side; text search +stays client-side. 
Timeline: cursor-paginated via useInfiniteAgentEvents.
+Both wrapped in InfiniteScrollArea with top-gated auto-refetch."
+```
+
+---
+
+## Task 11: Refactor `AgentInstance` Application Log + Timeline
+
+**Files:**
+- Modify: `ui/src/pages/AgentInstance/AgentInstance.tsx`
+
+Same pattern as Task 10, but this page passes `instanceId` (→ `agentId` param) in addition to `appId`.
+
+- [ ] **Step 1: Update imports**
+
+At the top of `AgentInstance.tsx`:
+- Remove `useAgentEvents` from `from '../../api/queries/agents'`.
+- Remove `useApplicationLogs` from `from '../../api/queries/logs'`.
+- Add: `import { useInfiniteApplicationLogs } from '../../api/queries/logs';`
+- Add: `import { useInfiniteAgentEvents } from '../../api/queries/agents';`
+- Add: `import { InfiniteScrollArea } from '../../components/InfiniteScrollArea';`
+- Ensure `useRef` is in the React imports (add it if missing).
+
+- [ ] **Step 2: Replace log state + fetch (around line 39 and line 135)**
+
+Find:
+
+```tsx
+  const [logSource, setLogSource] = useState(''); // '' = all, 'app', 'agent'
+```
+
+Replace with:
+
+```tsx
+  const [logSources, setLogSources] = useState<Set<string>>(new Set());
+```
+
+Find (around line 135):
+
+```tsx
+  const { data: rawLogs } = useApplicationLogs(appId, instanceId, { toOverride: logRefreshTo, source: logSource || undefined });
+```
+
+Replace with:
+
+```tsx
+  const [isLogAtTop, setIsLogAtTop] = useState(true);
+  const logScrollRef = useRef<HTMLDivElement>(null);
+  const logStream = useInfiniteApplicationLogs({
+    application: appId,
+    agentId: instanceId,
+    sources: [...logSources],
+    levels: [...logLevels],
+    isAtTop: isLogAtTop,
+  });
+```
+
+Also remove the line that declares `logRefreshTo` if present (it was used by the old hook). Search for `logRefreshTo` and delete its `useState` line and any remaining uses (the Refresh button is rewired below).
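
Step 2's `Set<string>` state needs an immutable toggle when the ButtonGroup handlers are wired in Step 7, so React sees a new reference on every click. A minimal sketch of the pattern; the helper name `toggleInSet` is illustrative, not from the codebase:

```typescript
// Hypothetical helper: copy-and-toggle one value, suitable for state updaters
// like setLogSources((prev) => toggleInSet(prev, 'app')).
function toggleInSet<T>(prev: Set<T>, value: T): Set<T> {
  const next = new Set(prev); // new reference so React registers a state change
  if (next.has(value)) {
    next.delete(value);
  } else {
    next.add(value);
  }
  return next;
}

// Select 'app', then 'container', then deselect 'app'.
let sources = new Set<string>();
sources = toggleInSet(sources, 'app');
sources = toggleInSet(sources, 'container');
sources = toggleInSet(sources, 'app');
console.log([...sources].join(',')); // prints "container"
```

A Clear button for sources is then simply `setLogSources(new Set())`.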
+
+- [ ] **Step 3: Replace log entry mapping + filtering**
+
+Apply the same change as AgentHealth Step 3 — replace the `rawLogs.map(...)` mapping with `logStream.items.map(...)`, add `source: l.source ?? undefined`, and remove the level-filter line (server-side now).
+
+- [ ] **Step 4: Replace Timeline state + fetch (around line 49)**
+
+Find:
+
+```tsx
+  const { data: events } = useAgentEvents(appId, instanceId, 50, eventRefreshTo);
+```
+
+Replace with:
+
+```tsx
+  const [isTimelineAtTop, setIsTimelineAtTop] = useState(true);
+  const timelineScrollRef = useRef<HTMLDivElement>(null);
+  const eventStream = useInfiniteAgentEvents({ appId, agentId: instanceId, isAtTop: isTimelineAtTop });
+```
+
+Delete the `eventRefreshTo` `useState` if present.
+
+- [ ] **Step 5: Replace `feedEvents` mapping**
+
+Same as AgentHealth Step 5 — map from `eventStream.items`, synthesize an ID as `${e.timestamp}:${e.instanceId}:${e.eventType}`.
+
+- [ ] **Step 6: Replace the `LogViewer` block (around line 430)**
+
+Same transformation as AgentHealth Step 6 — wrap in `InfiniteScrollArea`, drop `maxHeight` on `LogViewer`, include `logSources.size > 0` in the empty-state predicate.
+
+- [ ] **Step 7: Rewire the ButtonGroups (around line 430)**
+
+Same transformation as AgentHealth Step 7 — replace the single-value source ButtonGroup with multi-select bound to `logSources`, add a Clear button for sources.
+
+- [ ] **Step 8: Rewire the Refresh buttons**
+
+Same transformation as AgentHealth Steps 8 + 9 — call `logStream.refresh()` + `logScrollRef.current?.scrollTo({ top: 0 })` for the log panel, and `eventStream.refresh()` + `timelineScrollRef.current?.scrollTo({ top: 0 })` for the timeline panel.
+
+- [ ] **Step 9: Wrap the timeline component in `InfiniteScrollArea`**
+
+Same transformation as AgentHealth Step 9 — take the existing timeline block and wrap it in `InfiniteScrollArea` with the timeline refs. Remove `maxItems={50}` — the scroll area now owns max-height and pagination.
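
The cursor contract Step 9 leans on (fetch while `hasMore` and `nextCursor` are set, then flatten pages in order) can be exercised without React. A sketch against a fake in-memory page source; every name in it is illustrative:

```typescript
// Same page shape the streaming endpoints return: { data, nextCursor, hasMore }.
interface Page<T> {
  data: T[];
  nextCursor: string | null;
  hasMore: boolean;
}

// Fake cursor-paginated source: 5 events, 2 per page, cursor = next start index.
const events = ['e1', 'e2', 'e3', 'e4', 'e5'];
async function fetchPage(cursor?: string): Promise<Page<string>> {
  const start = cursor ? Number(cursor) : 0;
  const data = events.slice(start, start + 2);
  const next = start + 2;
  return {
    data,
    nextCursor: next < events.length ? String(next) : null,
    hasMore: next < events.length,
  };
}

// Drain pages the way repeated fetchNextPage calls would, then flatten
// exactly like useInfiniteStream's items[] memo.
async function drain(): Promise<string[]> {
  const pages: Page<string>[] = [];
  let cursor: string | undefined = undefined;
  for (;;) {
    const page = await fetchPage(cursor);
    pages.push(page);
    if (!(page.hasMore && page.nextCursor)) break;
    cursor = page.nextCursor;
  }
  return pages.flatMap((p) => p.data);
}

drain().then((items) => console.log(items.join(','))); // prints "e1,e2,e3,e4,e5"
```

Note the stop condition mirrors `getNextPageParam`: pagination ends as soon as either `hasMore` is false or `nextCursor` is null.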
+ +- [ ] **Step 10: Update header counters** + +Replace any `logEntries.length` in a header span with `logStream.items.length`, and any `feedEvents.length` that reads raw count with `eventStream.items.length`. + +- [ ] **Step 11: Type-check** + +Run: `cd ui && npx tsc -p tsconfig.app.json --noEmit` +Expected: PASS (no TypeScript errors anywhere). + +- [ ] **Step 12: Smoke-test in the browser** + +Navigate to `/runtime/{app}/{instance}`. Repeat the verification steps from AgentHealth Step 13 but on the agent-instance page. + +- [ ] **Step 13: Commit** + +```bash +git add ui/src/pages/AgentInstance/AgentInstance.tsx +git commit -m "feat(ui): AgentInstance — server-side multi-select filters + infinite scroll + +Same pattern as AgentHealth, scoped to a single agent instance +(passes agentId to both log and timeline streams)." +``` + +--- + +## Task 12: `LogTab` source badge + +**Files:** +- Modify: `ui/src/components/ExecutionDiagram/tabs/LogTab.tsx` + +- [ ] **Step 1: Include `source` in the `LogEntry` mapping** + +Find (around lines 56–60): + +```tsx + return items.map((e) => ({ + timestamp: e.timestamp ?? '', + level: mapLogLevel(e.level), + message: e.message ?? '', + })); +``` + +Replace with: + +```tsx + return items.map((e) => ({ + timestamp: e.timestamp ?? '', + level: mapLogLevel(e.level), + message: e.message ?? '', + source: e.source ?? undefined, + })); +``` + +- [ ] **Step 2: Type-check** + +Run: `cd ui && npx tsc -p tsconfig.app.json --noEmit` +Expected: PASS. 
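
The `?? undefined` in Step 1 is doing real work: generated API types commonly model optional fields as `string | null`, while the viewer's optional props expect `string | undefined`. A self-contained sketch, with both interfaces as illustrative stand-ins rather than the real generated schema types:

```typescript
// Illustrative shapes only — the real types live in ui/src/api/schema.d.ts.
interface ApiLogEvent {
  timestamp?: string | null;
  message?: string | null;
  source?: string | null;
}
interface LogEntry {
  timestamp: string;
  message: string;
  source?: string;
}

// Normalizing null -> undefined keeps the optional prop optional instead of
// widening it to `string | null`, which strict prop types would reject.
function toEntry(e: ApiLogEvent): LogEntry {
  return {
    timestamp: e.timestamp ?? '',
    message: e.message ?? '',
    source: e.source ?? undefined,
  };
}

console.log(JSON.stringify(toEntry({ message: 'started', source: null })));
// prints {"timestamp":"","message":"started"}
```

An absent `source` simply renders no badge, which is the desired behavior for rows without one.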
+ +- [ ] **Step 3: Commit** + +```bash +git add ui/src/components/ExecutionDiagram/tabs/LogTab.tsx +git commit -m "feat(ui): LogTab — surface source badge on per-exchange log rows" +``` + +--- + +## Task 13: Rule-file updates + +**Files:** +- Modify: `.claude/rules/app-classes.md` +- Modify: `.claude/rules/ui.md` + +- [ ] **Step 1: Update `app-classes.md`** + +In the `LogQueryController` bullet, change the filters list from: + +``` +(filters: source, application, agentId, exchangeId, level, logger, q, time range) +``` + +to: + +``` +(filters: source (multi, comma-split), level (multi, comma-split), application, agentId, exchangeId, logger, q, time range). Cursor-paginated. +``` + +In the `AgentEventsController` bullet, change: + +``` +- `AgentEventsController` — GET `/api/v1/environments/{envSlug}/agents/events` (lifecycle events). +``` + +to: + +``` +- `AgentEventsController` — GET `/api/v1/environments/{envSlug}/agents/events` (lifecycle events; cursor-paginated, returns `{ data, nextCursor, hasMore }`; order `(timestamp DESC, instance_id ASC)`). +``` + +- [ ] **Step 2: Update `ui.md`** + +Under **Key UI Files**, add (after the `useLogs`/`useApplicationLogs` line or near the other hooks): + +``` +- `ui/src/hooks/useInfiniteStream.ts` — tanstack `useInfiniteQuery` wrapper with top-gated auto-refetch and flattened `items[]`. +- `ui/src/components/InfiniteScrollArea.tsx` — scrollable container with IntersectionObserver top/bottom sentinels. Streaming log/event views use this + `useInfiniteStream`. Bounded views (LogTab, StartupLogPanel) keep `useLogs`/`useStartupLogs`. +``` + +- [ ] **Step 3: Commit** + +```bash +git add .claude/rules/app-classes.md .claude/rules/ui.md +git commit -m "docs(rules): note cursor pagination + multi-source/level filters + +Reflects LogQueryController's multi-value filters, AgentEventsController's +cursor pagination shape, and the new useInfiniteStream/InfiniteScrollArea +primitives used by streaming views." 
+```
+
+---
+
+## Self-Review Notes
+
+- **Spec coverage:** all nine design sections map to tasks — §1 → T1, §2 → T2-T4, §3 → T6-T7, §4 → T8-T9, §5 → T10-T11, §6 → T10-T11 (top-gating wiring), §7 → T12 (LogTab source badge; StartupLogPanel inherits the badge from DS with no change), §8 → T5, §9 → T13.
+- **Naming consistency:** `logStream`/`eventStream` names, `isLogAtTop`/`isTimelineAtTop`, `logScrollRef`/`timelineScrollRef`, and the hook method names (`items`, `fetchNextPage`, `hasNextPage`, `isFetchingNextPage`, `refresh`) are consistent across tasks 8–11.
+- **Type consistency:** `LogSearchRequest.sources` (plural, `List<String>`) — the HTTP param stays `source` (singular, comma-split). `ClickHouseLogStore.request.sources()` matches the record accessor. UI `useInfiniteApplicationLogs({ sources: string[] })` → comma-joined on the wire, matching the backend parser.
+- **No placeholders.** Every code step shows the full block to write.
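
The type-consistency note's wire convention (values comma-joined into one param on the client, comma-split back into a list on the server) can be sketched end to end. Helper names are illustrative; the real split happens in the Java controllers:

```typescript
// Client side: join selected values into a single ?source= param; omit the
// param entirely when nothing is selected (server treats absence as "all").
function multiParam(name: string, values: string[]): string {
  return values.length ? `${name}=${values.map(encodeURIComponent).join(',')}` : '';
}

// Server-side equivalent of the comma-split parser, for illustration only:
// split, trim, and drop empty segments so "app, container," parses cleanly.
function splitMulti(raw: string | undefined): string[] {
  return raw ? raw.split(',').map((s) => s.trim()).filter(Boolean) : [];
}

console.log(multiParam('source', ['app', 'container'])); // prints "source=app,container"
console.log(splitMulti('INFO, ERROR').join('|')); // prints "INFO|ERROR"
```

Encoding each value individually (rather than the joined string) keeps the literal comma on the wire, matching the `source=app,container` requests the smoke-test steps expect to see in the Network tab.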