123 Commits

Author SHA1 Message Date
hsiegeln
9b7626f6ff fix: diagram rendering improvements
All checks were successful
CI / cleanup-branch (push) Has been skipped
CI / build (push) Successful in 57s
CI / docker (push) Successful in 52s
CI / deploy-feature (push) Has been skipped
CI / deploy (push) Successful in 37s
- Recursive compound rendering: CompoundNode checks if children are
  themselves compound types (WHEN inside CHOICE) and renders them
  recursively. Added EIP_WHEN, EIP_OTHERWISE, DO_CATCH, DO_FINALLY
  to frontend COMPOUND_TYPES.
- Edge z-ordering: edges are distributed to their containing compound
  and rendered after the background rect, so they're not hidden behind
  compound containers.
- Error section sizing: normalize error handler node coordinates to
  start at (0,0), compute red tint background height from actual
  content with symmetric padding for vertical centering.
- Toolbar as HTML overlay: moved from SVG foreignObject to an
  absolutely positioned HTML div so it stays at a fixed size at any
  zoom level. Uses design system tokens for consistent styling.
- Zoom: replaced viewBox approach with CSS transform on content group.
  Default zoom is 100% anchored top-left. Fit-to-view still available
  via button.
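The recursive compound check described above can be sketched roughly like this (the node shape and function names are illustrative, not the actual CompoundNode props):

```typescript
// Illustrative sketch of recursive compound rendering; the real
// CompoundNode is a React/SVG component with different props.
const COMPOUND_TYPES = new Set([
  "CHOICE", "EIP_WHEN", "EIP_OTHERWISE", "DO_CATCH", "DO_FINALLY",
]);

interface LayoutNode {
  id: string;
  type: string;
  children?: LayoutNode[];
}

// A compound node renders its container, then recurses into its
// children, so a WHEN nested inside a CHOICE is itself rendered
// as a compound rather than as a flat leaf.
function renderNode(node: LayoutNode, depth = 0): string[] {
  const indent = "  ".repeat(depth);
  if (COMPOUND_TYPES.has(node.type) && node.children?.length) {
    return [
      `${indent}[${node.type}] ${node.id}`,
      ...node.children.flatMap((c) => renderNode(c, depth + 1)),
    ];
  }
  return [`${indent}${node.id}`];
}
```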

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-27 16:33:24 +01:00
hsiegeln
20d1182259 fix: recursive compound nesting, fixed node width, zoom crash
All checks were successful
ELK renderer:
- Add EIP_WHEN, EIP_OTHERWISE, DO_CATCH, DO_FINALLY to COMPOUND_TYPES
  so branch body processors nest inside their containers
- Rewrite node creation and result extraction as recursive methods
  to support compound-inside-compound (CHOICE → WHEN → processors)
- Use fixed NODE_WIDTH=160 for leaf nodes instead of variable width

Frontend:
- Fix mousewheel crash: capture getBoundingClientRect() before
  setState updater (React nulls currentTarget after handler returns)
- Anchor fitToView to top-left instead of centering
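The crash-and-fix pattern can be sketched framework-free (makeWheelHandler and the zoom math are illustrative; in the real code this lives in a React onWheel handler):

```typescript
// Sketch of the mousewheel fix: React recycles the synthetic event once
// the handler returns, so e.currentTarget is null inside a setState
// updater that runs later. Capture the bounding rect synchronously.
type Rect = { left: number; top: number };
type Zoom = { scale: number; x: number; y: number };

const clamp = (v: number, lo: number, hi: number) =>
  Math.min(hi, Math.max(lo, v));

function makeWheelHandler(
  getRect: () => Rect, // stands in for e.currentTarget.getBoundingClientRect()
  setZoom: (updater: (prev: Zoom) => Zoom) => void,
) {
  return (clientX: number, clientY: number, deltaY: number) => {
    const rect = getRect(); // captured NOW, before the updater runs
    setZoom((prev) => {
      const scale = clamp(prev.scale * (deltaY < 0 ? 1.1 : 0.9), 0.25, 4);
      // keep the point under the cursor fixed while scaling
      const px = clientX - rect.left;
      const py = clientY - rect.top;
      const ratio = scale / prev.scale;
      return {
        scale,
        x: px - (px - prev.x) * ratio,
        y: py - (py - prev.y) * ratio,
      };
    });
  };
}
```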

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-27 14:26:35 +01:00
hsiegeln
afcb7d3175 fix: DevDiagram page uses time range and correct catalog shape
All checks were successful
The dev diagram page was calling useRouteCatalog() without time range
params (returned empty) and parsing the wrong response shape (expected
flat {application, routeId} but catalog returns {appId, routes[]}).
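A minimal sketch of the corrected parsing, assuming the shapes named above (the per-route field name routeId is an assumption beyond the commit text):

```typescript
// Assumed shapes: the catalog returns { appId, routes[] } per the commit;
// the per-route field name (routeId) is illustrative.
interface CatalogEntry {
  appId: string;
  routes: { routeId: string }[];
}

interface FlatRoute {
  application: string;
  routeId: string;
}

// Flatten the nested catalog into the flat shape the page iterates over.
function flattenCatalog(catalog: CatalogEntry[]): FlatRoute[] {
  return catalog.flatMap((e) =>
    e.routes.map((r) => ({ application: e.appId, routeId: r.routeId })),
  );
}
```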

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-27 14:05:32 +01:00
hsiegeln
ac32396a57 feat: add interactive ProcessDiagram SVG component (sub-project 1/3)
All checks were successful
New interactive route diagram component with SVG rendering using
server-computed ELK layout coordinates. TIBCO BW5-inspired top-bar
card node style with zoom/pan, hover toolbars, config badges, and
error handler sections below the main flow.

Backend: add direction query parameter (LR/TB) to diagram render
endpoints, defaulting to left-to-right layout.

Frontend: 14-file ProcessDiagram component in ui/src/components/
with DiagramNode, CompoundNode, DiagramEdge, ConfigBadge, NodeToolbar,
ErrorSection, ZoomControls, and supporting hooks. Dev test page at
/dev/diagram for validation.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-27 13:55:29 +01:00
hsiegeln
78e12f5cf9 fix: separate onException/errorHandler into distinct RouteFlow segments
All checks were successful
ON_EXCEPTION and ERROR_HANDLER nodes are now treated as compound containers
in the ELK diagram renderer, nesting their children. The frontend
diagram-mapping builds separate FlowSegments for each error handler,
displayed as distinct sections in the RouteFlow component.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-27 09:15:06 +01:00
hsiegeln
62709ce80b feat: include tap attributes in cmd-K full-text search
All checks were successful
Add attributes_text flattened field to OpenSearch indexing for both
execution and processor levels. Include in full-text search queries,
wildcard matching, and highlighting. Merge processor-level attributes
into ExecutionSummary. Add 'attribute' category to CommandPalette
(design-system 0.1.17) with per-key-value results in the search UI.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-27 08:13:58 +01:00
hsiegeln
ea88042ef5 fix: exclude search endpoint from audit log
All checks were successful
POST /api/v1/search/executions is a read-only query using POST for the
request body. Skip it in AuditInterceptor to avoid flooding the audit
log with search operations.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 23:55:24 +01:00
hsiegeln
cde79bd172 fix: remove stale diagramNodeId from test ProcessorRecord constructors
All checks were successful
TreeReconstructionTest and PostgresExecutionStoreIT still passed the
removed diagramNodeId parameter. Missed by mvn compile (main only);
caught by mvn verify (test compilation).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 23:40:13 +01:00
hsiegeln
a2a8e4ae3f feat: rename logForwardingLevel to applicationLogLevel, add agentLogLevel
Some checks failed
Align with cameleer3-common rename: logForwardingLevel → applicationLogLevel
(root logger) and new agentLogLevel (com.cameleer3 logger). Both fields
are on ApplicationConfig, pushed via config-update. UI shows "App Log Level"
and "Agent Log Level" on AppConfig slide-in, AgentHealth config bar, and
AppConfigDetailPage.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 23:36:31 +01:00
hsiegeln
6e187ccb48 feat: native TRACE log level with design system 0.1.16
Some checks failed
Map TRACE to its own 'trace' level instead of grouping with DEBUG,
now that the design system LogViewer supports it natively.
Bump @cameleer/design-system to 0.1.16.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 23:07:42 +01:00
hsiegeln
862a27b0b8 feat: add TRACE log level support across UI
Some checks failed
Add TRACE option to log forwarding level dropdowns (AppConfig,
AgentHealth), badge color mapping, and log filter ButtonGroups
on all pages that display application logs.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 23:03:15 +01:00
hsiegeln
d6c1f2c25b refactor: derive processor-route mapping from diagrams instead of executions
Some checks failed
Store application_name in route_diagrams at ingestion time (V7 migration),
resolve from agent registry same as ExecutionController. Move
findProcessorRouteMapping from ExecutionStore to DiagramStore using a
JSONB query that extracts node IDs directly from stored RouteGraph
definitions. This makes the mapping available as soon as diagrams are
sent, before any executions are recorded.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 23:00:10 +01:00
hsiegeln
100b780b47 refactor: remove diagramNodeId indirection, use processorId directly
Some checks failed
Agent now uses Camel processorId as RouteNode.id, eliminating the
nodeId mapping layer. Drop diagram_node_id column (V6 migration),
remove from ProcessorRecord/ProcessorNode/IngestionService/DetailService,
add /processor-routes endpoint for processorId→routeId lookup,
simplify frontend diagram-mapping and ExchangeDetail overlays,
replace N diagram fetches in AppConfigPage with single hook.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 22:44:07 +01:00
hsiegeln
bd63a8ce95 feat: App Config slide-in with Route column, clickable taps, and edit toolbar
All checks were successful
- Add Route column to Traces & Taps table (diagram-based mapping, pending backend fix)
- Make tap badges clickable to navigate to route's Taps tab
- Add edit/save/cancel toolbar with design system Button components
- Move Sampling Rate to last position in settings grid
- Support ?tab= URL param on RouteDetail for direct tab navigation
- Bump @cameleer/design-system to 0.1.15 (DetailPanel overlay + backdrop)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 22:26:28 +01:00
hsiegeln
ef9ec6069f fix: improve App Config slide-in panel layout
All checks were successful
- Narrowed panel from 640px to 520px so main table columns stay visible
- Settings grid uses CSS grid (3 columns) for proper wrapping
- Removed unused PanelActions component that caused white footer bar

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 21:49:03 +01:00
hsiegeln
bf84f1814f feat: convert App Config detail to slide-in DetailPanel
All checks were successful
Replaces the separate AppConfigDetailPage route with a 640px-wide
DetailPanel that slides in when clicking a row on the App Config
overview table. All editing functionality (settings, traces & taps,
route recording) is preserved inside the panel.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 21:44:30 +01:00
hsiegeln
00efaf0ca0 chore: bump @cameleer/design-system to 0.1.14
All checks were successful
Picks up LogViewer background fix (removes --bg-inset for consistent
card backgrounds).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 21:35:11 +01:00
hsiegeln
900b6f45c5 fix: use pencil and trash icons for tap row actions
All checks were successful
Replaces text "Edit"/"Del" buttons with pencil and trash can icon
buttons matching the style used elsewhere in the UI.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 21:32:05 +01:00
hsiegeln
dd6ea7563f feat: use Toggle switch for metrics setting on AgentHealth config bar
Some checks failed
Replaces the plain checkbox with the design system Toggle component
for consistency with the recording toggle on RouteDetail and
AppConfigDetailPage.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 21:30:35 +01:00
hsiegeln
57bb84a2df fix: align edit and save/cancel buttons after badges on AgentHealth
Some checks failed
Moved edit pencil and save/cancel actions to sit right after the last
badge field instead of at the start or far right of the config bar.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 21:28:30 +01:00
hsiegeln
a0fbf785c3 fix: move config edit button to right side of badges on AgentHealth
Some checks failed
Moved the pencil edit button after the badge fields and added
margin-left: auto to push it to the far right of the config bar.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 21:27:01 +01:00
hsiegeln
91e51d4f6a feat: show configured taps count on Admin App Config overview
All checks were successful
New Taps column shows enabled/total count as a badge (e.g. "2/3")
next to the existing Traced column.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 21:22:59 +01:00
hsiegeln
b52d588fc5 feat: add tooltips to tap attribute type selector buttons
All checks were successful
Each type option now shows a descriptive tooltip on hover explaining
its purpose: Business Object (key identifiers), Correlation (cross-route
linking), Event (business events), Custom (general purpose).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 19:47:39 +01:00
hsiegeln
23b23bbb66 fix: replace crypto.randomUUID with fallback for non-HTTPS contexts
Some checks failed
crypto.randomUUID() requires a secure context (HTTPS). Since the server
may be accessed via HTTP, use a timestamp + random string ID instead.
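A minimal sketch of such a fallback (the exact ID format used in the commit is not shown, so this composition is illustrative):

```typescript
// crypto.randomUUID() is only available in secure contexts (HTTPS,
// localhost); fall back to a timestamp + random string elsewhere.
function generateId(): string {
  if (typeof crypto !== "undefined" && typeof crypto.randomUUID === "function") {
    return crypto.randomUUID();
  }
  // base-36 timestamp plus 8 random base-36 chars, e.g. "m8kq1x2a-f3g7h9k2"
  return `${Date.now().toString(36)}-${Math.random().toString(36).slice(2, 10)}`;
}
```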

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 19:46:32 +01:00
hsiegeln
82b47f4364 fix: use design system status tokens for test expression result alerts
All checks were successful
Replaces hardcoded dark-theme hex fallbacks with proper tokens from
tokens.css: --success-bg/--success-border/--success for success and
--error-bg/--error-border/--error for errors. Works in both themes.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 19:38:24 +01:00
hsiegeln
e4b2dd2604 fix: use design system tokens for tap type selector active state
Some checks failed
The active type option was invisible because --accent-primary doesn't
exist in the design system. Now uses --amber-bg/--amber-deep/--amber
from tokens.css for a clearly visible selected state matching the
brand accent palette.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 19:37:12 +01:00
hsiegeln
3b31e69ae4 chore: regenerate openapi.json and schema.d.ts from live server
All checks were successful
Updated types now include attributes on ExecutionDetail, ProcessorNode,
and ExecutionSummary from the actual API. Removed stale detail.children
fallback that no longer exists in the schema.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 19:22:55 +01:00
hsiegeln
499fd7f8e8 fix: accept ISO datetime for audit log from/to parameters
All checks were successful
The frontend sends full ISO timestamps (e.g. 2026-03-19T17:55:29Z) but
the controller expected LocalDate (yyyy-MM-dd). The mismatched format
made the parameters come through as null, which threw a
NullPointerException in the repository WHERE clause. Changed the
endpoint to accept Instant directly, with sensible defaults (last 7 days).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 19:07:09 +01:00
hsiegeln
1080c76e99 feat: wire attributes from RouteExecution/ProcessorExecution into storage
All checks were successful
Replaces null placeholders with actual getAttributes() calls now that
cameleer3-common SNAPSHOT is resolved with attributes support.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 19:03:18 +01:00
hsiegeln
7f58bca0e6 chore: update IngestionService TODO comments for attributes wiring
All checks were successful
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:59:17 +01:00
hsiegeln
c087e4af08 fix: add missing attributes parameter to test record constructors
Tests were not updated when attributes field was added to ExecutionRecord,
ProcessorRecord, ProcessorDoc, and ExecutionDocument records.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:58:44 +01:00
hsiegeln
387ed44989 fix: add missing attributes parameter to test record constructors
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-26 18:58:32 +01:00
hsiegeln
64b677696e feat(ui): restructure AppConfigDetailPage into 3 sections
Some checks failed
Merge Logging + Observability into unified "Settings" section with
flex-wrap badge grid including new compressSuccess toggle. Merge
Traced Processors with Taps into "Traces & Taps" section showing
capture mode and tap badges per processor. Add "Route Recording"
section with per-route toggles sourced from route catalog. All new
fields (compressSuccess, routeRecording) included in form state
and save payload.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:48:14 +01:00
hsiegeln
78813ea15f feat(ui): add taps DataTable, CRUD modal with test expression to RouteDetail
- Replace taps tab placeholder with full DataTable showing all route taps
- Add columns: attribute, processor, expression, language, target, type, enabled toggle, actions
- Add tap modal with form fields: attribute name, processor select, language, target, expression, type selector
- Implement inline enable/disable toggle per tap row
- Add ConfirmDialog for tap deletion
- Add test expression section with Recent Exchange and Custom Payload tabs
- Add save/edit/delete tap operations via application config update
- Add all supporting CSS module classes (no inline styles)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:44:36 +01:00
hsiegeln
807e191397 feat(ui): add recording toggle, active taps KPI, and taps tab to RouteDetail
- Add Toggle for route recording on/off in the route header
- Fetch application config to determine recording state and route taps
- Add Active Taps KPI card showing enabled/total tap counts
- Add Taps tab to the tabbed section with placeholder content

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:44:06 +01:00
hsiegeln
47ff122c48 feat: add Attributes column to Dashboard exchanges table
Shows up to 2 attribute badges (color="auto") per row with a +N overflow
indicator; empty rows render a muted dash. Uses CSS module classes only.
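The visible-plus-overflow split can be sketched like this (the function name is illustrative):

```typescript
// Show at most `max` attribute badges; report the rest as a "+N"
// overflow count. Undefined/empty attribute maps yield no badges.
function splitAttributes(
  attrs: Record<string, string> | undefined,
  max = 2,
): { visible: [string, string][]; overflow: number } {
  const entries = Object.entries(attrs ?? {});
  return {
    visible: entries.slice(0, max),
    overflow: Math.max(0, entries.length - max),
  };
}
```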

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-26 18:36:53 +01:00
hsiegeln
eb796f531f feat(ui): add replay modal to ExchangeDetail page
Add a Replay button in the exchange header that opens a modal allowing
users to re-send the exchange to a live agent. The modal pre-populates
headers and body from the original exchange input, provides an agent
selector filtered to live agents for the application, and supports
editable header key-value rows with add/remove.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:35:00 +01:00
hsiegeln
a3706cf7c2 feat(ui): display business attributes on ExchangeDetail page
Show route-level attributes as Badge strips in the exchange header
card, and per-processor attributes above the message IN/OUT panels.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:33:16 +01:00
hsiegeln
2b1d49c032 feat: add TapDefinition, extend ApplicationConfig, and add API hooks
- Add TapDefinition interface for tap configuration
- Extend ApplicationConfig with taps, tapVersion, routeRecording, compressSuccess
- Add useTestExpression mutation hook (manual fetch to new endpoint)
- Add useReplayExchange mutation hook (uses api client, targets single agent)
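A rough sketch of the shapes involved, pieced together from the tap fields mentioned elsewhere in this log (attribute, processor, expression, language, target, type, enabled); exact field names and types are assumptions, not the authoritative definitions:

```typescript
// Assumed shape — field names inferred from the tap table columns
// described in other commits in this log; types are illustrative.
interface TapDefinition {
  attribute: string;   // attribute name the tap populates
  processor: string;   // processor the expression is evaluated at
  expression: string;
  language: string;
  target: string;
  type: "businessObject" | "correlation" | "event" | "custom";
  enabled: boolean;
}

// New ApplicationConfig fields per the commit; value types are assumed.
interface ApplicationConfigExtras {
  taps?: TapDefinition[];
  tapVersion?: number;
  routeRecording?: Record<string, boolean>; // routeId -> recording on/off
  compressSuccess?: boolean;
}
```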

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-26 18:29:52 +01:00
hsiegeln
ae1ee38441 feat: add attributes fields to schema.d.ts types
Add optional `attributes?: Record<string, string>` to ExecutionSummary,
ExecutionDetail, and ProcessorNode in the manually-maintained OpenAPI
schema to reflect the new backend attributes support.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-26 18:29:47 +01:00
hsiegeln
d6d96aad07 feat: add TEST_EXPRESSION command with request-reply infrastructure
Adds CompletableFuture-based request-reply mechanism for commands that
need synchronous results. CommandReply record in core, pendingReplies
map in AgentRegistryService, test-expression endpoint on config controller
with 5s timeout. CommandAckRequest extended with optional data field.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:27:59 +01:00
hsiegeln
2d6cc4c634 feat(search): deserialize and surface attributes in detail service and OpenSearch indexing (Task 4)
DetailService deserializes attributes JSON from ExecutionRecord/ProcessorRecord and
passes them to ExecutionDetail and ProcessorNode constructors. ExecutionDocument and
ProcessorDoc carry attributes as a JSON string. SearchIndexer passes attributes when
building documents. OpenSearchIndex includes attributes in indexed maps and
deserializes them when constructing ExecutionSummary from search hits.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-26 18:23:47 +01:00
hsiegeln
ca5250c134 feat(ingestion): wire attributes through ingestion pipeline into PostgreSQL (Task 3)
IngestionService passes attributes (currently null, pending cameleer3-common update)
to ExecutionRecord and ProcessorRecord. PostgresExecutionStore includes the
attributes column in INSERT and ON CONFLICT UPDATE (with COALESCE), and reads
it back in both row mappers.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-26 18:23:38 +01:00
hsiegeln
64f797bd96 feat(core): add attributes field to storage records and detail/summary models (Task 2)
Adds Map<String,String> attributes to ExecutionRecord, ProcessorRecord,
ExecutionDetail, ProcessorNode, and ExecutionSummary. ExecutionStore records
carry attributes as a JSON string; detail/summary models carry deserialized maps.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-26 18:23:32 +01:00
hsiegeln
f08461cf35 feat(db): add attributes JSONB columns to executions and processor_executions (Task 1)
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-26 18:23:26 +01:00
hsiegeln
2b5d803a60 docs: add implementation plan for taps, attributes, replay UI features
14-task plan covering: database migration, attributes pipeline, test-expression
command with request-reply, OpenAPI regeneration, frontend types/hooks,
ExchangeDetail attributes + replay modal, Dashboard attributes column,
RouteDetail recording toggle + taps tab + tap CRUD modal, and
AppConfigDetailPage restructure.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 18:13:58 +01:00
hsiegeln
e3902cd85f docs: add UI design spec for taps, attributes, replay, recording & compression
Covers all 5 new agent features: tap management on RouteDetail, business
attributes display on ExchangeDetail/Dashboard, enhanced replay with
editable payload, per-route recording toggles, and success compression.
Includes backend prerequisites, RBAC matrix, and TypeScript interfaces.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 17:48:20 +01:00
hsiegeln
25ca8d5132 feat: show log indices on OpenSearch admin page
All checks were successful
Add prefix query parameter to /admin/opensearch/indices endpoint so
the UI can fetch execution and log indices separately. OpenSearch admin
page now shows two card sections: Execution Indices and Log Indices,
each with doc count and size summary. Page restyled with CSS module
replacing inline styles. Delete endpoint also allows log index deletion.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 16:47:44 +01:00
hsiegeln
0d94132c98 feat: SOC2 audit log completeness — hybrid interceptor + explicit calls
All checks were successful
Add AuditInterceptor as a safety net that auto-audits any POST/PUT/DELETE
without an explicit audit call (excludes data ingestion + heartbeat).
AuditService sets a request attribute so the interceptor skips when
explicit logging already happened.

New explicit audit calls:
- ApplicationConfigController: view/update app config
- AgentCommandController: send/broadcast commands (AGENT category)
- AgentRegistrationController: agent register + token refresh
- UiAuthController: UI token refresh
- OidcAuthController: OIDC callback failure
- AuditLogController: view audit log (sensitive read)
- UserAdminController: view users (sensitive read)
- OidcConfigAdminController: view OIDC config (sensitive read)

New AuditCategory.AGENT added. Frontend audit log filter updated.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 16:41:10 +01:00
hsiegeln
0e6de69cd9 feat: add App Config detail page with view/edit mode
All checks were successful
Click a row in the admin App Config table to navigate to a dedicated
detail page at /admin/appconfig/:appId. Shows all config fields as
badges in view mode; pencil toggles to edit mode with dropdowns.

Traced processors are now editable (capture mode dropdown + remove
button per processor). Sections and header use card styling for
visual contrast. OidcConfigPage gets the same card treatment.

List page simplified to read-only badge overview with row click
navigation.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 16:15:27 +01:00
hsiegeln
e53274bcb9 fix: LogViewer and EventFeed scroll to top on load
All checks were successful
Update design system to v0.1.13 where both components scroll to the
top (newest entries) instead of the bottom, matching the descending
sort order used across the UI.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 15:54:56 +01:00
hsiegeln
4433b26bf8 fix: move pencil/save buttons to start of config bar for consistency
All checks were successful
Pencil icon and Save/Cancel buttons now appear at the left side of
the AgentHealth config bar, matching the admin overview table where
the edit column is at the start of each row.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 15:38:36 +01:00
hsiegeln
74fa08f41f fix: visible Save/Cancel buttons on AgentHealth config edit mode
All checks were successful
Replace subtle Unicode checkmark/X with proper labeled buttons styled
as primary (Save) and secondary (Cancel) for better visibility.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 13:20:11 +01:00
hsiegeln
4b66d78cf4 refactor: config settings shown as badges with pencil-to-edit
All checks were successful
Settings (log level, engine level, payload capture, metrics) now
display as color-coded badges by default. Clicking the pencil icon
enters edit mode where badges become dropdowns. Save (checkmark)
persists changes and reverts to badge view; cancel discards changes.

Applied consistently on both the admin App Config page and the
AgentHealth config bar.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 13:12:56 +01:00
hsiegeln
b1c2950b1e fix: add id field to AppConfigPage DataTable rows
All checks were successful
CI / cleanup-branch (push) Has been skipped
CI / build (push) Successful in 2m51s
CI / docker (push) Successful in 1m9s
CI / deploy-feature (push) Has been skipped
CI / deploy (push) Successful in 35s
DataTable requires rows with an { id: string } constraint. Map
ApplicationConfig to ConfigRow adding id from the application field.
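A minimal TypeScript sketch of the row mapping described above. Type names come from the commit message; any fields on ApplicationConfig beyond `application`, and the helper name `toConfigRows`, are illustrative assumptions:

```typescript
// DataTable constrains rows to extend { id: string }, so derive id from
// the application name when mapping configs to table rows.
// (ApplicationConfig fields besides `application` are assumed here.)
interface ApplicationConfig {
  application: string;
  logLevel?: string;
}

interface ConfigRow extends ApplicationConfig {
  id: string;
}

function toConfigRows(configs: ApplicationConfig[]): ConfigRow[] {
  return configs.map((c) => ({ ...c, id: c.application }));
}
```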

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 12:55:19 +01:00
hsiegeln
b0484459a2 feat: add application config overview and inline editing
Some checks failed
CI / cleanup-branch (push) Has been skipped
CI / build (push) Failing after 22s
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Add admin page at /admin/appconfig with a DataTable showing all
application configurations. Inline dropdowns allow editing log level,
engine level, payload capture mode, and metrics toggle directly from
the table. Changes push to agents via SSE immediately.

Also adds a config bar on the AgentHealth page (/agents/:appId) for
per-application config management with the same 4 settings.

Backend: GET /api/v1/config list endpoint, findAll() on repository,
sensible defaults for logForwardingLevel/engineLevel/payloadCaptureMode.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 12:51:07 +01:00
hsiegeln
056a6f0ff5 feat: sidebar exchange counts respect selected time range
All checks were successful
CI / cleanup-branch (push) Has been skipped
CI / build (push) Successful in 2m47s
CI / docker (push) Successful in 48s
CI / deploy-feature (push) Has been skipped
CI / deploy (push) Successful in 36s
The /routes/catalog endpoint now accepts optional from/to query
parameters instead of hardcoding a 24h window. The UI passes the
global filter time range so sidebar counts match what the user sees.
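A sketch of how such optional range parameters might be appended on the client side. The helper name `buildCatalogUrl` and its signature are illustrative, not the actual code:

```typescript
// Append optional from/to query params; omit them entirely when absent so
// the backend can fall back to its default window.
function buildCatalogUrl(
  base: string,
  range?: { from?: string; to?: string },
): string {
  const params = new URLSearchParams();
  if (range?.from) params.set("from", range.from);
  if (range?.to) params.set("to", range.to);
  const qs = params.toString();
  return qs ? `${base}?${qs}` : base;
}
```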

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 12:21:10 +01:00
hsiegeln
f4bf38fcba feat: add inspect column to agent instance data table
All checks were successful
CI / build (push) Successful in 58s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 58s
CI / deploy (push) Successful in 35s
CI / deploy-feature (push) Has been skipped
Add a dedicated inspect button column (↗) to navigate to the agent
instance page, consistent with the exchange inspect pattern on the
Dashboard. Row click still opens the detail slide-in panel.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 12:04:06 +01:00
hsiegeln
15632a2170 fix: show full exchange ID in breadcrumb
All checks were successful
CI / build (push) Successful in 53s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 47s
CI / deploy (push) Successful in 35s
CI / deploy-feature (push) Has been skipped
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 11:49:41 +01:00
hsiegeln
479b67cd2d refactor: consolidate breadcrumbs to single TopBar instance
All checks were successful
CI / build (push) Successful in 1m1s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m11s
CI / deploy (push) Successful in 35s
CI / deploy-feature (push) Has been skipped
Remove duplicate in-page breadcrumbs (ExchangeDetail, AgentHealth scope
trail) and improve the global TopBar breadcrumb with semantic labels and
a context-based override for pages with richer navigation data.

- Add BreadcrumbProvider from design system v0.1.12
- LayoutShell: label map prettifies URL segments (apps→Applications, etc.)
- ExchangeDetail: uses useBreadcrumb() to set semantic trail via context
- AgentHealth: remove scope trail, keep live-count badge standalone

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 11:40:37 +01:00
hsiegeln
bde0459416 fix: prevent log viewer flicker on ExchangeDetail page
All checks were successful
CI / build (push) Successful in 1m0s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m12s
CI / deploy (push) Successful in 35s
CI / deploy-feature (push) Has been skipped
Skip global time range in the logs query key when filtering by
exchangeId (exchange logs are historical, the sliding time window is
irrelevant). Add placeholderData to keep previous results visible
during query key transitions on other pages.
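The query-key logic can be sketched as follows (the filter shape and function name are illustrative): when an exchangeId filter is present, the logs are historical, so the sliding time window is deliberately excluded from the key to avoid needless refetches.

```typescript
interface LogFilters {
  exchangeId?: string;
  application?: string;
  from?: string;
  to?: string;
}

function logsQueryKey(f: LogFilters): unknown[] {
  if (f.exchangeId) {
    // Exchange logs don't move with "now": leave the time range out of
    // the key so a shifting window can't invalidate the query.
    return ["logs", { exchangeId: f.exchangeId, application: f.application }];
  }
  return ["logs", { application: f.application, from: f.from, to: f.to }];
}
```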

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 11:03:38 +01:00
hsiegeln
a01712e68c fix: use .keyword suffix on both exchangeId term queries
All checks were successful
CI / build (push) Successful in 1m1s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 41s
CI / deploy (push) Successful in 36s
CI / deploy-feature (push) Has been skipped
Defensive: use .keyword on the top-level exchangeId field too, in
case indices were created before the explicit keyword mapping was
added to the template.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 10:45:59 +01:00
hsiegeln
9aa78f681d fix: use .keyword suffix for MDC exchangeId term query
Some checks failed
CI / docker (push) Has been cancelled
CI / deploy (push) Has been cancelled
CI / deploy-feature (push) Has been cancelled
CI / cleanup-branch (push) Has been cancelled
CI / build (push) Has been cancelled
Dynamically mapped string fields in OpenSearch are multi-field
(text + keyword). Term queries require the .keyword sub-field for
exact matching.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 10:45:14 +01:00
hsiegeln
befefe457f fix: query both top-level and MDC exchangeId for log search
All checks were successful
CI / build (push) Successful in 1m1s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 49s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
Existing log records only have exchangeId inside the mdc object, not
as a top-level indexed field. Use a bool should clause to match on
either exchangeId (new records) or mdc.camel.exchangeId (old records).
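Combined with the `.keyword` fixes in the commits above, the clause ends up roughly as below. Field names are taken from the commit messages; the builder function itself is an illustrative sketch (the real implementation is Java):

```typescript
// Match new records on the top-level field and old records on the MDC
// sub-field; either term is sufficient.
function exchangeIdQuery(exchangeId: string) {
  return {
    bool: {
      should: [
        { term: { "exchangeId.keyword": exchangeId } },
        { term: { "mdc.camel.exchangeId.keyword": exchangeId } },
      ],
      minimum_should_match: 1,
    },
  };
}
```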

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 10:40:42 +01:00
hsiegeln
ea665ff411 feat: exchange-level log viewer on ExchangeDetail page
All checks were successful
CI / build (push) Successful in 1m0s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 49s
CI / deploy (push) Successful in 37s
CI / deploy-feature (push) Has been skipped
Index exchangeId from Camel MDC (camel.exchangeId) as a top-level
keyword field in OpenSearch log indices. Add exchangeId filter to
the log query API and frontend hook. Show a LogViewer on the
ExchangeDetail page filtered to that exchange's logs, with search
input and level filter pills.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 10:26:30 +01:00
hsiegeln
f9bd492191 chore: update design system to v0.1.11 (live time range fix)
All checks were successful
CI / build (push) Successful in 56s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m9s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
The GlobalFilterProvider now recomputes the preset time range every
10s when auto-refresh is on, so timeRange.end stays fresh instead of
being frozen at the moment the preset was clicked.
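The recomputation boils down to re-evaluating the preset against the current clock. A minimal sketch, with an illustrative function name and a preset expressed in minutes:

```typescript
// A preset like "last 15 minutes" is recomputed from `now` on each tick,
// so the end of the range stays fresh instead of freezing at click time.
function presetRange(
  minutes: number,
  now: Date = new Date(),
): { start: string; end: string } {
  return {
    start: new Date(now.getTime() - minutes * 60_000).toISOString(),
    end: now.toISOString(),
  };
}
```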

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-26 09:57:43 +01:00
hsiegeln
1be303b801 feat: add application log panel to agent health page
All checks were successful
CI / build (push) Successful in 55s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 48s
CI / deploy (push) Successful in 37s
CI / deploy-feature (push) Has been skipped
Add the same log + timeline side-by-side layout from AgentInstance to
the AgentHealth page (/agents/{appId}). Includes search input, level
filter pills, sort toggle, and refresh button — matching the instance
page design exactly.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:54:07 +01:00
hsiegeln
d57249906a fix: refresh buttons use "now" as to-date for queries
All checks were successful
CI / build (push) Successful in 56s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 47s
CI / deploy (push) Successful in 41s
CI / deploy-feature (push) Has been skipped
Instead of calling refetch() with stale time params, the refresh
buttons now set a toOverride state to new Date().toISOString(). This
flows into the query key, triggering a fresh fetch with the current
time as the upper bound. Both useApplicationLogs and useAgentEvents
hooks accept an optional toOverride parameter.
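The core of the pattern is that the effective upper bound is either the override set by the refresh button or the global filter's end. A sketch with assumed names (`TimeRange`, `effectiveRange`):

```typescript
interface TimeRange {
  from: string;
  to: string;
}

// The refresh handler would do setToOverride(new Date().toISOString());
// since the override flows into the query key, changing it forces a
// fresh fetch with "now" as the upper bound.
function effectiveRange(global: TimeRange, toOverride?: string): TimeRange {
  return toOverride ? { ...global, to: toOverride } : global;
}
```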

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:41:00 +01:00
hsiegeln
6a24dd01e9 fix: add exchange body fields to schema.d.ts for CI tsc check
All checks were successful
CI / cleanup-branch (push) Has been skipped
CI / build (push) Successful in 54s
CI / docker (push) Successful in 9s
CI / deploy (push) Successful in 19s
CI / deploy-feature (push) Has been skipped
The CI build runs tsc --noEmit which failed because the ExecutionDetail
type in schema.d.ts was missing the new inputBody/outputBody/inputHeaders/
outputHeaders fields added to the backend DTO.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:06:26 +01:00
hsiegeln
e10f021c54 use self-hosted image for build
Some checks failed
CI / build (push) Failing after 26s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
2026-03-25 22:03:19 +01:00
hsiegeln
b3c5e87230 fix: expose exchange body in API, fix RouteFlow index mapping
Some checks failed
CI / build (push) Failing after 25s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Add inputBody/outputBody/inputHeaders/outputHeaders to ExecutionDetail
DTO so exchange-level bodies are returned by the detail endpoint. Show
"Exchange Input" and "Exchange Output" panels on the detail page when
the data is available.

Fix RouteFlow node click selecting the wrong processor snapshot by
building a flowToTreeIndex mapping that correctly translates flow
display index → diagram node index → processorId → processor tree
index. Previously the diagram node index was used directly as the
processor tree index, which broke when the two orderings differed.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 22:02:26 +01:00
hsiegeln
9b63443842 feat: add sort toggle and refresh buttons to log/timeline panels
All checks were successful
CI / build (push) Successful in 55s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 50s
CI / deploy (push) Successful in 42s
CI / deploy-feature (push) Has been skipped
Remove auto-scroll override hack. Add sort order toggle (asc/desc
by time) and manual refresh button to both the application log and
agent events timeline panels on AgentInstance and AgentHealth pages.
Default is descending (newest first); toggling reverses the array.
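The toggle behavior can be sketched as a pure function (name illustrative): entries arrive newest-first, and switching to ascending returns a reversed copy without mutating the source array.

```typescript
// entriesDesc is assumed to be in descending (newest-first) order as
// delivered by the API; "asc" reverses a copy, leaving the input intact.
function applySortOrder<T>(entriesDesc: T[], order: "asc" | "desc"): T[] {
  return order === "desc" ? entriesDesc : [...entriesDesc].reverse();
}
```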

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 19:53:33 +01:00
hsiegeln
cd30c2d9b5 fix: match log/timeline height, DESC sort with scroll-to-top
All checks were successful
CI / build (push) Successful in 55s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 52s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
Give logCard the same max-height and flex layout as timelineCard so
both columns are equal height. Revert .toReversed() so events stay
in DESC order (newest at top). Override EventFeed's auto-scroll-to-
bottom with a requestAnimationFrame that resets scrollTop to 0 after
mount, keeping newest entries visible at the top of both panels.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 19:12:08 +01:00
hsiegeln
b612941aae feat: wire up application logs from OpenSearch, fix event autoscroll
All checks were successful
CI / build (push) Successful in 55s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 51s
CI / deploy (push) Successful in 37s
CI / deploy-feature (push) Has been skipped
Add GET /api/v1/logs endpoint to query application logs stored in
OpenSearch with filters for application, agent, level, time range,
and text search. Wire up the AgentInstance LogViewer with real data
and an EventFeed-style toolbar (search input + level filter pills).

Fix agent events timeline autoscroll by reversing the DESC-ordered
events so newest entries appear at the bottom where EventFeed
autoscrolls to.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 18:56:13 +01:00
hsiegeln
20ee448f4e fix: OpenSearch status field mismatch, adopt RouteFlow flows prop
All checks were successful
CI / build (push) Successful in 56s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m43s
CI / deploy (push) Successful in 38s
CI / deploy-feature (push) Has been skipped
Fix admin OpenSearch page always showing "Disconnected" by aligning
frontend field names (reachable/nodeCount/host) with backend DTO.

Update design system to v0.1.10 and adopt the new multi-flow RouteFlow
API — error-handler nodes now render as labeled segments with error
variant instead of relying on legacy auto-separation.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 18:34:58 +01:00
hsiegeln
2bbca8ae38 fix: force SNAPSHOT update in Docker build (-U flag)
All checks were successful
CI / build (push) Successful in 55s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 40s
CI / deploy (push) Successful in 38s
CI / deploy-feature (push) Has been skipped
Same issue as the CI build — Docker layer cache can serve a stale
cameleer3-common SNAPSHOT.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 13:36:07 +01:00
hsiegeln
fea50b51ae fix: force SNAPSHOT update in CI build (-U flag)
Some checks failed
CI / build (push) Successful in 55s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Failing after 23s
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Maven cache can serve stale cameleer3-common SNAPSHOTs. The -U flag
forces Maven to check the remote registry for updated SNAPSHOTs on
every build.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 13:33:59 +01:00
79d37118e0 chore: use pre-baked build images from cameleer-build-images
Some checks failed
CI / build (push) Failing after 40s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Replace maven:3.9-eclipse-temurin-17 with cameleer-build:1 (includes
Node.js 22, curl, jq). Replace docker:27 with cameleer-docker-builder:1
(includes git, curl, jq). Removes per-build tool installation steps.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 13:26:11 +01:00
hsiegeln
7fd55ea8ba fix: remove core LogIndexService to fix CI snapshot resolution
Some checks failed
CI / build (push) Failing after 1m11s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
LogIndexService in server-core imported LogEntry from cameleer3-common,
but the SNAPSHOT on the registry may not have it yet when the server CI
runs. Moved the dependency to server-app where both the controller and
OpenSearch implementation live.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 13:11:11 +01:00
hsiegeln
c96fbef5d5 ci: retry after cameleer3-common publish
Some checks failed
CI / build (push) Failing after 50s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 13:05:23 +01:00
hsiegeln
7423e2ca14 feat: add application log ingestion with OpenSearch storage
Some checks failed
CI / cleanup-branch (push) Has been skipped
CI / build (push) Failing after 59s
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Agents can now send application log entries in batches via POST /api/v1/data/logs.
Logs are indexed directly into OpenSearch daily indices (logs-{yyyy-MM-dd}) using
the bulk API. Index template defines explicit mappings for full-text search readiness.

New DTOs (LogEntry, LogBatch) added to cameleer3-common in the agent repo.
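The daily index naming can be sketched as below. The pattern `logs-{yyyy-MM-dd}` is from the commit message; the helper name and the choice of UTC are illustrative (the real implementation is Java):

```typescript
// Route each log entry to the daily index for its timestamp,
// e.g. logs-2026-03-25.
function dailyLogIndex(timestamp: Date): string {
  const y = timestamp.getUTCFullYear();
  const m = String(timestamp.getUTCMonth() + 1).padStart(2, "0");
  const d = String(timestamp.getUTCDate()).padStart(2, "0");
  return `logs-${y}-${m}-${d}`;
}
```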

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 11:53:27 +01:00
hsiegeln
bf600f8c5f fix: read version and updated_at from SQL columns in config repository
All checks were successful
CI / build (push) Successful in 12m13s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 44s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
The findByApplication query only read config_val JSONB, ignoring the
version and updated_at SQL columns. The JSON blob contained version 0
from the original save, so agents saw no config and fell back to defaults.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 10:22:13 +01:00
hsiegeln
996ea65293 feat: LIVE/PAUSED toggle controls data fetching on sidebar navigation
All checks were successful
CI / build (push) Successful in 1m13s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 55s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
LIVE: sidebar clicks trigger initial fetch + polling for the new route.
PAUSED: sidebar clicks navigate but queries are disabled — no fetches
until the user switches back to LIVE.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 10:01:14 +01:00
hsiegeln
9866dd5f23 fix: move design system dev install after COPY to bust Docker cache
All checks were successful
CI / build (push) Successful in 1m23s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m12s
CI / deploy (push) Successful in 38s
CI / deploy-feature (push) Has been skipped
The npm install @cameleer/design-system@dev was in the same cached layer
as npm ci, so Docker never re-ran it when the registry had a new version.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 09:37:51 +01:00
hsiegeln
d9c8816647 feat: add OpenSearch highlight snippets to search results
All checks were successful
CI / build (push) Successful in 1m23s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 54s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
- Add highlight field to ExecutionSummary record
- Request highlight fragments from OpenSearch when full-text search is active
- Pass matchContext to command palette for display

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 09:29:07 +01:00
hsiegeln
b32c97c02b feat: fix Cmd-K shortcut and add exchange full-text search to command palette
All checks were successful
CI / build (push) Successful in 1m43s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m17s
CI / deploy (push) Successful in 40s
CI / deploy-feature (push) Has been skipped
- Add missing onOpen prop to CommandPalette (fixes Ctrl+K/Cmd+K)
- Wire server-side exchange search with debounced text query
- Use design system dev snapshot from Gitea registry in CI builds

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 08:57:24 +01:00
hsiegeln
552f02d25c fix: add JWT auth to application config API calls
All checks were successful
CI / build (push) Successful in 1m42s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 57s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
Raw fetch() had no auth headers, causing 401s that silently broke the tracing toggle.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 08:19:44 +01:00
hsiegeln
9f9968abab chore: upgrade cameleer3-common to 1.0-SNAPSHOT and enable snapshot resolution
All checks were successful
CI / build (push) Successful in 1m44s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 3m27s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 08:04:29 +01:00
hsiegeln
69a3eb192f feat: persistent per-application config with GET/PUT endpoints
Some checks failed
CI / build (push) Failing after 1m10s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Add application_config table (V4 migration), repository, and REST
controller. GET /api/v1/config/{app} returns config, PUT saves and
pushes CONFIG_UPDATE to all LIVE agents via SSE. UI tracing toggle
now uses config API instead of direct SET_TRACED_PROCESSORS command.
Tracing store syncs with server config on load.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-25 07:42:55 +01:00
hsiegeln
488a32f319 feat: show tracing badges on processor nodes
All checks were successful
CI / build (push) Successful in 1m18s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m12s
CI / deploy (push) Successful in 40s
CI / deploy-feature (push) Has been skipped
Update design system to 0.1.8 and pass NodeBadge[] to both
ProcessorTimeline and RouteFlow. Traced processors display a
blue "TRACED" badge that updates reactively via Zustand store.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 23:10:37 +01:00
hsiegeln
bf57fd139b fix: show tracing action on all Flow view nodes
All checks were successful
CI / build (push) Successful in 1m26s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 53s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
Use diagram node ID as fallback processorId when no processor
execution match exists (e.g. error handlers that didn't trigger).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 22:46:52 +01:00
hsiegeln
581d53a33e fix: match SET_TRACED_PROCESSORS payload to agent protocol
Some checks failed
CI / build (push) Successful in 1m28s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 55s
CI / deploy-feature (push) Has been cancelled
CI / deploy (push) Has been cancelled
Payload now sends {processors: {id: "BOTH"}} map instead of
{routeId, processorIds[]} array. Tracing state keyed by application
name (global, not per-route) matching agent behavior.
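The corrected payload shape from the commit message, as a sketch (the builder name is illustrative; the `"BOTH"` trace mode is taken from the message):

```typescript
// Build {processors: {id: "BOTH"}} — a map keyed by processor id —
// instead of the old {routeId, processorIds[]} array shape.
function buildTracePayload(
  tracedIds: string[],
): { processors: Record<string, "BOTH"> } {
  const processors: Record<string, "BOTH"> = {};
  for (const id of tracedIds) processors[id] = "BOTH";
  return { processors };
}
```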

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 22:43:55 +01:00
hsiegeln
f4dd2b3415 feat: add processor tracing toggle to exchange detail views
All checks were successful
CI / build (push) Successful in 1m22s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 52s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
Wire getActions on ProcessorTimeline and RouteFlow to send
SET_TRACED_PROCESSORS commands to all agents of the same application.
Tracing state managed via Zustand store with optimistic UI and rollback.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 22:30:26 +01:00
hsiegeln
7532cc9d59 chore: update @cameleer/design-system to 0.1.7
All checks were successful
CI / build (push) Successful in 1m14s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m8s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 21:59:40 +01:00
hsiegeln
e7590d72fd fix: restore Swagger UI on api-docs page
All checks were successful
CI / build (push) Successful in 1m23s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 50s
CI / deploy (push) Successful in 38s
CI / deploy-feature (push) Has been skipped
- Change Vite proxy pattern from /api to /api/ so /api-docs client
  route is not captured and proxied to the backend
- Fix SwaggerUIBundle init: remove empty presets/layout overrides that
  crashed the internal persistConfigs function
- Use correct CSS import (swagger-ui.css instead of index.css)
- Add requestInterceptor to auto-attach JWT token to Try-it-out calls
- Add swagger-ui-bundle to optimizeDeps.include for reliable loading
- Remove unused swagger-ui-dist.d.ts type declarations

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 20:53:48 +01:00
hsiegeln
57ce1db248 add metrics ingestion diagnostics and upgrade cameleer3-common to 0.0.3
All checks were successful
CI / build (push) Successful in 1m34s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 3m20s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
- Add logging to MetricsController: warn on parse failures, debug on
  received metrics, buffer depth on 503
- Add GET /api/v1/admin/database/metrics-pipeline diagnostic endpoint
  (buffer depth, row count, distinct agents/metrics, latest timestamp)
- Fix BackpressureIT test JSON to match actual MetricsSnapshot schema
  (collectedAt/metricName/metricValue instead of timestamp/metrics)
- Upgrade cameleer3-common from 1.0-SNAPSHOT to 0.0.3 (adds engineLevel)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 20:23:26 +01:00
hsiegeln
c97d730a00 fix: show N/A for agent heap/CPU when no JVM metrics available
All checks were successful
CI / build (push) Successful in 1m22s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 55s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
Indeterminate progress bars were misleading when agents don't report
JVM metrics — replaced with plain "N/A" text.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 19:46:58 +01:00
hsiegeln
581c4f9ad9 fix: restore registry URL in package-lock.json for CI
All checks were successful
CI / build (push) Successful in 1m16s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m12s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
The lock file had "resolved": "../../design-system" from a local
install, causing npm ci in CI to silently skip the package.
Reinstalled from registry to fix the resolved URL.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 19:15:44 +01:00
hsiegeln
ef6bc4be21 fix: add npm registry auth token for UI build in CI
Some checks failed
CI / build (push) Failing after 39s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
The Build UI step ran npm ci without authenticating to the Gitea npm
registry, causing @cameleer/design-system to fail to resolve. Add
REGISTRY_TOKEN to .npmrc before npm ci.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 19:12:35 +01:00
hsiegeln
8534bb8839 chore: upgrade @cameleer/design-system to v0.1.6
Some checks failed
CI / build (push) Failing after 39s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 19:07:13 +01:00
hsiegeln
a5bc7cf6d1 fix: use self-portaling DetailPanel from design system v0.1.5
Some checks failed
CI / build (push) Failing after 57s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
DetailPanel now portals itself to #cameleer-detail-panel-root (a div
AppShell places as a sibling of .main in the top-level flex row).
Pages just render <DetailPanel> inline — no manual createPortal,
no context, no prop drilling.

Remove the old #detail-panel-portal div from LayoutShell and the
createPortal wrappers from Dashboard and AgentHealth.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 19:00:02 +01:00
hsiegeln
5d2eff4f73 fix: normalize null fields from unconfigured OIDC response
All checks were successful
CI / build (push) Successful in 1m16s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 53s
CI / deploy (push) Successful in 40s
CI / deploy-feature (push) Has been skipped
When no OIDC config exists, the backend returns an object with all
null fields (via OidcAdminConfigResponse.unconfigured()). Normalize
all null values to sensible defaults when loading the form instead
of passing nulls through to Input components and .map() calls.
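A minimal sketch of the normalization, assuming a couple of representative fields (the actual OIDC config has more; `issuerUrl` and `clientId` are assumptions here, `defaultRoles` is from the commit above):

```typescript
interface OidcForm {
  issuerUrl: string;
  clientId: string;
  defaultRoles: string[];
}

// Replace every null from the unconfigured response with a safe default
// so Input components and .map() calls never see null.
function normalizeOidcConfig(raw: {
  issuerUrl?: string | null;
  clientId?: string | null;
  defaultRoles?: string[] | null;
}): OidcForm {
  return {
    issuerUrl: raw.issuerUrl ?? "",
    clientId: raw.clientId ?? "",
    defaultRoles: raw.defaultRoles ?? [],
  };
}
```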

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 18:44:02 +01:00
hsiegeln
9a4a4dc1af fix: handle null defaultRoles in OIDC config page
Some checks failed
CI / build (push) Has been cancelled
CI / docker (push) Has been cancelled
CI / deploy (push) Has been cancelled
CI / deploy-feature (push) Has been cancelled
CI / cleanup-branch (push) Has been cancelled
The API returns defaultRoles as null when no roles are configured.
Add null guards on all defaultRoles accesses to prevent .map() crash.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 18:41:59 +01:00
hsiegeln
f3241e904f fix: use createPortal for DetailPanel instead of context+useEffect
Some checks failed
CI / build (push) Successful in 1m21s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 53s
CI / deploy-feature (push) Has been cancelled
CI / deploy (push) Has been cancelled
The previous approach used useEffect+context to hoist DetailPanel
content to the AppShell level, but the dependency-free useEffect
caused a re-render loop that broke sidebar navigation.

Replace with createPortal: pages render DetailPanel inline in their
JSX but portal it to a target div (#detail-panel-portal) at the
AppShell level. No state lifting, no re-render loops.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 18:38:59 +01:00
hsiegeln
5de792744e fix: hoist DetailPanel into AppShell detail slot for proper slide-in
All checks were successful
CI / build (push) Successful in 1m22s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 51s
CI / deploy (push) Successful in 38s
CI / deploy-feature (push) Has been skipped
DetailPanel is a flex sibling that slides in from the right — it must
be rendered at the AppShell level via the detail prop, not inside the
page content. Add DetailPanelContext so pages can push their panel
content up to LayoutShell, which passes it to AppShell.detail.

Applied to Dashboard (exchange detail) and AgentHealth (instance detail).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 18:28:03 +01:00
hsiegeln
0a5f4a03b5 chore: upgrade @cameleer/design-system to v0.1.4
All checks were successful
CI / build (push) Successful in 1m13s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m11s
CI / deploy (push) Successful in 37s
CI / deploy-feature (push) Has been skipped
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 18:18:20 +01:00
hsiegeln
4ac11551c9 feat: add auto-refresh toggle wired to all polling queries
Some checks failed
CI / build (push) Failing after 51s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Upgrade @cameleer/design-system to ^0.1.3 which adds LIVE/PAUSED
toggle to TopBar backed by autoRefresh state in GlobalFilterProvider.

Add useRefreshInterval() hook that returns the polling interval when
auto-refresh is on, or false when paused. Wire it into all query
hooks that use refetchInterval (executions, catalog, agents, metrics,
admin database/opensearch).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 18:10:32 +01:00
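The decision inside the hook can be sketched as pure logic (names assumed from the message; react-query treats `refetchInterval: false` as "do not poll"):

```typescript
// Sketch of the value selection behind useRefreshInterval():
// react-query's refetchInterval accepts a number of milliseconds or
// false (polling disabled), so the hook collapses the global
// autoRefresh flag and the per-query interval into a single value.
function refreshInterval(autoRefresh: boolean, intervalMs: number): number | false {
  return autoRefresh ? intervalMs : false;
}
```

Each query hook would then pass the hook's result as `refetchInterval` instead of a hard-coded number.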
hsiegeln
6fea5f2c5b fix: use .keyword suffix for text field sorting in OpenSearch
All checks were successful
CI / build (push) Successful in 1m22s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 44s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
OpenSearch dynamically maps string fields as text with a .keyword
subfield. Sorting on text fields throws an error; only .keyword,
date, and numeric fields support sorting. Add .keyword suffix to
all string sort columns (status, routeId, agentId, executionId,
correlationId, applicationName) while keeping start_time and
duration_ms as-is.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 17:56:18 +01:00
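The mapping described above can be sketched as follows (field lists taken from the commit message; TypeScript for brevity, although the real builder is server-side Java):

```typescript
// OpenSearch dynamically maps strings as text with a .keyword subfield;
// sorting must target that subfield, while date/numeric fields sort as-is.
const TEXT_SORT_FIELDS = new Set([
  "status", "routeId", "agentId",
  "executionId", "correlationId", "applicationName",
]);

function toSortField(field: string): string {
  return TEXT_SORT_FIELDS.has(field) ? `${field}.keyword` : field;
}
```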
hsiegeln
b7cac68ee1 fix: filter exchanges by application and restore snake_case sort columns
All checks were successful
CI / build (push) Successful in 1m23s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 41s
CI / deploy (push) Successful in 39s
CI / deploy-feature (push) Has been skipped
Add application_name filter to OpenSearch query builder — sidebar
app selection now correctly filters the exchange list. The
application field was being resolved to agentIds in the controller
but never applied as a query filter in OpenSearch.

Also restore snake_case sort column mapping since the OpenSearch
toMap() serializer uses snake_case field names (start_time, route_id,
etc.), not camelCase.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 17:41:07 +01:00
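The missing filter amounts to one `term` clause in the bool query. A minimal sketch of the query shape (TypeScript for brevity; the real builder is Java server code):

```typescript
// Sketch of the exchange query body with the application filter applied.
// The snake_case field name matches the toMap() serializer.
type Json = Record<string, unknown>;

function buildExchangeQuery(applicationName?: string): Json {
  const filter: Json[] = [];
  if (applicationName) {
    filter.push({ term: { application_name: applicationName } });
  }
  return { query: { bool: { filter } } };
}
```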
hsiegeln
cdbe330c47 fix: support all sortable columns and use camelCase for OpenSearch
All checks were successful
CI / build (push) Successful in 1m24s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 45s
CI / deploy (push) Successful in 37s
CI / deploy-feature (push) Has been skipped
Add executionId and applicationName to allowed sort fields. Fix sort
column mapping to use camelCase field names matching the OpenSearch
ExecutionDocument fields instead of snake_case DB column names. This
was causing sorts on most columns to either silently fall back to
startTime or return empty results from OpenSearch.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 17:37:01 +01:00
53e9073dca fix: update ExecutionRecord constructor in stats test for new fields
All checks were successful
CI / build (push) Successful in 1m13s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 1m9s
CI / deploy (push) Successful in 38s
CI / deploy-feature (push) Has been skipped
2026-03-24 17:26:07 +01:00
b8c316727e fix: update ExecutionRecord constructor calls in tests for new fields
Some checks failed
CI / build (push) Has started running
CI / docker (push) Has been cancelled
CI / deploy (push) Has been cancelled
CI / deploy-feature (push) Has been cancelled
CI / cleanup-branch (push) Has been cancelled
2026-03-24 17:25:48 +01:00
hsiegeln
48455cd559 fix: use server-side sorting for paginated tables
Some checks failed
CI / cleanup-branch (push) Has been skipped
CI / build (push) Failing after 1m10s
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Upgrade @cameleer/design-system to v0.1.1 which adds onSortChange
callback to DataTable. Wire it up in Dashboard (exchanges), AuditLog,
and RouteDetail (recent executions) so sorting triggers a new API
request with sortField/sortDir instead of only sorting the current page.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 17:05:17 +01:00
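Server-side sorting boils down to turning the table's sort state into request parameters instead of reordering the current page in memory. A hedged sketch (parameter names follow the commit message):

```typescript
// Translate DataTable sort state into API query parameters so the
// server sorts the full result set, not just the fetched page.
interface SortState {
  field: string;
  dir: "asc" | "desc";
}

function toQueryString(page: number, sort?: SortState): string {
  const params = new URLSearchParams({ page: String(page) });
  if (sort) {
    params.set("sortField", sort.field);
    params.set("sortDir", sort.dir);
  }
  return params.toString();
}
```

The onSortChange callback would build this string and trigger a refetch.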
aa3d9f375b Merge pull request 'feat: agent protocol v2 — engine levels, enriched acks, route snapshots' (#91) from fix/agent-protocol-v2 into main
Some checks failed
CI / build (push) Failing after 1m0s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
Reviewed-on: cameleer/cameleer3-server#91
2026-03-24 16:50:09 +01:00
hsiegeln
e54d20bcb7 feat: migrate login page to design system styling
All checks were successful
CI / build (push) Successful in 1m26s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 57s
CI / deploy (push) Successful in 38s
CI / deploy-feature (push) Has been skipped
Replace inline styles with CSS module matching the design system's
LoginForm visual patterns. Uses proper DS class structure (divider,
social section, form fields) while keeping username-based auth
instead of the DS component's email validation.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:44:52 +01:00
hsiegeln
81f85aa82d feat: replace UI with design system example pages wired to real API
Some checks failed
CI / build (push) Successful in 1m18s
CI / cleanup-branch (push) Has been skipped
CI / docker (push) Successful in 55s
CI / deploy-feature (push) Has been cancelled
CI / deploy (push) Has been cancelled
Migrate all page components from the @cameleer/design-system v0.0.3
example UI, replacing mock data with real backend API hooks. This brings
richer visuals (KpiStrip, GroupCard, RouteFlow, ProcessorTimeline,
DateRangePicker, expandable rows) while preserving all existing API
integration, auth, and routing infrastructure.

Pages migrated: Dashboard, RoutesMetrics, RouteDetail, ExchangeDetail,
AgentHealth, AgentInstance, OidcConfig, AuditLog, RBAC (Users/Groups/Roles).
Also enhanced LayoutShell CommandPalette with real search data from catalog.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-24 16:42:16 +01:00
2887fe9599 feat: add V3 migration for engine_level and route-level snapshot columns
Some checks failed
CI / build (push) Failing after 51s
CI / cleanup-branch (push) Has been skipped
CI / build (pull_request) Failing after 52s
CI / cleanup-branch (pull_request) Has been skipped
CI / docker (push) Has been skipped
CI / docker (pull_request) Has been skipped
CI / deploy (push) Has been skipped
CI / deploy-feature (push) Has been skipped
CI / deploy (pull_request) Has been skipped
CI / deploy-feature (pull_request) Has been skipped
2026-03-24 16:13:11 +01:00
b1679b110c feat: add engine_level and route-level snapshot columns to PostgresExecutionStore
Some checks failed
CI / docker (push) Has been cancelled
CI / build (push) Has been cancelled
CI / deploy (push) Has been cancelled
CI / deploy-feature (push) Has been cancelled
CI / cleanup-branch (push) Has been cancelled
Add engine_level, input_body, output_body, input_headers, output_headers
to the executions INSERT/SELECT/UPSERT and row mapper. Required for
REGULAR mode where route-level payloads exist but no processor records.

Note: requires ALTER TABLE migration to add the new columns.
2026-03-24 16:12:46 +01:00
e7835e1100 feat: map engineLevel and route-level snapshots in IngestionService
Some checks failed
CI / docker (push) Has been cancelled
CI / deploy (push) Has been cancelled
CI / deploy-feature (push) Has been cancelled
CI / cleanup-branch (push) Has been cancelled
CI / build (push) Has been cancelled
Extract inputBody/outputBody/inputHeaders/outputHeaders from RouteExecution
snapshots and pass to ExecutionRecord. Maps engineLevel field. Critical for
REGULAR mode where no processor records exist but route-level payloads do.
2026-03-24 16:11:55 +01:00
ed65b87af2 feat: add engineLevel and route-level snapshot fields to ExecutionRecord
Some checks failed
CI / docker (push) Has been cancelled
CI / deploy (push) Has been cancelled
CI / deploy-feature (push) Has been cancelled
CI / cleanup-branch (push) Has been cancelled
CI / build (push) Has been cancelled
Adds engineLevel (NONE/MINIMAL/REGULAR/COMPLETE) and inputBody/outputBody/
inputHeaders/outputHeaders to ExecutionRecord so REGULAR mode route-level
payloads are persisted (previously only processor-level records had payloads).
2026-03-24 16:11:26 +01:00
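The extended record shape can be sketched as follows (TypeScript for brevity; the real ExecutionRecord is a Java class, and field names follow the commit message):

```typescript
type EngineLevel = "NONE" | "MINIMAL" | "REGULAR" | "COMPLETE";

interface ExecutionRecord {
  executionId: string;
  engineLevel: EngineLevel;
  // Route-level snapshots: previously only processor-level records
  // carried payloads, so REGULAR mode persisted nothing.
  inputBody?: string;
  outputBody?: string;
  inputHeaders?: Record<string, string>;
  outputHeaders?: Record<string, string>;
}

// In REGULAR mode there are no processor records, so payload display
// must fall back to the route-level snapshots (helper is hypothetical).
function routeLevelPayload(r: ExecutionRecord): string | undefined {
  return r.outputBody ?? r.inputBody;
}
```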
4a99e6cf6b feat: support enriched command ack with status/message + set-traced-processors command type
Some checks failed
CI / docker (push) Has been cancelled
CI / deploy (push) Has been cancelled
CI / deploy-feature (push) Has been cancelled
CI / cleanup-branch (push) Has been cancelled
CI / build (push) Has been cancelled
- Add @RequestBody(required=false) CommandAckRequest to ack endpoint for
  receiving agent command results (backward compat with old agents)
- Record command results in agent event log via AgentEventService
- Add set-traced-processors to mapCommandType switch
- Inject AgentEventService dependency
2026-03-24 16:11:04 +01:00
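Backward compatibility hinges on treating a missing request body as a plain success ack. A TypeScript sketch of that contract (the real CommandAckRequest is a Java DTO):

```typescript
// Sketch of the enriched ack: old agents POST the ack with no body at
// all, newer agents include status/message/data.
interface CommandAck {
  status?: string;   // e.g. "SUCCESS" or "FAILED"
  message?: string;
  data?: unknown;
}

// A missing body defaults to SUCCESS, so pre-v2 agents keep working.
function ackStatus(body: CommandAck | null): string {
  return body?.status ?? "SUCCESS";
}
```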
4d9a9ff851 feat: add CommandAckRequest DTO for enriched command acknowledgments
Some checks failed
CI / build (push) Has started running
CI / docker (push) Has been cancelled
CI / deploy (push) Has been cancelled
CI / deploy-feature (push) Has been cancelled
CI / cleanup-branch (push) Has been cancelled
2026-03-24 16:10:27 +01:00
292a38fe30 feat: add SET_TRACED_PROCESSORS command type for per-processor overrides
Some checks failed
CI / docker (push) Has been cancelled
CI / deploy (push) Has been cancelled
CI / deploy-feature (push) Has been cancelled
CI / cleanup-branch (push) Has been cancelled
CI / build (push) Has been cancelled
2026-03-24 16:10:21 +01:00
139 changed files with 18927 additions and 3063 deletions

View File

@@ -14,16 +14,11 @@ jobs:
runs-on: ubuntu-latest
if: github.event_name != 'delete'
container:
- image: maven:3.9-eclipse-temurin-17
+ image: gitea.siegeln.net/cameleer/cameleer-build:1
credentials:
username: cameleer
password: ${{ secrets.REGISTRY_TOKEN }}
steps:
- name: Install Node.js 22
run: |
apt-get update && apt-get install -y ca-certificates curl gnupg
mkdir -p /etc/apt/keyrings
curl -fsSL https://deb.nodesource.com/gpgkey/nodesource-repo.gpg.key | gpg --dearmor -o /etc/apt/keyrings/nodesource.gpg
echo "deb [signed-by=/etc/apt/keyrings/nodesource.gpg] https://deb.nodesource.com/node_22.x nodistro main" > /etc/apt/sources.list.d/nodesource.list
apt-get update && apt-get install -y nodejs
- uses: actions/checkout@v4
- name: Configure Gitea Maven Registry
@@ -53,22 +48,27 @@ jobs:
- name: Build UI
working-directory: ui
run: |
echo '//gitea.siegeln.net/api/packages/cameleer/npm/:_authToken=${REGISTRY_TOKEN}' >> .npmrc
npm ci
npm run build
env:
REGISTRY_TOKEN: ${{ secrets.REGISTRY_TOKEN }}
- name: Build and Test
- run: mvn clean verify -DskipITs --batch-mode
+ run: mvn clean verify -DskipITs -U --batch-mode
docker:
needs: build
runs-on: ubuntu-latest
if: github.event_name == 'push'
container:
- image: docker:27
+ image: gitea.siegeln.net/cameleer/cameleer-docker-builder:1
credentials:
username: cameleer
password: ${{ secrets.REGISTRY_TOKEN }}
steps:
- name: Checkout
run: |
apk add --no-cache git
git clone --depth=1 --branch=${GITHUB_REF_NAME} https://cameleer:${REGISTRY_TOKEN}@gitea.siegeln.net/${GITHUB_REPOSITORY}.git .
env:
REGISTRY_TOKEN: ${{ secrets.REGISTRY_TOKEN }}
@@ -95,7 +95,7 @@ jobs:
echo "IMAGE_TAGS=branch-$SLUG" >> "$GITHUB_ENV"
fi
- name: Set up QEMU for cross-platform builds
- run: docker run --rm --privileged tonistiigi/binfmt --install all
+ run: docker run --rm --privileged gitea.siegeln.net/cameleer/binfmt:1 --install all
- name: Build and push server
run: |
docker buildx create --use --name cibuilder
@@ -133,7 +133,6 @@ jobs:
if: always()
- name: Cleanup old container images
run: |
apk add --no-cache curl jq
API="https://gitea.siegeln.net/api/v1"
AUTH="Authorization: token ${REGISTRY_TOKEN}"
CURRENT_SHA="${{ github.sha }}"

View File

@@ -36,9 +36,9 @@ java -jar cameleer3-server-app/target/cameleer3-server-app-1.0-SNAPSHOT.jar
- Spring Boot 3.4.3 parent POM
- Depends on `com.cameleer3:cameleer3-common` from Gitea Maven registry
- Jackson `JavaTimeModule` for `Instant` deserialization
- - Communication: receives HTTP POST data from agents, serves SSE event streams for config push/commands
+ - Communication: receives HTTP POST data from agents (executions, diagrams, metrics, logs), serves SSE event streams for config push/commands
- Maintains agent instance registry with states: LIVE → STALE → DEAD
- - Storage: PostgreSQL (TimescaleDB) for structured data, OpenSearch for full-text search
+ - Storage: PostgreSQL (TimescaleDB) for structured data, OpenSearch for full-text search and application log storage
- Security: JWT auth with RBAC (AGENT/VIEWER/OPERATOR/ADMIN roles), Ed25519 config signing, bootstrap token for registration
- OIDC: Optional external identity provider support (token exchange pattern). Configured via admin API, stored in database (`server_config` table)
- User persistence: PostgreSQL `users` table, admin CRUD at `/api/v1/admin/users`

View File

@@ -12,7 +12,7 @@ COPY cameleer3-server-app/pom.xml cameleer3-server-app/
# Cache deps — only re-downloaded when POMs change
RUN mvn dependency:go-offline -B || true
COPY . .
- RUN mvn clean package -DskipTests -B
+ RUN mvn clean package -DskipTests -U -B
FROM eclipse-temurin:17-jre
WORKDIR /app

View File

@@ -100,7 +100,7 @@ JWTs carry a `roles` claim. Endpoints are restricted by role:
| Role | Access |
|------|--------|
- | `AGENT` | Data ingestion (`/data/**`), heartbeat, SSE events, command ack |
+ | `AGENT` | Data ingestion (`/data/**` — executions, diagrams, metrics, logs), heartbeat, SSE events, command ack |
| `VIEWER` | Search, execution detail, diagrams, agent list |
| `OPERATOR` | VIEWER + send commands to agents |
| `ADMIN` | OPERATOR + user management (`/admin/**`) |
@@ -220,6 +220,20 @@ curl -s -X POST http://localhost:8081/api/v1/data/metrics \
-H "X-Protocol-Version: 1" \
-H "Authorization: Bearer $TOKEN" \
-d '[{"agentId":"agent-1","metricName":"cpu","value":42.0,"timestamp":"2026-03-11T00:00:00Z","tags":{}}]'
# Post application log entries (batch)
curl -s -X POST http://localhost:8081/api/v1/data/logs \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $TOKEN" \
-d '{
"entries": [{
"timestamp": "2026-03-25T10:00:00Z",
"level": "INFO",
"loggerName": "com.acme.MyService",
"message": "Processing order #12345",
"threadName": "main"
}]
}'
```
**Note:** The `X-Protocol-Version: 1` header is required on all `/api/v1/data/**` endpoints. Missing or wrong version returns 400.
@@ -361,6 +375,8 @@ Key settings in `cameleer3-server-app/src/main/resources/application.yml`:
| `security.oidc.client-secret` | | OAuth2 client secret (`CAMELEER_OIDC_CLIENT_SECRET`) |
| `security.oidc.roles-claim` | `realm_access.roles` | JSONPath to roles in OIDC id_token (`CAMELEER_OIDC_ROLES_CLAIM`) |
| `security.oidc.default-roles` | `VIEWER` | Default roles for new OIDC users (`CAMELEER_OIDC_DEFAULT_ROLES`) |
| `opensearch.log-index-prefix` | `logs-` | OpenSearch index prefix for application logs (`CAMELEER_LOG_INDEX_PREFIX`) |
| `opensearch.log-retention-days` | `7` | Days before log indices are deleted (`CAMELEER_LOG_RETENTION_DAYS`) |
## Web UI Development

View File

@@ -1,5 +1,6 @@
package com.cameleer3.server.app.config;
import com.cameleer3.server.app.interceptor.AuditInterceptor;
import com.cameleer3.server.app.interceptor.ProtocolVersionInterceptor;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.InterceptorRegistry;
@@ -7,17 +8,17 @@ import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;
/**
* Web MVC configuration.
* <p>
* Registers the {@link ProtocolVersionInterceptor} on data and agent endpoint paths,
* excluding health, API docs, and Swagger UI paths that do not require protocol versioning.
*/
@Configuration
public class WebConfig implements WebMvcConfigurer {
private final ProtocolVersionInterceptor protocolVersionInterceptor;
private final AuditInterceptor auditInterceptor;
- public WebConfig(ProtocolVersionInterceptor protocolVersionInterceptor) {
+ public WebConfig(ProtocolVersionInterceptor protocolVersionInterceptor,
+                  AuditInterceptor auditInterceptor) {
this.protocolVersionInterceptor = protocolVersionInterceptor;
this.auditInterceptor = auditInterceptor;
}
@Override
@@ -33,5 +34,14 @@ public class WebConfig implements WebMvcConfigurer {
"/api/v1/agents/register",
"/api/v1/agents/*/refresh"
);
// Safety-net audit: catches any unaudited POST/PUT/DELETE
registry.addInterceptor(auditInterceptor)
.addPathPatterns("/api/v1/**")
.excludePathPatterns(
"/api/v1/data/**",
"/api/v1/agents/*/heartbeat",
"/api/v1/health"
);
}
}

View File

@@ -1,16 +1,22 @@
package com.cameleer3.server.app.controller;
import com.cameleer3.server.app.agent.SseConnectionManager;
import com.cameleer3.server.app.dto.CommandAckRequest;
import com.cameleer3.server.app.dto.CommandBroadcastResponse;
import com.cameleer3.server.app.dto.CommandRequest;
import com.cameleer3.server.app.dto.CommandSingleResponse;
import com.cameleer3.server.core.admin.AuditCategory;
import com.cameleer3.server.core.admin.AuditResult;
import com.cameleer3.server.core.admin.AuditService;
import com.cameleer3.server.core.agent.AgentCommand;
import com.cameleer3.server.core.agent.AgentEventService;
import com.cameleer3.server.core.agent.AgentInfo;
import com.cameleer3.server.core.agent.AgentRegistryService;
import com.cameleer3.server.core.agent.AgentState;
import com.cameleer3.server.core.agent.CommandType;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import jakarta.servlet.http.HttpServletRequest;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.tags.Tag;
@@ -48,23 +54,30 @@ public class AgentCommandController {
private final AgentRegistryService registryService;
private final SseConnectionManager connectionManager;
private final ObjectMapper objectMapper;
private final AgentEventService agentEventService;
private final AuditService auditService;
public AgentCommandController(AgentRegistryService registryService,
SseConnectionManager connectionManager,
- ObjectMapper objectMapper) {
+ ObjectMapper objectMapper,
+ AgentEventService agentEventService,
+ AuditService auditService) {
this.registryService = registryService;
this.connectionManager = connectionManager;
this.objectMapper = objectMapper;
this.agentEventService = agentEventService;
this.auditService = auditService;
}
@PostMapping("/{id}/commands")
@Operation(summary = "Send command to a specific agent",
- description = "Sends a config-update, deep-trace, or replay command to the specified agent")
+ description = "Sends a command to the specified agent via SSE")
@ApiResponse(responseCode = "202", description = "Command accepted")
@ApiResponse(responseCode = "400", description = "Invalid command payload")
@ApiResponse(responseCode = "404", description = "Agent not registered")
public ResponseEntity<CommandSingleResponse> sendCommand(@PathVariable String id,
- @RequestBody CommandRequest request) throws JsonProcessingException {
+ @RequestBody CommandRequest request,
+ HttpServletRequest httpRequest) throws JsonProcessingException {
AgentInfo agent = registryService.findById(id);
if (agent == null) {
throw new ResponseStatusException(HttpStatus.NOT_FOUND, "Agent not found: " + id);
@@ -76,6 +89,10 @@ public class AgentCommandController {
String status = connectionManager.isConnected(id) ? "DELIVERED" : "PENDING";
auditService.log("send_agent_command", AuditCategory.AGENT, id,
java.util.Map.of("type", request.type(), "status", status),
AuditResult.SUCCESS, httpRequest);
return ResponseEntity.status(HttpStatus.ACCEPTED)
.body(new CommandSingleResponse(command.id(), status));
}
@@ -86,7 +103,8 @@ public class AgentCommandController {
@ApiResponse(responseCode = "202", description = "Commands accepted")
@ApiResponse(responseCode = "400", description = "Invalid command payload")
public ResponseEntity<CommandBroadcastResponse> sendGroupCommand(@PathVariable String group,
- @RequestBody CommandRequest request) throws JsonProcessingException {
+ @RequestBody CommandRequest request,
+ HttpServletRequest httpRequest) throws JsonProcessingException {
CommandType type = mapCommandType(request.type());
String payloadJson = request.payload() != null ? objectMapper.writeValueAsString(request.payload()) : "{}";
@@ -101,6 +119,10 @@ public class AgentCommandController {
commandIds.add(command.id());
}
auditService.log("broadcast_group_command", AuditCategory.AGENT, group,
java.util.Map.of("type", request.type(), "agentCount", agents.size()),
AuditResult.SUCCESS, httpRequest);
return ResponseEntity.status(HttpStatus.ACCEPTED)
.body(new CommandBroadcastResponse(commandIds, agents.size()));
}
@@ -110,7 +132,8 @@ public class AgentCommandController {
description = "Sends a command to all agents currently in LIVE state")
@ApiResponse(responseCode = "202", description = "Commands accepted")
@ApiResponse(responseCode = "400", description = "Invalid command payload")
- public ResponseEntity<CommandBroadcastResponse> broadcastCommand(@RequestBody CommandRequest request) throws JsonProcessingException {
+ public ResponseEntity<CommandBroadcastResponse> broadcastCommand(@RequestBody CommandRequest request,
+ HttpServletRequest httpRequest) throws JsonProcessingException {
CommandType type = mapCommandType(request.type());
String payloadJson = request.payload() != null ? objectMapper.writeValueAsString(request.payload()) : "{}";
@@ -122,21 +145,42 @@ public class AgentCommandController {
commandIds.add(command.id());
}
auditService.log("broadcast_all_command", AuditCategory.AGENT, null,
java.util.Map.of("type", request.type(), "agentCount", liveAgents.size()),
AuditResult.SUCCESS, httpRequest);
return ResponseEntity.status(HttpStatus.ACCEPTED)
.body(new CommandBroadcastResponse(commandIds, liveAgents.size()));
}
@PostMapping("/{id}/commands/{commandId}/ack")
@Operation(summary = "Acknowledge command receipt",
- description = "Agent acknowledges that it has received and processed a command")
+ description = "Agent acknowledges that it has received and processed a command, with result status and message")
@ApiResponse(responseCode = "200", description = "Command acknowledged")
@ApiResponse(responseCode = "404", description = "Command not found")
public ResponseEntity<Void> acknowledgeCommand(@PathVariable String id,
- @PathVariable String commandId) {
+ @PathVariable String commandId,
+ @RequestBody(required = false) CommandAckRequest body) {
boolean acknowledged = registryService.acknowledgeCommand(id, commandId);
if (!acknowledged) {
throw new ResponseStatusException(HttpStatus.NOT_FOUND, "Command not found: " + commandId);
}
// Complete any pending reply future (for synchronous request-reply commands like TEST_EXPRESSION)
registryService.completeReply(commandId,
body != null ? body.status() : "SUCCESS",
body != null ? body.message() : null,
body != null ? body.data() : null);
// Record command result in agent event log
if (body != null && body.status() != null) {
AgentInfo agent = registryService.findById(id);
String application = agent != null ? agent.application() : "unknown";
agentEventService.recordEvent(id, application, "COMMAND_" + body.status(),
"Command " + commandId + ": " + body.message());
log.debug("Command {} ack from agent {}: {} - {}", commandId, id, body.status(), body.message());
}
return ResponseEntity.ok().build();
}
@@ -145,8 +189,10 @@ public class AgentCommandController {
case "config-update" -> CommandType.CONFIG_UPDATE;
case "deep-trace" -> CommandType.DEEP_TRACE;
case "replay" -> CommandType.REPLAY;
case "set-traced-processors" -> CommandType.SET_TRACED_PROCESSORS;
case "test-expression" -> CommandType.TEST_EXPRESSION;
default -> throw new ResponseStatusException(HttpStatus.BAD_REQUEST,
- "Invalid command type: " + typeStr + ". Valid: config-update, deep-trace, replay");
+ "Invalid command type: " + typeStr + ". Valid: config-update, deep-trace, replay, set-traced-processors, test-expression");
};
}
}

View File

@@ -8,6 +8,9 @@ import com.cameleer3.server.app.dto.AgentRegistrationRequest;
import com.cameleer3.server.app.dto.AgentRegistrationResponse;
import com.cameleer3.server.app.dto.ErrorResponse;
import com.cameleer3.server.app.security.BootstrapTokenValidator;
import com.cameleer3.server.core.admin.AuditCategory;
import com.cameleer3.server.core.admin.AuditResult;
import com.cameleer3.server.core.admin.AuditService;
import com.cameleer3.server.core.agent.AgentEventService;
import com.cameleer3.server.core.agent.AgentInfo;
import com.cameleer3.server.core.agent.AgentRegistryService;
@@ -58,6 +61,7 @@ public class AgentRegistrationController {
private final JwtService jwtService;
private final Ed25519SigningService ed25519SigningService;
private final AgentEventService agentEventService;
private final AuditService auditService;
private final JdbcTemplate jdbc;
public AgentRegistrationController(AgentRegistryService registryService,
@@ -66,6 +70,7 @@ public class AgentRegistrationController {
JwtService jwtService,
Ed25519SigningService ed25519SigningService,
AgentEventService agentEventService,
AuditService auditService,
JdbcTemplate jdbc) {
this.registryService = registryService;
this.config = config;
@@ -73,6 +78,7 @@ public class AgentRegistrationController {
this.jwtService = jwtService;
this.ed25519SigningService = ed25519SigningService;
this.agentEventService = agentEventService;
this.auditService = auditService;
this.jdbc = jdbc;
}
@@ -113,6 +119,10 @@ public class AgentRegistrationController {
agentEventService.recordEvent(request.agentId(), application, "REGISTERED",
"Agent registered: " + request.name());
auditService.log(request.agentId(), "agent_register", AuditCategory.AGENT, request.agentId(),
Map.of("application", application, "name", request.name()),
AuditResult.SUCCESS, httpRequest);
// Issue JWT tokens with AGENT role
List<String> roles = List.of("AGENT");
String accessToken = jwtService.createAccessToken(request.agentId(), application, roles);
@@ -135,7 +145,8 @@ public class AgentRegistrationController {
@ApiResponse(responseCode = "401", description = "Invalid or expired refresh token")
@ApiResponse(responseCode = "404", description = "Agent not found")
public ResponseEntity<AgentRefreshResponse> refresh(@PathVariable String id,
- @RequestBody AgentRefreshRequest request) {
+ @RequestBody AgentRefreshRequest request,
+ HttpServletRequest httpRequest) {
if (request.refreshToken() == null || request.refreshToken().isBlank()) {
return ResponseEntity.status(401).build();
}
@@ -169,6 +180,9 @@ public class AgentRegistrationController {
String newAccessToken = jwtService.createAccessToken(agentId, agent.application(), roles);
String newRefreshToken = jwtService.createRefreshToken(agentId, agent.application(), roles);
auditService.log(agentId, "agent_token_refresh", AuditCategory.AUTH, agentId,
null, AuditResult.SUCCESS, httpRequest);
return ResponseEntity.ok(new AgentRefreshResponse(newAccessToken, newRefreshToken));
}

View File

@@ -0,0 +1,208 @@
package com.cameleer3.server.app.controller;
import com.cameleer3.common.model.ApplicationConfig;
import com.cameleer3.server.app.dto.TestExpressionRequest;
import com.cameleer3.server.app.dto.TestExpressionResponse;
import com.cameleer3.server.app.storage.PostgresApplicationConfigRepository;
import com.cameleer3.server.core.admin.AuditCategory;
import com.cameleer3.server.core.admin.AuditResult;
import com.cameleer3.server.core.admin.AuditService;
import com.cameleer3.server.core.agent.AgentCommand;
import com.cameleer3.server.core.agent.AgentInfo;
import com.cameleer3.server.core.agent.AgentRegistryService;
import com.cameleer3.server.core.agent.AgentState;
import com.cameleer3.server.core.agent.CommandReply;
import com.cameleer3.server.core.agent.CommandType;
import com.cameleer3.server.core.storage.DiagramStore;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.tags.Tag;
import jakarta.servlet.http.HttpServletRequest;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.security.core.Authentication;
import org.springframework.web.bind.annotation.*;
import java.util.List;
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
/**
* Per-application configuration management.
* Agents fetch config at startup; the UI modifies config which is persisted and pushed to agents via SSE.
*/
@RestController
@RequestMapping("/api/v1/config")
@Tag(name = "Application Config", description = "Per-application observability configuration")
public class ApplicationConfigController {
    private static final Logger log = LoggerFactory.getLogger(ApplicationConfigController.class);

    private final PostgresApplicationConfigRepository configRepository;
    private final AgentRegistryService registryService;
    private final ObjectMapper objectMapper;
    private final AuditService auditService;
    private final DiagramStore diagramStore;

    public ApplicationConfigController(PostgresApplicationConfigRepository configRepository,
                                       AgentRegistryService registryService,
                                       ObjectMapper objectMapper,
                                       AuditService auditService,
                                       DiagramStore diagramStore) {
        this.configRepository = configRepository;
        this.registryService = registryService;
        this.objectMapper = objectMapper;
        this.auditService = auditService;
        this.diagramStore = diagramStore;
    }

    @GetMapping
    @Operation(summary = "List all application configs",
            description = "Returns stored configurations for all applications")
    @ApiResponse(responseCode = "200", description = "Configs returned")
    public ResponseEntity<List<ApplicationConfig>> listConfigs(HttpServletRequest httpRequest) {
        auditService.log("view_app_configs", AuditCategory.CONFIG, null, null, AuditResult.SUCCESS, httpRequest);
        return ResponseEntity.ok(configRepository.findAll());
    }

    @GetMapping("/{application}")
    @Operation(summary = "Get application config",
            description = "Returns the current configuration for an application. Returns defaults if none stored.")
    @ApiResponse(responseCode = "200", description = "Config returned")
    public ResponseEntity<ApplicationConfig> getConfig(@PathVariable String application,
                                                       HttpServletRequest httpRequest) {
        auditService.log("view_app_config", AuditCategory.CONFIG, application, null, AuditResult.SUCCESS, httpRequest);
        return ResponseEntity.ok(
                configRepository.findByApplication(application)
                        .orElse(defaultConfig(application)));
    }

    @PutMapping("/{application}")
    @Operation(summary = "Update application config",
            description = "Saves config and pushes CONFIG_UPDATE to all LIVE agents of this application")
    @ApiResponse(responseCode = "200", description = "Config saved and pushed")
    public ResponseEntity<ApplicationConfig> updateConfig(@PathVariable String application,
                                                          @RequestBody ApplicationConfig config,
                                                          Authentication auth,
                                                          HttpServletRequest httpRequest) {
        String updatedBy = auth != null ? auth.getName() : "system";
        config.setApplication(application);
        ApplicationConfig saved = configRepository.save(application, config, updatedBy);
        int pushed = pushConfigToAgents(application, saved);
        log.info("Config v{} saved for '{}', pushed to {} agent(s)", saved.getVersion(), application, pushed);
        auditService.log("update_app_config", AuditCategory.CONFIG, application,
                Map.of("version", saved.getVersion(), "agentsPushed", pushed),
                AuditResult.SUCCESS, httpRequest);
        return ResponseEntity.ok(saved);
    }

    @GetMapping("/{application}/processor-routes")
    @Operation(summary = "Get processor to route mapping",
            description = "Returns a map of processorId → routeId for all processors seen in this application")
    @ApiResponse(responseCode = "200", description = "Mapping returned")
    public ResponseEntity<Map<String, String>> getProcessorRouteMapping(@PathVariable String application) {
        return ResponseEntity.ok(diagramStore.findProcessorRouteMapping(application));
    }

    @PostMapping("/{application}/test-expression")
    @Operation(summary = "Test a tap expression against sample data via a live agent")
    @ApiResponse(responseCode = "200", description = "Expression evaluated successfully")
    @ApiResponse(responseCode = "404", description = "No live agent available for this application")
    @ApiResponse(responseCode = "504", description = "Agent did not respond in time")
    public ResponseEntity<TestExpressionResponse> testExpression(
            @PathVariable String application,
            @RequestBody TestExpressionRequest request) {
        // Find a LIVE agent for this application
        AgentInfo agent = registryService.findAll().stream()
                .filter(a -> application.equals(a.application()))
                .filter(a -> a.state() == AgentState.LIVE)
                .findFirst()
                .orElse(null);
        if (agent == null) {
            return ResponseEntity.status(HttpStatus.NOT_FOUND)
                    .body(new TestExpressionResponse(null, "No live agent available for application: " + application));
        }
        // Build payload JSON
        String payloadJson;
        try {
            payloadJson = objectMapper.writeValueAsString(Map.of(
                    "expression", request.expression() != null ? request.expression() : "",
                    "language", request.language() != null ? request.language() : "",
                    "body", request.body() != null ? request.body() : "",
                    "target", request.target() != null ? request.target() : ""
            ));
        } catch (JsonProcessingException e) {
            log.error("Failed to serialize test-expression payload", e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                    .body(new TestExpressionResponse(null, "Failed to serialize request"));
        }
        // Send command and await reply
        CompletableFuture<CommandReply> future = registryService.addCommandWithReply(
                agent.id(), CommandType.TEST_EXPRESSION, payloadJson);
        try {
            CommandReply reply = future.orTimeout(5, TimeUnit.SECONDS).join();
            if ("SUCCESS".equals(reply.status())) {
                return ResponseEntity.ok(new TestExpressionResponse(reply.data(), null));
            } else {
                return ResponseEntity.ok(new TestExpressionResponse(null, reply.message()));
            }
        } catch (CompletionException e) {
            if (e.getCause() instanceof TimeoutException) {
                return ResponseEntity.status(HttpStatus.GATEWAY_TIMEOUT)
                        .body(new TestExpressionResponse(null, "Agent did not respond within 5 seconds"));
            }
            log.error("Error awaiting test-expression reply from agent {}", agent.id(), e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR)
                    .body(new TestExpressionResponse(null, "Internal error: " + e.getCause().getMessage()));
        }
    }

    private int pushConfigToAgents(String application, ApplicationConfig config) {
        String payloadJson;
        try {
            payloadJson = objectMapper.writeValueAsString(config);
        } catch (JsonProcessingException e) {
            log.error("Failed to serialize config for push", e);
            return 0;
        }
        List<AgentInfo> agents = registryService.findAll().stream()
                .filter(a -> a.state() == AgentState.LIVE)
                .filter(a -> application.equals(a.application()))
                .toList();
        for (AgentInfo agent : agents) {
            registryService.addCommand(agent.id(), CommandType.CONFIG_UPDATE, payloadJson);
        }
        return agents.size();
    }

    private static ApplicationConfig defaultConfig(String application) {
        ApplicationConfig config = new ApplicationConfig();
        config.setApplication(application);
        config.setVersion(0);
        config.setMetricsEnabled(true);
        config.setSamplingRate(1.0);
        config.setTracedProcessors(Map.of());
        config.setApplicationLogLevel("INFO");
        config.setAgentLogLevel("INFO");
        config.setEngineLevel("REGULAR");
        config.setPayloadCaptureMode("NONE");
        return config;
    }
}


@@ -5,8 +5,11 @@ import com.cameleer3.server.core.admin.AuditCategory;
import com.cameleer3.server.core.admin.AuditRepository;
import com.cameleer3.server.core.admin.AuditRepository.AuditPage;
import com.cameleer3.server.core.admin.AuditRepository.AuditQuery;
+import com.cameleer3.server.core.admin.AuditResult;
+import com.cameleer3.server.core.admin.AuditService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.tags.Tag;
+import jakarta.servlet.http.HttpServletRequest;
import org.springframework.format.annotation.DateTimeFormat;
import org.springframework.http.ResponseEntity;
import org.springframework.security.access.prepost.PreAuthorize;
@@ -16,8 +19,6 @@ import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import java.time.Instant;
-import java.time.LocalDate;
-import java.time.ZoneOffset;
@RestController
@RequestMapping("/api/v1/admin/audit")
@@ -26,19 +27,22 @@ import java.time.ZoneOffset;
public class AuditLogController {
private final AuditRepository auditRepository;
+private final AuditService auditService;
-public AuditLogController(AuditRepository auditRepository) {
+public AuditLogController(AuditRepository auditRepository, AuditService auditService) {
this.auditRepository = auditRepository;
+this.auditService = auditService;
}
@GetMapping
@Operation(summary = "Search audit log entries with pagination")
public ResponseEntity<AuditLogPageResponse> getAuditLog(
+HttpServletRequest httpRequest,
@RequestParam(required = false) String username,
@RequestParam(required = false) String category,
@RequestParam(required = false) String search,
-@RequestParam(required = false) @DateTimeFormat(iso = DateTimeFormat.ISO.DATE) LocalDate from,
-@RequestParam(required = false) @DateTimeFormat(iso = DateTimeFormat.ISO.DATE) LocalDate to,
+@RequestParam(required = false) @DateTimeFormat(iso = DateTimeFormat.ISO.DATE_TIME) Instant from,
+@RequestParam(required = false) @DateTimeFormat(iso = DateTimeFormat.ISO.DATE_TIME) Instant to,
@RequestParam(defaultValue = "timestamp") String sort,
@RequestParam(defaultValue = "desc") String order,
@RequestParam(defaultValue = "0") int page,
@@ -46,8 +50,8 @@ public class AuditLogController {
size = Math.min(size, 100);
-Instant fromInstant = from != null ? from.atStartOfDay(ZoneOffset.UTC).toInstant() : null;
-Instant toInstant = to != null ? to.plusDays(1).atStartOfDay(ZoneOffset.UTC).toInstant() : null;
+Instant fromInstant = from != null ? from : Instant.now().minus(java.time.Duration.ofDays(7));
+Instant toInstant = to != null ? to : Instant.now();
AuditCategory cat = null;
if (category != null && !category.isEmpty()) {
@@ -58,6 +62,8 @@ public class AuditLogController {
}
}
+auditService.log("view_audit_log", AuditCategory.AUTH, null, null, AuditResult.SUCCESS, httpRequest);
AuditQuery query = new AuditQuery(username, cat, search, fromInstant, toInstant, sort, order, page, size);
AuditPage result = auditRepository.find(query);


@@ -7,6 +7,7 @@ import com.cameleer3.server.app.dto.TableSizeResponse;
import com.cameleer3.server.core.admin.AuditCategory;
import com.cameleer3.server.core.admin.AuditResult;
import com.cameleer3.server.core.admin.AuditService;
+import com.cameleer3.server.core.ingestion.IngestionService;
import com.zaxxer.hikari.HikariDataSource;
import com.zaxxer.hikari.HikariPoolMXBean;
import io.swagger.v3.oas.annotations.Operation;
@@ -24,7 +25,9 @@ import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.server.ResponseStatusException;
import javax.sql.DataSource;
+import java.time.Instant;
+import java.util.List;
import java.util.Map;
@RestController
@RequestMapping("/api/v1/admin/database")
@@ -35,11 +38,14 @@ public class DatabaseAdminController {
private final JdbcTemplate jdbc;
private final DataSource dataSource;
private final AuditService auditService;
+private final IngestionService ingestionService;
-public DatabaseAdminController(JdbcTemplate jdbc, DataSource dataSource, AuditService auditService) {
+public DatabaseAdminController(JdbcTemplate jdbc, DataSource dataSource,
+AuditService auditService, IngestionService ingestionService) {
this.jdbc = jdbc;
this.dataSource = dataSource;
this.auditService = auditService;
+this.ingestionService = ingestionService;
}
@GetMapping("/status")
@@ -117,6 +123,29 @@ public class DatabaseAdminController {
return ResponseEntity.ok().build();
}
+@GetMapping("/metrics-pipeline")
+@Operation(summary = "Get metrics ingestion pipeline diagnostics")
+public ResponseEntity<Map<String, Object>> getMetricsPipeline() {
+int bufferDepth = ingestionService.getMetricsBufferDepth();
+Long totalRows = jdbc.queryForObject(
+"SELECT count(*) FROM agent_metrics", Long.class);
+List<String> agentIds = jdbc.queryForList(
+"SELECT DISTINCT agent_id FROM agent_metrics ORDER BY agent_id", String.class);
+Instant latestCollected = jdbc.queryForObject(
+"SELECT max(collected_at) FROM agent_metrics", Instant.class);
+List<String> metricNames = jdbc.queryForList(
+"SELECT DISTINCT metric_name FROM agent_metrics ORDER BY metric_name", String.class);
+return ResponseEntity.ok(Map.of(
+"bufferDepth", bufferDepth,
+"totalRows", totalRows != null ? totalRows : 0,
+"distinctAgents", agentIds,
+"distinctMetrics", metricNames,
+"latestCollectedAt", latestCollected != null ? latestCollected.toString() : "none"
+));
+}
private String extractHost(DataSource ds) {
try {
if (ds instanceof HikariDataSource hds) {


@@ -1,6 +1,8 @@
package com.cameleer3.server.app.controller;
import com.cameleer3.common.graph.RouteGraph;
+import com.cameleer3.server.core.agent.AgentInfo;
+import com.cameleer3.server.core.agent.AgentRegistryService;
import com.cameleer3.server.core.ingestion.IngestionService;
import com.cameleer3.server.core.ingestion.TaggedDiagram;
import com.fasterxml.jackson.core.JsonProcessingException;
@@ -35,10 +37,14 @@ public class DiagramController {
private static final Logger log = LoggerFactory.getLogger(DiagramController.class);
private final IngestionService ingestionService;
+private final AgentRegistryService registryService;
private final ObjectMapper objectMapper;
-public DiagramController(IngestionService ingestionService, ObjectMapper objectMapper) {
+public DiagramController(IngestionService ingestionService,
+AgentRegistryService registryService,
+ObjectMapper objectMapper) {
this.ingestionService = ingestionService;
+this.registryService = registryService;
this.objectMapper = objectMapper;
}
@@ -48,10 +54,11 @@ public class DiagramController {
@ApiResponse(responseCode = "202", description = "Data accepted for processing")
public ResponseEntity<Void> ingestDiagrams(@RequestBody String body) throws JsonProcessingException {
String agentId = extractAgentId();
+String applicationName = resolveApplicationName(agentId);
List<RouteGraph> graphs = parsePayload(body);
for (RouteGraph graph : graphs) {
-ingestionService.ingestDiagram(new TaggedDiagram(agentId, graph));
+ingestionService.ingestDiagram(new TaggedDiagram(agentId, applicationName, graph));
}
return ResponseEntity.accepted().build();
@@ -62,6 +69,11 @@ public class DiagramController {
return auth != null ? auth.getName() : "";
}
+private String resolveApplicationName(String agentId) {
+AgentInfo agent = registryService.findById(agentId);
+return agent != null ? agent.application() : "";
+}
private List<RouteGraph> parsePayload(String body) throws JsonProcessingException {
String trimmed = body.strip();
if (trimmed.startsWith("[")) {


@@ -62,6 +62,7 @@ public class DiagramRenderController {
@ApiResponse(responseCode = "404", description = "Diagram not found")
public ResponseEntity<?> renderDiagram(
@PathVariable String contentHash,
+@RequestParam(defaultValue = "LR") String direction,
HttpServletRequest request) {
Optional<RouteGraph> graphOpt = diagramStore.findByContentHash(contentHash);
@@ -76,7 +77,7 @@ public class DiagramRenderController {
// without also accepting everything (*/*). This means "application/json"
// must appear and wildcards must not dominate the preference.
if (accept != null && isJsonPreferred(accept)) {
-DiagramLayout layout = diagramRenderer.layoutJson(graph);
+DiagramLayout layout = diagramRenderer.layoutJson(graph, direction);
return ResponseEntity.ok()
.contentType(MediaType.APPLICATION_JSON)
.body(layout);
@@ -96,7 +97,8 @@ public class DiagramRenderController {
@ApiResponse(responseCode = "404", description = "No diagram found for the given application and route")
public ResponseEntity<DiagramLayout> findByApplicationAndRoute(
@RequestParam String application,
-@RequestParam String routeId) {
+@RequestParam String routeId,
+@RequestParam(defaultValue = "LR") String direction) {
List<String> agentIds = registryService.findByApplication(application).stream()
.map(AgentInfo::id)
.toList();
@@ -115,7 +117,7 @@ public class DiagramRenderController {
return ResponseEntity.notFound().build();
}
-DiagramLayout layout = diagramRenderer.layoutJson(graphOpt.get());
+DiagramLayout layout = diagramRenderer.layoutJson(graphOpt.get(), direction);
return ResponseEntity.ok(layout);
}


@@ -0,0 +1,61 @@
package com.cameleer3.server.app.controller;

import com.cameleer3.common.model.LogBatch;
import com.cameleer3.server.app.search.OpenSearchLogIndex;
import com.cameleer3.server.core.agent.AgentInfo;
import com.cameleer3.server.core.agent.AgentRegistryService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.tags.Tag;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.http.ResponseEntity;
import org.springframework.security.core.Authentication;
import org.springframework.security.core.context.SecurityContextHolder;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/v1/data")
@Tag(name = "Ingestion", description = "Data ingestion endpoints")
public class LogIngestionController {

    private static final Logger log = LoggerFactory.getLogger(LogIngestionController.class);

    private final OpenSearchLogIndex logIndex;
    private final AgentRegistryService registryService;

    public LogIngestionController(OpenSearchLogIndex logIndex,
                                  AgentRegistryService registryService) {
        this.logIndex = logIndex;
        this.registryService = registryService;
    }

    @PostMapping("/logs")
    @Operation(summary = "Ingest application log entries",
            description = "Accepts a batch of log entries from an agent. Entries are indexed in OpenSearch.")
    @ApiResponse(responseCode = "202", description = "Logs accepted for indexing")
    public ResponseEntity<Void> ingestLogs(@RequestBody LogBatch batch) {
        String agentId = extractAgentId();
        String application = resolveApplicationName(agentId);
        if (batch.getEntries() != null && !batch.getEntries().isEmpty()) {
            log.debug("Received {} log entries from agent={}, app={}", batch.getEntries().size(), agentId, application);
            logIndex.indexBatch(agentId, application, batch.getEntries());
        }
        return ResponseEntity.accepted().build();
    }

    private String extractAgentId() {
        Authentication auth = SecurityContextHolder.getContext().getAuthentication();
        return auth != null ? auth.getName() : "";
    }

    private String resolveApplicationName(String agentId) {
        AgentInfo agent = registryService.findById(agentId);
        return agent != null ? agent.application() : "";
    }
}


@@ -0,0 +1,50 @@
package com.cameleer3.server.app.controller;

import com.cameleer3.server.app.dto.LogEntryResponse;
import com.cameleer3.server.app.search.OpenSearchLogIndex;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.tags.Tag;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import java.time.Instant;
import java.util.List;

@RestController
@RequestMapping("/api/v1/logs")
@Tag(name = "Application Logs", description = "Query application logs stored in OpenSearch")
public class LogQueryController {

    private final OpenSearchLogIndex logIndex;

    public LogQueryController(OpenSearchLogIndex logIndex) {
        this.logIndex = logIndex;
    }

    @GetMapping
    @Operation(summary = "Search application log entries",
            description = "Returns log entries for a given application, optionally filtered by agent, level, time range, and text query")
    public ResponseEntity<List<LogEntryResponse>> searchLogs(
            @RequestParam String application,
            @RequestParam(required = false) String agentId,
            @RequestParam(required = false) String level,
            @RequestParam(required = false) String query,
            @RequestParam(required = false) String exchangeId,
            @RequestParam(required = false) String from,
            @RequestParam(required = false) String to,
            @RequestParam(defaultValue = "200") int limit) {
        limit = Math.min(limit, 1000);
        Instant fromInstant = from != null ? Instant.parse(from) : null;
        Instant toInstant = to != null ? Instant.parse(to) : null;
        List<LogEntryResponse> entries = logIndex.search(
                application, agentId, level, query, exchangeId, fromInstant, toInstant, limit);
        return ResponseEntity.ok(entries);
    }
}


@@ -44,13 +44,23 @@ public class MetricsController {
@Operation(summary = "Ingest agent metrics",
description = "Accepts an array of MetricsSnapshot objects")
@ApiResponse(responseCode = "202", description = "Data accepted for processing")
+@ApiResponse(responseCode = "400", description = "Invalid payload")
+@ApiResponse(responseCode = "503", description = "Buffer full, retry later")
-public ResponseEntity<Void> ingestMetrics(@RequestBody String body) throws JsonProcessingException {
-List<MetricsSnapshot> metrics = parsePayload(body);
-boolean accepted = ingestionService.acceptMetrics(metrics);
+public ResponseEntity<Void> ingestMetrics(@RequestBody String body) {
+List<MetricsSnapshot> metrics;
+try {
+metrics = parsePayload(body);
+} catch (JsonProcessingException e) {
+log.warn("Failed to parse metrics payload: {}", e.getMessage());
+return ResponseEntity.badRequest().build();
+}
+log.debug("Received {} metric(s) from agent(s)", metrics.size());
+boolean accepted = ingestionService.acceptMetrics(metrics);
if (!accepted) {
-log.warn("Metrics buffer full, returning 503");
+log.warn("Metrics buffer full ({} items), returning 503",
+ingestionService.getMetricsBufferDepth());
return ResponseEntity.status(HttpStatus.SERVICE_UNAVAILABLE)
.header("Retry-After", "5")
.build();


@@ -61,7 +61,8 @@ public class OidcConfigAdminController {
@GetMapping
@Operation(summary = "Get OIDC configuration")
@ApiResponse(responseCode = "200", description = "Current OIDC configuration (client_secret masked)")
-public ResponseEntity<OidcAdminConfigResponse> getConfig() {
+public ResponseEntity<OidcAdminConfigResponse> getConfig(HttpServletRequest httpRequest) {
+auditService.log("view_oidc_config", AuditCategory.CONFIG, null, null, AuditResult.SUCCESS, httpRequest);
Optional<OidcConfig> config = configRepository.find();
if (config.isEmpty()) {
return ResponseEntity.ok(OidcAdminConfigResponse.unconfigured());


@@ -49,12 +49,14 @@ public class OpenSearchAdminController {
private final ObjectMapper objectMapper;
private final String opensearchUrl;
private final String indexPrefix;
+private final String logIndexPrefix;
public OpenSearchAdminController(OpenSearchClient client, RestClient restClient,
SearchIndexerStats indexerStats, AuditService auditService,
ObjectMapper objectMapper,
@Value("${opensearch.url:http://localhost:9200}") String opensearchUrl,
-@Value("${opensearch.index-prefix:executions-}") String indexPrefix) {
+@Value("${opensearch.index-prefix:executions-}") String indexPrefix,
+@Value("${opensearch.log-index-prefix:logs-}") String logIndexPrefix) {
this.client = client;
this.restClient = restClient;
this.indexerStats = indexerStats;
@@ -62,6 +64,7 @@ public class OpenSearchAdminController {
this.objectMapper = objectMapper;
this.opensearchUrl = opensearchUrl;
this.indexPrefix = indexPrefix;
+this.logIndexPrefix = logIndexPrefix;
}
@GetMapping("/status")
@@ -100,7 +103,8 @@ public class OpenSearchAdminController {
public ResponseEntity<IndicesPageResponse> getIndices(
@RequestParam(defaultValue = "0") int page,
@RequestParam(defaultValue = "20") int size,
-@RequestParam(defaultValue = "") String search) {
+@RequestParam(defaultValue = "") String search,
+@RequestParam(defaultValue = "executions") String prefix) {
try {
Response response = restClient.performRequest(
new Request("GET", "/_cat/indices?format=json&h=index,health,docs.count,store.size,pri,rep&bytes=b"));
@@ -109,10 +113,12 @@ public class OpenSearchAdminController {
indices = objectMapper.readTree(is);
}
+String filterPrefix = "logs".equals(prefix) ? logIndexPrefix : indexPrefix;
List<IndexInfoResponse> allIndices = new ArrayList<>();
for (JsonNode idx : indices) {
String name = idx.path("index").asText("");
-if (!name.startsWith(indexPrefix)) {
+if (!name.startsWith(filterPrefix)) {
continue;
}
if (!search.isEmpty() && !name.contains(search)) {
@@ -152,7 +158,7 @@ public class OpenSearchAdminController {
@Operation(summary = "Delete an OpenSearch index")
public ResponseEntity<Void> deleteIndex(@PathVariable String name, HttpServletRequest request) {
try {
-if (!name.startsWith(indexPrefix)) {
+if (!name.startsWith(indexPrefix) && !name.startsWith(logIndexPrefix)) {
throw new ResponseStatusException(HttpStatus.FORBIDDEN, "Cannot delete index outside application scope");
}
boolean exists = client.indices().exists(r -> r.index(name)).value();


@@ -14,6 +14,7 @@ import org.springframework.http.ResponseEntity;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
+import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import java.sql.Timestamp;
@@ -44,7 +45,9 @@ public class RouteCatalogController {
@Operation(summary = "Get route catalog",
description = "Returns all applications with their routes, agents, and health status")
@ApiResponse(responseCode = "200", description = "Catalog returned")
-public ResponseEntity<List<AppCatalogEntry>> getCatalog() {
+public ResponseEntity<List<AppCatalogEntry>> getCatalog(
+@RequestParam(required = false) String from,
+@RequestParam(required = false) String to) {
List<AgentInfo> allAgents = registryService.findAll();
// Group agents by application name
@@ -63,9 +66,10 @@ public class RouteCatalogController {
routesByApp.put(entry.getKey(), routes);
}
-// Query route-level stats for the last 24 hours
+// Time range for exchange counts — use provided range or default to last 24h
Instant now = Instant.now();
-Instant from24h = now.minus(24, ChronoUnit.HOURS);
+Instant rangeFrom = from != null ? Instant.parse(from) : now.minus(24, ChronoUnit.HOURS);
+Instant rangeTo = to != null ? Instant.parse(to) : now;
Instant from1m = now.minus(1, ChronoUnit.MINUTES);
// Route exchange counts from continuous aggregate
@@ -82,7 +86,7 @@ public class RouteCatalogController {
Timestamp ts = rs.getTimestamp("last_seen");
if (ts != null) routeLastSeen.put(key, ts.toInstant());
},
-Timestamp.from(from24h), Timestamp.from(now));
+Timestamp.from(rangeFrom), Timestamp.from(rangeTo));
} catch (Exception e) {
// Continuous aggregate may not exist yet
}


@@ -58,7 +58,8 @@ public class UserAdminController {
@GetMapping
@Operation(summary = "List all users with RBAC detail")
@ApiResponse(responseCode = "200", description = "User list returned")
-public ResponseEntity<List<UserDetail>> listUsers() {
+public ResponseEntity<List<UserDetail>> listUsers(HttpServletRequest httpRequest) {
+auditService.log("view_users", AuditCategory.USER_MGMT, null, null, AuditResult.SUCCESS, httpRequest);
return ResponseEntity.ok(rbacService.listUsers());
}


@@ -45,6 +45,7 @@ public class ElkDiagramRenderer implements DiagramRenderer {
private static final int PADDING = 20;
private static final int NODE_HEIGHT = 40;
+private static final int NODE_WIDTH = 160;
private static final int MIN_NODE_WIDTH = 80;
private static final int CHAR_WIDTH = 8;
private static final int LABEL_PADDING = 32;
@@ -97,9 +98,11 @@ public class ElkDiagramRenderer implements DiagramRenderer {
/** NodeTypes that act as compound containers with children. */
private static final Set<NodeType> COMPOUND_TYPES = EnumSet.of(
-NodeType.EIP_CHOICE, NodeType.EIP_SPLIT, NodeType.TRY_CATCH,
-NodeType.DO_TRY, NodeType.EIP_LOOP, NodeType.EIP_MULTICAST,
-NodeType.EIP_AGGREGATE
+NodeType.EIP_CHOICE, NodeType.EIP_WHEN, NodeType.EIP_OTHERWISE,
+NodeType.EIP_SPLIT, NodeType.TRY_CATCH,
+NodeType.DO_TRY, NodeType.DO_CATCH, NodeType.DO_FINALLY,
+NodeType.EIP_LOOP, NodeType.EIP_MULTICAST,
+NodeType.EIP_AGGREGATE, NodeType.ON_EXCEPTION, NodeType.ERROR_HANDLER
);
public ElkDiagramRenderer() {
@@ -112,7 +115,7 @@ public class ElkDiagramRenderer implements DiagramRenderer {
@Override
public String renderSvg(RouteGraph graph) {
-LayoutResult result = computeLayout(graph);
+LayoutResult result = computeLayout(graph, Direction.DOWN);
DiagramLayout layout = result.layout;
int svgWidth = (int) Math.ceil(layout.width()) + 2 * PADDING;
@@ -153,97 +156,54 @@ public class ElkDiagramRenderer implements DiagramRenderer {
@Override
public DiagramLayout layoutJson(RouteGraph graph) {
-return computeLayout(graph).layout;
+return computeLayout(graph, Direction.RIGHT).layout;
}
+@Override
+public DiagramLayout layoutJson(RouteGraph graph, String direction) {
+Direction dir = "TB".equalsIgnoreCase(direction) ? Direction.DOWN : Direction.RIGHT;
+return computeLayout(graph, dir).layout;
+}
// ----------------------------------------------------------------
// Layout computation
// ----------------------------------------------------------------
-private LayoutResult computeLayout(RouteGraph graph) {
+private LayoutResult computeLayout(RouteGraph graph, Direction rootDirection) {
ElkGraphFactory factory = ElkGraphFactory.eINSTANCE;
// Create root node
ElkNode rootNode = factory.createElkNode();
rootNode.setIdentifier("root");
rootNode.setProperty(CoreOptions.ALGORITHM, "org.eclipse.elk.layered");
-rootNode.setProperty(CoreOptions.DIRECTION, Direction.DOWN);
+rootNode.setProperty(CoreOptions.DIRECTION, rootDirection);
rootNode.setProperty(CoreOptions.SPACING_NODE_NODE, NODE_SPACING);
rootNode.setProperty(CoreOptions.SPACING_EDGE_NODE, EDGE_SPACING);
rootNode.setProperty(CoreOptions.HIERARCHY_HANDLING, HierarchyHandling.INCLUDE_CHILDREN);
-// Build index of RouteNodes
+// Build index of all RouteNodes (flat list from graph + recursive children)
Map<String, RouteNode> routeNodeMap = new HashMap<>();
if (graph.getNodes() != null) {
for (RouteNode rn : graph.getNodes()) {
-routeNodeMap.put(rn.getId(), rn);
+indexNodeRecursive(rn, routeNodeMap);
}
}
-// Identify compound node IDs and their children
-Set<String> compoundNodeIds = new HashSet<>();
-Map<String, String> childToParent = new HashMap<>();
-for (RouteNode rn : routeNodeMap.values()) {
-if (rn.getType() != null && COMPOUND_TYPES.contains(rn.getType())
-&& rn.getChildren() != null && !rn.getChildren().isEmpty()) {
-compoundNodeIds.add(rn.getId());
-for (RouteNode child : rn.getChildren()) {
-childToParent.put(child.getId(), rn.getId());
-}
-}
-}
+// Track which nodes are children of a compound (at any depth)
+Set<String> childNodeIds = new HashSet<>();
-// Create ELK nodes
+// Create ELK nodes recursively — compounds contain their children
Map<String, ElkNode> elkNodeMap = new HashMap<>();
Map<String, Color> nodeColors = new HashMap<>();
+Set<String> compoundNodeIds = new HashSet<>();
-// First, create compound (parent) nodes
-for (String compoundId : compoundNodeIds) {
-RouteNode rn = routeNodeMap.get(compoundId);
-ElkNode elkCompound = factory.createElkNode();
-elkCompound.setIdentifier(rn.getId());
-elkCompound.setParent(rootNode);
-// Compound nodes are larger initially -- ELK will resize
-elkCompound.setWidth(200);
-elkCompound.setHeight(100);
-// Set properties for compound layout
-elkCompound.setProperty(CoreOptions.ALGORITHM, "org.eclipse.elk.layered");
-elkCompound.setProperty(CoreOptions.DIRECTION, Direction.DOWN);
-elkCompound.setProperty(CoreOptions.SPACING_NODE_NODE, NODE_SPACING * 0.5);
-elkCompound.setProperty(CoreOptions.SPACING_EDGE_NODE, EDGE_SPACING * 0.5);
-elkCompound.setProperty(CoreOptions.PADDING,
-new org.eclipse.elk.core.math.ElkPadding(COMPOUND_TOP_PADDING,
-COMPOUND_SIDE_PADDING, COMPOUND_SIDE_PADDING, COMPOUND_SIDE_PADDING));
-elkNodeMap.put(rn.getId(), elkCompound);
-nodeColors.put(rn.getId(), colorForType(rn.getType()));
-// Create child nodes inside compound
-for (RouteNode child : rn.getChildren()) {
-ElkNode elkChild = factory.createElkNode();
-elkChild.setIdentifier(child.getId());
-elkChild.setParent(elkCompound);
-int w = Math.max(MIN_NODE_WIDTH, (child.getLabel() != null ? child.getLabel().length() : 0) * CHAR_WIDTH + LABEL_PADDING);
-elkChild.setWidth(w);
-elkChild.setHeight(NODE_HEIGHT);
-elkNodeMap.put(child.getId(), elkChild);
-nodeColors.put(child.getId(), colorForType(child.getType()));
-}
-}
-// Then, create non-compound, non-child nodes
-for (RouteNode rn : routeNodeMap.values()) {
-if (!elkNodeMap.containsKey(rn.getId())) {
-ElkNode elkNode = factory.createElkNode();
-elkNode.setIdentifier(rn.getId());
-elkNode.setParent(rootNode);
-int w = Math.max(MIN_NODE_WIDTH, (rn.getLabel() != null ? rn.getLabel().length() : 0) * CHAR_WIDTH + LABEL_PADDING);
-elkNode.setWidth(w);
-elkNode.setHeight(NODE_HEIGHT);
-elkNodeMap.put(rn.getId(), elkNode);
-nodeColors.put(rn.getId(), colorForType(rn.getType()));
+// Process top-level nodes from the graph
+if (graph.getNodes() != null) {
+for (RouteNode rn : graph.getNodes()) {
+if (!elkNodeMap.containsKey(rn.getId())) {
+createElkNodeRecursive(rn, rootNode, factory, elkNodeMap, nodeColors,
+compoundNodeIds, childNodeIds);
+}
}
}
@@ -270,64 +230,21 @@ public class ElkDiagramRenderer implements DiagramRenderer {
RecursiveGraphLayoutEngine engine = new RecursiveGraphLayoutEngine();
engine.layout(rootNode, new BasicProgressMonitor());
-// Extract results
+// Extract results — only top-level nodes (children collected recursively)
List<PositionedNode> positionedNodes = new ArrayList<>();
Map<String, CompoundInfo> compoundInfos = new HashMap<>();
-for (RouteNode rn : routeNodeMap.values()) {
-if (childToParent.containsKey(rn.getId())) {
-// Skip children -- they are collected under their parent
-continue;
-}
-ElkNode elkNode = elkNodeMap.get(rn.getId());
-if (elkNode == null) continue;
-if (compoundNodeIds.contains(rn.getId())) {
-// Compound node: collect children
-List<PositionedNode> children = new ArrayList<>();
-if (rn.getChildren() != null) {
-for (RouteNode child : rn.getChildren()) {
-ElkNode childElk = elkNodeMap.get(child.getId());
-if (childElk != null) {
-children.add(new PositionedNode(
-child.getId(),
-child.getLabel() != null ? child.getLabel() : "",
-child.getType() != null ? child.getType().name() : "UNKNOWN",
-elkNode.getX() + childElk.getX(),
-elkNode.getY() + childElk.getY(),
-childElk.getWidth(),
-childElk.getHeight(),
-List.of()
-));
-}
-}
+if (graph.getNodes() != null) {
+for (RouteNode rn : graph.getNodes()) {
+if (childNodeIds.contains(rn.getId())) {
+// Skip — collected under its parent compound
+continue;
+}
+ElkNode elkNode = elkNodeMap.get(rn.getId());
+if (elkNode == null) continue;
-positionedNodes.add(new PositionedNode(
-rn.getId(),
-rn.getLabel() != null ? rn.getLabel() : "",
-rn.getType() != null ? rn.getType().name() : "UNKNOWN",
-elkNode.getX(),
-elkNode.getY(),
-elkNode.getWidth(),
-elkNode.getHeight(),
-children
-));
-compoundInfos.put(rn.getId(), new CompoundInfo(
-rn.getId(), colorForType(rn.getType())));
-} else {
-positionedNodes.add(new PositionedNode(
-rn.getId(),
-rn.getLabel() != null ? rn.getLabel() : "",
-rn.getType() != null ? rn.getType().name() : "UNKNOWN",
-elkNode.getX(),
-elkNode.getY(),
-elkNode.getWidth(),
-elkNode.getHeight(),
-List.of()
-));
+positionedNodes.add(extractPositionedNode(rn, elkNode, elkNodeMap,
+compoundNodeIds, compoundInfos, rootNode));
}
}
@@ -481,6 +398,98 @@ public class ElkDiagramRenderer implements DiagramRenderer {
}
}
// ----------------------------------------------------------------
// Recursive node building
// ----------------------------------------------------------------
/** Index a RouteNode and all its descendants into the map. */
private void indexNodeRecursive(RouteNode node, Map<String, RouteNode> map) {
map.put(node.getId(), node);
if (node.getChildren() != null) {
for (RouteNode child : node.getChildren()) {
indexNodeRecursive(child, map);
}
}
}
/**
* Recursively create ELK nodes. Compound nodes become ELK containers
* with their children nested inside. Non-compound nodes become leaf nodes.
*/
private void createElkNodeRecursive(
RouteNode rn, ElkNode parentElk, ElkGraphFactory factory,
Map<String, ElkNode> elkNodeMap, Map<String, Color> nodeColors,
Set<String> compoundNodeIds, Set<String> childNodeIds) {
boolean isCompound = rn.getType() != null && COMPOUND_TYPES.contains(rn.getType())
&& rn.getChildren() != null && !rn.getChildren().isEmpty();
ElkNode elkNode = factory.createElkNode();
elkNode.setIdentifier(rn.getId());
elkNode.setParent(parentElk);
if (isCompound) {
compoundNodeIds.add(rn.getId());
elkNode.setWidth(200);
elkNode.setHeight(100);
elkNode.setProperty(CoreOptions.ALGORITHM, "org.eclipse.elk.layered");
elkNode.setProperty(CoreOptions.DIRECTION, Direction.DOWN);
elkNode.setProperty(CoreOptions.SPACING_NODE_NODE, NODE_SPACING * 0.5);
elkNode.setProperty(CoreOptions.SPACING_EDGE_NODE, EDGE_SPACING * 0.5);
elkNode.setProperty(CoreOptions.PADDING,
new org.eclipse.elk.core.math.ElkPadding(COMPOUND_TOP_PADDING,
COMPOUND_SIDE_PADDING, COMPOUND_SIDE_PADDING, COMPOUND_SIDE_PADDING));
// Recursively create children inside this compound
for (RouteNode child : rn.getChildren()) {
childNodeIds.add(child.getId());
createElkNodeRecursive(child, elkNode, factory, elkNodeMap, nodeColors,
compoundNodeIds, childNodeIds);
}
} else {
elkNode.setWidth(NODE_WIDTH);
elkNode.setHeight(NODE_HEIGHT);
}
elkNodeMap.put(rn.getId(), elkNode);
nodeColors.put(rn.getId(), colorForType(rn.getType()));
}
/**
* Recursively extract a PositionedNode from the ELK layout result.
* Compound nodes include their children with absolute coordinates.
*/
private PositionedNode extractPositionedNode(
RouteNode rn, ElkNode elkNode, Map<String, ElkNode> elkNodeMap,
Set<String> compoundNodeIds, Map<String, CompoundInfo> compoundInfos,
ElkNode rootNode) {
double absX = getAbsoluteX(elkNode, rootNode);
double absY = getAbsoluteY(elkNode, rootNode);
List<PositionedNode> children = List.of();
if (compoundNodeIds.contains(rn.getId()) && rn.getChildren() != null) {
children = new ArrayList<>();
for (RouteNode child : rn.getChildren()) {
ElkNode childElk = elkNodeMap.get(child.getId());
if (childElk != null) {
children.add(extractPositionedNode(child, childElk, elkNodeMap,
compoundNodeIds, compoundInfos, rootNode));
}
}
compoundInfos.put(rn.getId(), new CompoundInfo(rn.getId(), colorForType(rn.getType())));
}
return new PositionedNode(
rn.getId(),
rn.getLabel() != null ? rn.getLabel() : "",
rn.getType() != null ? rn.getType().name() : "UNKNOWN",
absX, absY,
elkNode.getWidth(), elkNode.getHeight(),
children
);
}
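getAbsoluteX and getAbsoluteY are used above but defined outside this hunk. The idea, summing parent offsets up to the root, can be sketched standalone; Box here is a hypothetical stand-in for an ElkNode's parent pointer and x offset:

```java
public class AbsoluteCoordDemo {
    // Hypothetical minimal stand-in for an ElkNode: parent pointer plus relative x
    record Box(Box parent, double x) {}

    // Sum the offsets of a node and its ancestors up to (excluding) the root,
    // which is the idea behind a getAbsoluteX-style helper
    static double absoluteX(Box node, Box root) {
        double abs = 0;
        for (Box b = node; b != null && b != root; b = b.parent()) {
            abs += b.x();
        }
        return abs;
    }

    public static void main(String[] args) {
        Box root = new Box(null, 0);
        Box compound = new Box(root, 100);  // container placed at x=100 in the root
        Box child = new Box(compound, 20);  // child at x=20 relative to the container
        System.out.println(absoluteX(child, root)); // 120.0
    }
}
```

Because ELK positions children relative to their container, the recursive extraction above must apply exactly this accumulation before handing coordinates to the frontend.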
// ----------------------------------------------------------------
// ELK graph helpers
// ----------------------------------------------------------------
@@ -539,8 +548,8 @@ public class ElkDiagramRenderer implements DiagramRenderer {
List<PositionedNode> all = new ArrayList<>();
for (PositionedNode n : nodes) {
all.add(n);
- if (n.children() != null) {
- all.addAll(n.children());
+ if (n.children() != null && !n.children().isEmpty()) {
+ all.addAll(allNodes(n.children()));
}
}
return all;
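The fix above matters once compounds nest: the old one-level flatten missed grandchildren (processors inside WHEN inside CHOICE), while the recursive call collects every level. A minimal standalone sketch, with a hypothetical Node record in place of PositionedNode:

```java
import java.util.ArrayList;
import java.util.List;

public class FlattenDemo {
    // Hypothetical stand-in for PositionedNode: id plus nested children
    record Node(String id, List<Node> children) {}

    // Recursive flatten, mirroring the fixed allNodes(): grandchildren are reached too
    static List<Node> allNodes(List<Node> nodes) {
        List<Node> all = new ArrayList<>();
        for (Node n : nodes) {
            all.add(n);
            if (n.children() != null && !n.children().isEmpty()) {
                all.addAll(allNodes(n.children()));
            }
        }
        return all;
    }

    public static void main(String[] args) {
        Node log = new Node("log", List.of());
        Node when = new Node("when", List.of(log));      // WHEN nested in CHOICE
        Node choice = new Node("choice", List.of(when));
        // One-level flattening would return 2 nodes; recursion returns all 3
        System.out.println(allNodes(List.of(choice)).size());
    }
}
```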


@@ -0,0 +1,11 @@
package com.cameleer3.server.app.dto;
/**
* Request body for command acknowledgment from agents.
* Contains the result status and message of the command execution.
*
* @param status "SUCCESS" or "FAILURE"
* @param message human-readable description of the result
* @param data optional structured JSON data returned by the agent (e.g. expression evaluation results)
*/
public record CommandAckRequest(String status, String message, String data) {}


@@ -0,0 +1,13 @@
package com.cameleer3.server.app.dto;
import io.swagger.v3.oas.annotations.media.Schema;
@Schema(description = "Application log entry from OpenSearch")
public record LogEntryResponse(
@Schema(description = "Log timestamp (ISO-8601)") String timestamp,
@Schema(description = "Log level (INFO, WARN, ERROR, DEBUG)") String level,
@Schema(description = "Logger name") String loggerName,
@Schema(description = "Log message") String message,
@Schema(description = "Thread name") String threadName,
@Schema(description = "Stack trace (if present)") String stackTrace
) {}


@@ -0,0 +1,11 @@
package com.cameleer3.server.app.dto;
/**
* Request body for testing a tap expression against sample data via a live agent.
*
* @param expression the expression to evaluate (e.g. Simple, JSONPath, XPath)
* @param language the expression language identifier
* @param body sample message body to evaluate the expression against
* @param target what the expression targets (e.g. "body", "header", "property")
*/
public record TestExpressionRequest(String expression, String language, String body, String target) {}


@@ -0,0 +1,9 @@
package com.cameleer3.server.app.dto;
/**
* Response from testing a tap expression against sample data.
*
* @param result the evaluation result (null if an error occurred)
* @param error error message if evaluation failed (null on success)
*/
public record TestExpressionResponse(String result, String error) {}


@@ -0,0 +1,57 @@
package com.cameleer3.server.app.interceptor;
import com.cameleer3.server.core.admin.AuditCategory;
import com.cameleer3.server.core.admin.AuditResult;
import com.cameleer3.server.core.admin.AuditService;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;
import org.springframework.stereotype.Component;
import org.springframework.web.servlet.HandlerInterceptor;
import java.util.Map;
import java.util.Set;
/**
* Safety-net audit interceptor that logs a basic entry for any state-changing
* request (POST/PUT/DELETE) that was not explicitly audited by the controller.
* <p>
* Controllers that call {@link AuditService#log} set the {@code audit.logged}
* request attribute, which this interceptor checks to avoid double-recording.
*/
@Component
public class AuditInterceptor implements HandlerInterceptor {
private static final Set<String> AUDITABLE_METHODS = Set.of("POST", "PUT", "DELETE");
private static final Set<String> EXCLUDED_PATHS = Set.of("/api/v1/search/executions");
private final AuditService auditService;
public AuditInterceptor(AuditService auditService) {
this.auditService = auditService;
}
@Override
public void afterCompletion(HttpServletRequest request, HttpServletResponse response,
Object handler, Exception ex) {
if (!AUDITABLE_METHODS.contains(request.getMethod())) {
return;
}
if (Boolean.TRUE.equals(request.getAttribute("audit.logged"))) {
return;
}
String path = request.getRequestURI();
if (EXCLUDED_PATHS.contains(path)) {
return;
}
AuditResult result = response.getStatus() < 400 ? AuditResult.SUCCESS : AuditResult.FAILURE;
auditService.log(
"HTTP " + request.getMethod() + " " + path,
AuditCategory.INFRA,
path,
Map.of("status", response.getStatus()),
result,
request);
}
}
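The contract described in the Javadoc can be exercised without a servlet container. A minimal sketch of the decision logic, with a plain Map standing in for the HttpServletRequest attributes; the method name here is illustrative, only the "audit.logged" key and the POST/PUT/DELETE set come from the code above:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

public class AuditFlagDemo {
    private static final Set<String> AUDITABLE = Set.of("POST", "PUT", "DELETE");

    // Stand-in for the interceptor's decision: the real code reads the
    // "audit.logged" attribute set by controllers that called AuditService.log
    static boolean safetyNetLogs(String method, Map<String, Object> requestAttributes) {
        if (!AUDITABLE.contains(method)) return false;
        return !Boolean.TRUE.equals(requestAttributes.get("audit.logged"));
    }

    public static void main(String[] args) {
        Map<String, Object> attrs = new HashMap<>();
        System.out.println(safetyNetLogs("GET", attrs));   // false: reads are never audited here
        System.out.println(safetyNetLogs("POST", attrs));  // true: no explicit audit entry yet
        attrs.put("audit.logged", Boolean.TRUE);           // controller already logged
        System.out.println(safetyNetLogs("POST", attrs));  // false: avoid double-recording
    }
}
```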


@@ -6,6 +6,8 @@ import com.cameleer3.server.core.search.SearchResult;
import com.cameleer3.server.core.storage.SearchIndex;
import com.cameleer3.server.core.storage.model.ExecutionDocument;
import com.cameleer3.server.core.storage.model.ExecutionDocument.ProcessorDoc;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import jakarta.annotation.PostConstruct;
import org.opensearch.client.json.JsonData;
import org.opensearch.client.opensearch.OpenSearchClient;
@@ -33,6 +35,8 @@ public class OpenSearchIndex implements SearchIndex {
private static final Logger log = LoggerFactory.getLogger(OpenSearchIndex.class);
private static final DateTimeFormatter DAY_FMT = DateTimeFormatter.ofPattern("yyyy-MM-dd")
.withZone(ZoneOffset.UTC);
private static final ObjectMapper JSON = new ObjectMapper();
private static final TypeReference<Map<String, String>> STR_MAP = new TypeReference<>() {};
private final OpenSearchClient client;
private final String indexPrefix;
@@ -125,6 +129,12 @@ public class OpenSearchIndex implements SearchIndex {
}
}
private static final List<String> HIGHLIGHT_FIELDS = List.of(
"error_message", "attributes_text",
"processors.input_body", "processors.output_body",
"processors.input_headers", "processors.output_headers",
"processors.attributes_text");
private org.opensearch.client.opensearch.core.SearchRequest buildSearchRequest(
SearchRequest request, int size) {
return org.opensearch.client.opensearch.core.SearchRequest.of(b -> {
@@ -137,6 +147,17 @@ public class OpenSearchIndex implements SearchIndex {
.field(request.sortColumn())
.order("asc".equalsIgnoreCase(request.sortDir())
? SortOrder.Asc : SortOrder.Desc)));
// Add highlight when full-text search is active
if (request.text() != null && !request.text().isBlank()) {
b.highlight(h -> {
for (String field : HIGHLIGHT_FIELDS) {
h.fields(field, hf -> hf
.fragmentSize(120)
.numberOfFragments(1));
}
return h;
});
}
return b;
});
}
@@ -166,6 +187,8 @@ public class OpenSearchIndex implements SearchIndex {
filter.add(termQuery("agent_id.keyword", request.agentId()));
if (request.correlationId() != null)
filter.add(termQuery("correlation_id.keyword", request.correlationId()));
if (request.application() != null && !request.application().isBlank())
filter.add(termQuery("application_name.keyword", request.application()));
// Full-text search across all fields + nested processor fields
if (request.text() != null && !request.text().isBlank()) {
@@ -176,11 +199,13 @@ public class OpenSearchIndex implements SearchIndex {
// Search top-level text fields (analyzed match + wildcard for substring)
textQueries.add(Query.of(q -> q.multiMatch(m -> m
.query(text)
- .fields("error_message", "error_stacktrace"))));
+ .fields("error_message", "error_stacktrace", "attributes_text"))));
textQueries.add(Query.of(q -> q.wildcard(w -> w
.field("error_message").value(wildcard).caseInsensitive(true))));
textQueries.add(Query.of(q -> q.wildcard(w -> w
.field("error_stacktrace").value(wildcard).caseInsensitive(true))));
textQueries.add(Query.of(q -> q.wildcard(w -> w
.field("attributes_text").value(wildcard).caseInsensitive(true))));
// Search nested processor fields (analyzed match + wildcard)
textQueries.add(Query.of(q -> q.nested(n -> n
@@ -189,14 +214,16 @@ public class OpenSearchIndex implements SearchIndex {
.query(text)
.fields("processors.input_body", "processors.output_body",
"processors.input_headers", "processors.output_headers",
- "processors.error_message", "processors.error_stacktrace"))))));
+ "processors.error_message", "processors.error_stacktrace",
+ "processors.attributes_text"))))));
textQueries.add(Query.of(q -> q.nested(n -> n
.path("processors")
.query(nq -> nq.bool(nb -> nb.should(
wildcardQuery("processors.input_body", wildcard),
wildcardQuery("processors.output_body", wildcard),
wildcardQuery("processors.input_headers", wildcard),
- wildcardQuery("processors.output_headers", wildcard)
+ wildcardQuery("processors.output_headers", wildcard),
+ wildcardQuery("processors.attributes_text", wildcard)
).minimumShouldMatch("1"))))));
// Also try keyword fields for exact matches
@@ -297,6 +324,11 @@ public class OpenSearchIndex implements SearchIndex {
map.put("duration_ms", doc.durationMs());
map.put("error_message", doc.errorMessage());
map.put("error_stacktrace", doc.errorStacktrace());
if (doc.attributes() != null) {
Map<String, String> attrs = parseAttributesJson(doc.attributes());
map.put("attributes", attrs);
map.put("attributes_text", flattenAttributes(attrs));
}
if (doc.processors() != null) {
map.put("processors", doc.processors().stream().map(p -> {
Map<String, Object> pm = new LinkedHashMap<>();
@@ -309,6 +341,11 @@ public class OpenSearchIndex implements SearchIndex {
pm.put("output_body", p.outputBody());
pm.put("input_headers", p.inputHeaders());
pm.put("output_headers", p.outputHeaders());
if (p.attributes() != null) {
Map<String, String> pAttrs = parseAttributesJson(p.attributes());
pm.put("attributes", pAttrs);
pm.put("attributes_text", flattenAttributes(pAttrs));
}
return pm;
}).toList());
}
@@ -319,6 +356,22 @@ public class OpenSearchIndex implements SearchIndex {
private ExecutionSummary hitToSummary(Hit<Map> hit) {
Map<String, Object> src = hit.source();
if (src == null) return null;
@SuppressWarnings("unchecked")
Map<String, String> attributes = src.get("attributes") instanceof Map
? new LinkedHashMap<>((Map<String, String>) src.get("attributes")) : null;
// Merge processor-level attributes (execution-level takes precedence)
if (src.get("processors") instanceof List<?> procs) {
for (Object pObj : procs) {
if (pObj instanceof Map<?, ?> pm && pm.get("attributes") instanceof Map<?, ?> pa) {
if (attributes == null) attributes = new LinkedHashMap<>();
for (var entry : pa.entrySet()) {
attributes.putIfAbsent(
String.valueOf(entry.getKey()),
String.valueOf(entry.getValue()));
}
}
}
}
return new ExecutionSummary(
(String) src.get("execution_id"),
(String) src.get("route_id"),
@@ -330,7 +383,35 @@ public class OpenSearchIndex implements SearchIndex {
src.get("duration_ms") != null ? ((Number) src.get("duration_ms")).longValue() : 0L,
(String) src.get("correlation_id"),
(String) src.get("error_message"),
- null // diagramContentHash not stored in index
+ null, // diagramContentHash not stored in index
+ extractHighlight(hit),
+ attributes
);
}
private String extractHighlight(Hit<Map> hit) {
if (hit.highlight() == null || hit.highlight().isEmpty()) return null;
for (List<String> fragments : hit.highlight().values()) {
if (fragments != null && !fragments.isEmpty()) {
return fragments.get(0);
}
}
return null;
}
private static Map<String, String> parseAttributesJson(String json) {
if (json == null || json.isBlank()) return null;
try {
return JSON.readValue(json, STR_MAP);
} catch (Exception e) {
return null;
}
}
private static String flattenAttributes(Map<String, String> attrs) {
if (attrs == null || attrs.isEmpty()) return "";
return attrs.entrySet().stream()
.map(e -> e.getKey() + "=" + e.getValue())
.collect(Collectors.joining(" "));
}
}
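The parse-and-flatten pair above turns a JSON attribute map into one space-separated text field that the full-text and wildcard queries can match. The flattening step in isolation, as a standalone sketch:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class AttributeFlattenDemo {
    // Mirrors flattenAttributes(): join entries as "key=value" pairs separated
    // by single spaces, so the result can be indexed as one analyzable text field
    static String flatten(Map<String, String> attrs) {
        if (attrs == null || attrs.isEmpty()) return "";
        return attrs.entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining(" "));
    }

    public static void main(String[] args) {
        Map<String, String> attrs = new LinkedHashMap<>();  // insertion order preserved
        attrs.put("orderId", "42");
        attrs.put("customer", "acme");
        System.out.println(flatten(attrs)); // orderId=42 customer=acme
    }
}
```

A search for "orderId=42" then hits the flattened attributes_text field even though the attributes were originally stored as a JSON object.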


@@ -0,0 +1,223 @@
package com.cameleer3.server.app.search;
import com.cameleer3.common.model.LogEntry;
import com.cameleer3.server.app.dto.LogEntryResponse;
import jakarta.annotation.PostConstruct;
import org.opensearch.client.json.JsonData;
import org.opensearch.client.opensearch.OpenSearchClient;
import org.opensearch.client.opensearch._types.FieldValue;
import org.opensearch.client.opensearch._types.SortOrder;
import org.opensearch.client.opensearch._types.mapping.Property;
import org.opensearch.client.opensearch._types.query_dsl.BoolQuery;
import org.opensearch.client.opensearch._types.query_dsl.Query;
import org.opensearch.client.opensearch.core.BulkRequest;
import org.opensearch.client.opensearch.core.BulkResponse;
import org.opensearch.client.opensearch.core.bulk.BulkResponseItem;
import org.opensearch.client.opensearch.indices.ExistsIndexTemplateRequest;
import org.opensearch.client.opensearch.indices.PutIndexTemplateRequest;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Repository;
import java.io.IOException;
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
@Repository
public class OpenSearchLogIndex {
private static final Logger log = LoggerFactory.getLogger(OpenSearchLogIndex.class);
private static final DateTimeFormatter DAY_FMT = DateTimeFormatter.ofPattern("yyyy-MM-dd")
.withZone(ZoneOffset.UTC);
private final OpenSearchClient client;
private final String indexPrefix;
private final int retentionDays;
public OpenSearchLogIndex(OpenSearchClient client,
@Value("${opensearch.log-index-prefix:logs-}") String indexPrefix,
@Value("${opensearch.log-retention-days:7}") int retentionDays) {
this.client = client;
this.indexPrefix = indexPrefix;
this.retentionDays = retentionDays;
}
@PostConstruct
void init() {
ensureIndexTemplate();
ensureIsmPolicy();
}
private void ensureIndexTemplate() {
String templateName = indexPrefix.replace("-", "") + "-template";
String indexPattern = indexPrefix + "*";
try {
boolean exists = client.indices().existsIndexTemplate(
ExistsIndexTemplateRequest.of(b -> b.name(templateName))).value();
if (!exists) {
client.indices().putIndexTemplate(PutIndexTemplateRequest.of(b -> b
.name(templateName)
.indexPatterns(List.of(indexPattern))
.template(t -> t
.settings(s -> s
.numberOfShards("1")
.numberOfReplicas("1"))
.mappings(m -> m
.properties("@timestamp", Property.of(p -> p.date(d -> d)))
.properties("level", Property.of(p -> p.keyword(k -> k)))
.properties("loggerName", Property.of(p -> p.keyword(k -> k)))
.properties("message", Property.of(p -> p.text(tx -> tx)))
.properties("threadName", Property.of(p -> p.keyword(k -> k)))
.properties("stackTrace", Property.of(p -> p.text(tx -> tx)))
.properties("agentId", Property.of(p -> p.keyword(k -> k)))
.properties("application", Property.of(p -> p.keyword(k -> k)))
.properties("exchangeId", Property.of(p -> p.keyword(k -> k)))))));
log.info("OpenSearch log index template '{}' created", templateName);
}
} catch (IOException e) {
log.error("Failed to create log index template", e);
}
}
private void ensureIsmPolicy() {
String policyId = "logs-retention";
try {
// Use the low-level REST client to manage ISM policies
var restClient = client._transport();
// Check if the ISM policy exists via a GET; create if not
// ISM is managed via the _plugins/_ism/policies API
// For now, log a reminder — ISM policy should be created via OpenSearch API or dashboard
log.info("Log retention policy: indices matching '{}*' should be deleted after {} days. " +
"Ensure ISM policy '{}' is configured in OpenSearch.", indexPrefix, retentionDays, policyId);
} catch (Exception e) {
log.warn("Could not verify ISM policy for log retention", e);
}
}
public List<LogEntryResponse> search(String application, String agentId, String level,
String query, String exchangeId,
Instant from, Instant to, int limit) {
try {
BoolQuery.Builder bool = new BoolQuery.Builder();
bool.must(Query.of(q -> q.term(t -> t.field("application").value(FieldValue.of(application)))));
if (agentId != null && !agentId.isEmpty()) {
bool.must(Query.of(q -> q.term(t -> t.field("agentId").value(FieldValue.of(agentId)))));
}
if (exchangeId != null && !exchangeId.isEmpty()) {
// Match on top-level field (new records) or MDC nested field (old records)
bool.must(Query.of(q -> q.bool(b -> b
.should(Query.of(s -> s.term(t -> t.field("exchangeId.keyword").value(FieldValue.of(exchangeId)))))
.should(Query.of(s -> s.term(t -> t.field("mdc.camel.exchangeId.keyword").value(FieldValue.of(exchangeId)))))
.minimumShouldMatch("1"))));
}
if (level != null && !level.isEmpty()) {
bool.must(Query.of(q -> q.term(t -> t.field("level").value(FieldValue.of(level.toUpperCase())))));
}
if (query != null && !query.isEmpty()) {
bool.must(Query.of(q -> q.match(m -> m.field("message").query(FieldValue.of(query)))));
}
if (from != null || to != null) {
bool.must(Query.of(q -> q.range(r -> {
r.field("@timestamp");
if (from != null) r.gte(JsonData.of(from.toString()));
if (to != null) r.lte(JsonData.of(to.toString()));
return r;
})));
}
var response = client.search(s -> s
.index(indexPrefix + "*")
.query(Query.of(q -> q.bool(bool.build())))
.sort(so -> so.field(f -> f.field("@timestamp").order(SortOrder.Desc)))
.size(limit), Map.class);
List<LogEntryResponse> results = new ArrayList<>();
for (var hit : response.hits().hits()) {
@SuppressWarnings("unchecked")
Map<String, Object> src = (Map<String, Object>) hit.source();
if (src == null) continue;
results.add(new LogEntryResponse(
str(src, "@timestamp"),
str(src, "level"),
str(src, "loggerName"),
str(src, "message"),
str(src, "threadName"),
str(src, "stackTrace")));
}
return results;
} catch (IOException e) {
log.error("Failed to search log entries for application={}", application, e);
return List.of();
}
}
private static String str(Map<String, Object> map, String key) {
Object v = map.get(key);
return v != null ? v.toString() : null;
}
public void indexBatch(String agentId, String application, List<LogEntry> entries) {
if (entries == null || entries.isEmpty()) {
return;
}
try {
BulkRequest.Builder bulkBuilder = new BulkRequest.Builder();
for (LogEntry entry : entries) {
String indexName = indexPrefix + DAY_FMT.format(
entry.getTimestamp() != null ? entry.getTimestamp() : java.time.Instant.now());
Map<String, Object> doc = toMap(entry, agentId, application);
bulkBuilder.operations(op -> op
.index(idx -> idx
.index(indexName)
.document(doc)));
}
BulkResponse response = client.bulk(bulkBuilder.build());
if (response.errors()) {
int errorCount = 0;
for (BulkResponseItem item : response.items()) {
if (item.error() != null) {
errorCount++;
if (errorCount == 1) {
log.error("Bulk log index error: {}", item.error().reason());
}
}
}
log.error("Bulk log indexing had {} error(s) out of {} entries", errorCount, entries.size());
} else {
log.debug("Indexed {} log entries for agent={}, app={}", entries.size(), agentId, application);
}
} catch (IOException e) {
log.error("Failed to bulk index {} log entries for agent={}", entries.size(), agentId, e);
}
}
private Map<String, Object> toMap(LogEntry entry, String agentId, String application) {
Map<String, Object> doc = new LinkedHashMap<>();
doc.put("@timestamp", entry.getTimestamp() != null ? entry.getTimestamp().toString() : null);
doc.put("level", entry.getLevel());
doc.put("loggerName", entry.getLoggerName());
doc.put("message", entry.getMessage());
doc.put("threadName", entry.getThreadName());
doc.put("stackTrace", entry.getStackTrace());
doc.put("mdc", entry.getMdc());
doc.put("agentId", agentId);
doc.put("application", application);
if (entry.getMdc() != null) {
String exId = entry.getMdc().get("camel.exchangeId");
if (exId != null) doc.put("exchangeId", exId);
}
return doc;
}
}
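ensureIsmPolicy above only logs a reminder; the policy itself has to be created against the ISM plugin API, e.g. via PUT _plugins/_ism/policies/logs-retention. The following is a sketch of such a policy body for the defaults in the constructor (prefix logs-, 7 days retention); the field set follows the OpenSearch ISM schema and should be verified against the running OpenSearch version:

```json
{
  "policy": {
    "description": "Delete log indices after the retention period",
    "default_state": "hot",
    "states": [
      {
        "name": "hot",
        "actions": [],
        "transitions": [
          { "state_name": "delete", "conditions": { "min_index_age": "7d" } }
        ]
      },
      {
        "name": "delete",
        "actions": [ { "delete": {} } ],
        "transitions": []
      }
    ],
    "ism_template": [
      { "index_patterns": ["logs-*"], "priority": 100 }
    ]
  }
}
```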


@@ -159,6 +159,9 @@ public class OidcAuthController {
throw e;
} catch (Exception e) {
log.error("OIDC callback failed: {}", e.getMessage(), e);
auditService.log("unknown", "login_oidc", AuditCategory.AUTH, null,
Map.of("reason", e.getMessage() != null ? e.getMessage() : "unknown"),
AuditResult.FAILURE, httpRequest);
throw new ResponseStatusException(HttpStatus.UNAUTHORIZED,
"OIDC authentication failed: " + e.getMessage());
}


@@ -77,6 +77,10 @@ public class SecurityConfig {
.requestMatchers(HttpMethod.GET, "/api/v1/search/**").hasAnyRole("VIEWER", "OPERATOR", "ADMIN", "AGENT")
.requestMatchers(HttpMethod.POST, "/api/v1/search/**").hasAnyRole("VIEWER", "OPERATOR", "ADMIN")
// Application config endpoints
.requestMatchers(HttpMethod.GET, "/api/v1/config/*").hasAnyRole("VIEWER", "OPERATOR", "ADMIN", "AGENT")
.requestMatchers(HttpMethod.PUT, "/api/v1/config/*").hasAnyRole("OPERATOR", "ADMIN")
// Read-only data endpoints — viewer+
.requestMatchers(HttpMethod.GET, "/api/v1/executions/**").hasAnyRole("VIEWER", "OPERATOR", "ADMIN")
.requestMatchers(HttpMethod.GET, "/api/v1/diagrams/**").hasAnyRole("VIEWER", "OPERATOR", "ADMIN")


@@ -123,7 +123,8 @@ public class UiAuthController {
@ApiResponse(responseCode = "200", description = "Token refreshed")
@ApiResponse(responseCode = "401", description = "Invalid refresh token",
content = @Content(schema = @Schema(implementation = ErrorResponse.class)))
- public ResponseEntity<AuthTokenResponse> refresh(@RequestBody RefreshRequest request) {
+ public ResponseEntity<AuthTokenResponse> refresh(@RequestBody RefreshRequest request,
+ HttpServletRequest httpRequest) {
try {
JwtValidationResult result = jwtService.validateRefreshToken(request.refreshToken());
if (!result.subject().startsWith("user:")) {
@@ -138,6 +139,7 @@ public class UiAuthController {
String displayName = userRepository.findById(result.subject())
.map(UserInfo::displayName)
.orElse(result.subject());
auditService.log(result.subject(), "token_refresh", AuditCategory.AUTH, null, null, AuditResult.SUCCESS, httpRequest);
return ResponseEntity.ok(new AuthTokenResponse(accessToken, refreshToken, displayName, null));
} catch (ResponseStatusException e) {
throw e;


@@ -0,0 +1,77 @@
package com.cameleer3.server.app.storage;
import com.cameleer3.common.model.ApplicationConfig;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;
import java.util.List;
import java.util.Optional;
@Repository
public class PostgresApplicationConfigRepository {
private final JdbcTemplate jdbc;
private final ObjectMapper objectMapper;
public PostgresApplicationConfigRepository(JdbcTemplate jdbc, ObjectMapper objectMapper) {
this.jdbc = jdbc;
this.objectMapper = objectMapper;
}
public List<ApplicationConfig> findAll() {
return jdbc.query(
"SELECT config_val, version, updated_at FROM application_config ORDER BY application",
(rs, rowNum) -> {
try {
ApplicationConfig cfg = objectMapper.readValue(rs.getString("config_val"), ApplicationConfig.class);
cfg.setVersion(rs.getInt("version"));
cfg.setUpdatedAt(rs.getTimestamp("updated_at").toInstant());
return cfg;
} catch (JsonProcessingException e) {
throw new RuntimeException("Failed to deserialize application config", e);
}
});
}
public Optional<ApplicationConfig> findByApplication(String application) {
List<ApplicationConfig> results = jdbc.query(
"SELECT config_val, version, updated_at FROM application_config WHERE application = ?",
(rs, rowNum) -> {
try {
ApplicationConfig cfg = objectMapper.readValue(rs.getString("config_val"), ApplicationConfig.class);
cfg.setVersion(rs.getInt("version"));
cfg.setUpdatedAt(rs.getTimestamp("updated_at").toInstant());
return cfg;
} catch (JsonProcessingException e) {
throw new RuntimeException("Failed to deserialize application config", e);
}
},
application);
return results.isEmpty() ? Optional.empty() : Optional.of(results.get(0));
}
public ApplicationConfig save(String application, ApplicationConfig config, String updatedBy) {
String json;
try {
json = objectMapper.writeValueAsString(config);
} catch (JsonProcessingException e) {
throw new RuntimeException("Failed to serialize application config", e);
}
// Upsert: insert or update, auto-increment version
int updated = jdbc.update("""
INSERT INTO application_config (application, config_val, version, updated_at, updated_by)
VALUES (?, ?::jsonb, 1, now(), ?)
ON CONFLICT (application) DO UPDATE SET
config_val = EXCLUDED.config_val,
version = application_config.version + 1,
updated_at = now(),
updated_by = EXCLUDED.updated_by
""",
application, json, updatedBy);
return findByApplication(application).orElseThrow();
}
}
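The version bookkeeping in the upsert can be read in isolation: the first write for an application inserts the row with version 1, and every later write bumps the stored version instead of resetting it to the literal 1 in the VALUES clause. A sketch with illustrative values ('billing' and 'alice' are hypothetical; column names as in the statement above):

```sql
-- First execution for 'billing': row is inserted with version = 1
INSERT INTO application_config (application, config_val, version, updated_at, updated_by)
VALUES ('billing', '{"sampling": 1.0}'::jsonb, 1, now(), 'alice')
ON CONFLICT (application) DO UPDATE SET
    config_val = EXCLUDED.config_val,
    version    = application_config.version + 1,  -- bump the stored version, ignore the literal 1
    updated_at = now(),
    updated_by = EXCLUDED.updated_by;
-- Running the same statement again updates the row and leaves version = 2
```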


@@ -16,6 +16,7 @@ import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.HexFormat;
import java.util.List;
import java.util.Map;
@@ -33,8 +34,8 @@ public class PostgresDiagramStore implements DiagramStore {
private static final Logger log = LoggerFactory.getLogger(PostgresDiagramStore.class);
private static final String INSERT_SQL = """
- INSERT INTO route_diagrams (content_hash, route_id, agent_id, definition)
- VALUES (?, ?, ?, ?::jsonb)
+ INSERT INTO route_diagrams (content_hash, route_id, agent_id, application_name, definition)
+ VALUES (?, ?, ?, ?, ?::jsonb)
ON CONFLICT (content_hash) DO NOTHING
""";
@@ -62,11 +63,12 @@ public class PostgresDiagramStore implements DiagramStore {
try {
RouteGraph graph = diagram.graph();
String agentId = diagram.agentId() != null ? diagram.agentId() : "";
String applicationName = diagram.applicationName() != null ? diagram.applicationName() : "";
String json = objectMapper.writeValueAsString(graph);
String contentHash = sha256Hex(json);
String routeId = graph.getRouteId() != null ? graph.getRouteId() : "";
- jdbcTemplate.update(INSERT_SQL, contentHash, routeId, agentId, json);
+ jdbcTemplate.update(INSERT_SQL, contentHash, routeId, agentId, applicationName, json);
log.debug("Stored diagram for route={} agent={} with hash={}", routeId, agentId, contentHash);
} catch (JsonProcessingException e) {
throw new RuntimeException("Failed to serialize RouteGraph to JSON", e);
@@ -116,6 +118,21 @@ public class PostgresDiagramStore implements DiagramStore {
return Optional.of((String) rows.get(0).get("content_hash"));
}
@Override
public Map<String, String> findProcessorRouteMapping(String applicationName) {
Map<String, String> mapping = new HashMap<>();
jdbcTemplate.query("""
SELECT DISTINCT rd.route_id, node_elem->>'id' AS processor_id
FROM route_diagrams rd,
jsonb_array_elements(rd.definition::jsonb->'nodes') AS node_elem
WHERE rd.application_name = ?
AND node_elem->>'id' IS NOT NULL
""",
rs -> { mapping.put(rs.getString("processor_id"), rs.getString("route_id")); },
applicationName);
return mapping;
}
static String sha256Hex(String input) {
try {
MessageDigest digest = MessageDigest.getInstance("SHA-256");


@@ -27,8 +27,9 @@ public class PostgresExecutionStore implements ExecutionStore {
INSERT INTO executions (execution_id, route_id, agent_id, application_name,
status, correlation_id, exchange_id, start_time, end_time,
duration_ms, error_message, error_stacktrace, diagram_content_hash,
- created_at, updated_at)
- VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, now(), now())
+ engine_level, input_body, output_body, input_headers, output_headers,
+ attributes, created_at, updated_at)
+ VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?::jsonb, ?::jsonb, ?::jsonb, now(), now())
ON CONFLICT (execution_id, start_time) DO UPDATE SET
status = CASE
WHEN EXCLUDED.status IN ('COMPLETED', 'FAILED')
@@ -42,6 +43,12 @@ public class PostgresExecutionStore implements ExecutionStore {
error_message = COALESCE(EXCLUDED.error_message, executions.error_message),
error_stacktrace = COALESCE(EXCLUDED.error_stacktrace, executions.error_stacktrace),
diagram_content_hash = COALESCE(EXCLUDED.diagram_content_hash, executions.diagram_content_hash),
engine_level = COALESCE(EXCLUDED.engine_level, executions.engine_level),
input_body = COALESCE(EXCLUDED.input_body, executions.input_body),
output_body = COALESCE(EXCLUDED.output_body, executions.output_body),
input_headers = COALESCE(EXCLUDED.input_headers, executions.input_headers),
output_headers = COALESCE(EXCLUDED.output_headers, executions.output_headers),
attributes = COALESCE(EXCLUDED.attributes, executions.attributes),
updated_at = now()
""",
execution.executionId(), execution.routeId(), execution.agentId(),
@@ -50,7 +57,11 @@ public class PostgresExecutionStore implements ExecutionStore {
Timestamp.from(execution.startTime()),
execution.endTime() != null ? Timestamp.from(execution.endTime()) : null,
execution.durationMs(), execution.errorMessage(),
execution.errorStacktrace(), execution.diagramContentHash());
execution.errorStacktrace(), execution.diagramContentHash(),
execution.engineLevel(),
execution.inputBody(), execution.outputBody(),
execution.inputHeaders(), execution.outputHeaders(),
execution.attributes());
}
@Override
@@ -59,10 +70,10 @@ public class PostgresExecutionStore implements ExecutionStore {
List<ProcessorRecord> processors) {
jdbc.batchUpdate("""
INSERT INTO processor_executions (execution_id, processor_id, processor_type,
diagram_node_id, application_name, route_id, depth, parent_processor_id,
application_name, route_id, depth, parent_processor_id,
status, start_time, end_time, duration_ms, error_message, error_stacktrace,
input_body, output_body, input_headers, output_headers)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?::jsonb, ?::jsonb)
input_body, output_body, input_headers, output_headers, attributes)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?::jsonb, ?::jsonb, ?::jsonb)
ON CONFLICT (execution_id, processor_id, start_time) DO UPDATE SET
status = EXCLUDED.status,
end_time = COALESCE(EXCLUDED.end_time, processor_executions.end_time),
@@ -72,16 +83,18 @@ public class PostgresExecutionStore implements ExecutionStore {
input_body = COALESCE(EXCLUDED.input_body, processor_executions.input_body),
output_body = COALESCE(EXCLUDED.output_body, processor_executions.output_body),
input_headers = COALESCE(EXCLUDED.input_headers, processor_executions.input_headers),
output_headers = COALESCE(EXCLUDED.output_headers, processor_executions.output_headers)
output_headers = COALESCE(EXCLUDED.output_headers, processor_executions.output_headers),
attributes = COALESCE(EXCLUDED.attributes, processor_executions.attributes)
""",
processors.stream().map(p -> new Object[]{
p.executionId(), p.processorId(), p.processorType(),
p.diagramNodeId(), p.applicationName(), p.routeId(),
p.applicationName(), p.routeId(),
p.depth(), p.parentProcessorId(), p.status(),
Timestamp.from(p.startTime()),
p.endTime() != null ? Timestamp.from(p.endTime()) : null,
p.durationMs(), p.errorMessage(), p.errorStacktrace(),
p.inputBody(), p.outputBody(), p.inputHeaders(), p.outputHeaders()
p.inputBody(), p.outputBody(), p.inputHeaders(), p.outputHeaders(),
p.attributes()
}).toList());
}
@@ -109,12 +122,16 @@ public class PostgresExecutionStore implements ExecutionStore {
toInstant(rs, "start_time"), toInstant(rs, "end_time"),
rs.getObject("duration_ms") != null ? rs.getLong("duration_ms") : null,
rs.getString("error_message"), rs.getString("error_stacktrace"),
rs.getString("diagram_content_hash"));
rs.getString("diagram_content_hash"),
rs.getString("engine_level"),
rs.getString("input_body"), rs.getString("output_body"),
rs.getString("input_headers"), rs.getString("output_headers"),
rs.getString("attributes"));
private static final RowMapper<ProcessorRecord> PROCESSOR_MAPPER = (rs, rowNum) ->
new ProcessorRecord(
rs.getString("execution_id"), rs.getString("processor_id"),
rs.getString("processor_type"), rs.getString("diagram_node_id"),
rs.getString("processor_type"),
rs.getString("application_name"), rs.getString("route_id"),
rs.getInt("depth"), rs.getString("parent_processor_id"),
rs.getString("status"),
@@ -122,7 +139,8 @@ public class PostgresExecutionStore implements ExecutionStore {
rs.getObject("duration_ms") != null ? rs.getLong("duration_ms") : null,
rs.getString("error_message"), rs.getString("error_stacktrace"),
rs.getString("input_body"), rs.getString("output_body"),
rs.getString("input_headers"), rs.getString("output_headers"));
rs.getString("input_headers"), rs.getString("output_headers"),
rs.getString("attributes"));
private static Instant toInstant(ResultSet rs, String column) throws SQLException {
Timestamp ts = rs.getTimestamp(column);


@@ -42,6 +42,8 @@ opensearch:
index-prefix: ${CAMELEER_OPENSEARCH_INDEX_PREFIX:executions-}
queue-size: ${CAMELEER_OPENSEARCH_QUEUE_SIZE:10000}
debounce-ms: ${CAMELEER_OPENSEARCH_DEBOUNCE_MS:2000}
log-index-prefix: ${CAMELEER_LOG_INDEX_PREFIX:logs-}
log-retention-days: ${CAMELEER_LOG_RETENTION_DAYS:7}
cameleer:
body-size-limit: ${CAMELEER_BODY_SIZE_LIMIT:16384}


@@ -0,0 +1,9 @@
-- Add engine level and route-level snapshot columns to executions table.
-- Required for REGULAR engine level where route-level payloads exist but
-- no processor execution records are created.
ALTER TABLE executions ADD COLUMN IF NOT EXISTS engine_level VARCHAR(16);
ALTER TABLE executions ADD COLUMN IF NOT EXISTS input_body TEXT;
ALTER TABLE executions ADD COLUMN IF NOT EXISTS output_body TEXT;
ALTER TABLE executions ADD COLUMN IF NOT EXISTS input_headers JSONB;
ALTER TABLE executions ADD COLUMN IF NOT EXISTS output_headers JSONB;


@@ -0,0 +1,9 @@
-- Per-application configuration for agent observability settings.
-- Agents download this at startup and receive updates via SSE CONFIG_UPDATE.
CREATE TABLE application_config (
application TEXT PRIMARY KEY,
config_val JSONB NOT NULL,
version INTEGER NOT NULL DEFAULT 1,
updated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_by TEXT
);


@@ -0,0 +1,2 @@
ALTER TABLE executions ADD COLUMN IF NOT EXISTS attributes JSONB;
ALTER TABLE processor_executions ADD COLUMN IF NOT EXISTS attributes JSONB;


@@ -0,0 +1 @@
ALTER TABLE processor_executions DROP COLUMN IF EXISTS diagram_node_id;


@@ -0,0 +1,2 @@
ALTER TABLE route_diagrams ADD COLUMN IF NOT EXISTS application_name TEXT NOT NULL DEFAULT '';
CREATE INDEX IF NOT EXISTS idx_diagrams_application ON route_diagrams (application_name);


@@ -50,11 +50,11 @@ class BackpressureIT extends AbstractPostgresIT {
// Fill the metrics buffer completely with a batch of 5
String batchJson = """
[
{"agentId":"bp-agent","timestamp":"2026-03-11T10:00:00Z","metrics":{}},
{"agentId":"bp-agent","timestamp":"2026-03-11T10:00:01Z","metrics":{}},
{"agentId":"bp-agent","timestamp":"2026-03-11T10:00:02Z","metrics":{}},
{"agentId":"bp-agent","timestamp":"2026-03-11T10:00:03Z","metrics":{}},
{"agentId":"bp-agent","timestamp":"2026-03-11T10:00:04Z","metrics":{}}
{"agentId":"bp-agent","collectedAt":"2026-03-11T10:00:00Z","metricName":"test.metric","metricValue":1.0,"tags":{}},
{"agentId":"bp-agent","collectedAt":"2026-03-11T10:00:01Z","metricName":"test.metric","metricValue":2.0,"tags":{}},
{"agentId":"bp-agent","collectedAt":"2026-03-11T10:00:02Z","metricName":"test.metric","metricValue":3.0,"tags":{}},
{"agentId":"bp-agent","collectedAt":"2026-03-11T10:00:03Z","metricName":"test.metric","metricValue":4.0,"tags":{}},
{"agentId":"bp-agent","collectedAt":"2026-03-11T10:00:04Z","metricName":"test.metric","metricValue":5.0,"tags":{}}
]
""";
@@ -66,7 +66,7 @@ class BackpressureIT extends AbstractPostgresIT {
// Now buffer should be full -- next POST should get 503
String overflowJson = """
[{"agentId":"bp-agent","timestamp":"2026-03-11T10:00:05Z","metrics":{}}]
[{"agentId":"bp-agent","collectedAt":"2026-03-11T10:00:05Z","metricName":"test.metric","metricValue":6.0,"tags":{}}]
""";
ResponseEntity<String> response = restTemplate.postForEntity(


@@ -65,7 +65,6 @@ class DetailControllerIT extends AbstractPostgresIT {
"startTime": "2026-03-10T10:00:00Z",
"endTime": "2026-03-10T10:00:01Z",
"durationMs": 1000,
"diagramNodeId": "node-root",
"inputBody": "root-input-body",
"outputBody": "root-output-body",
"inputHeaders": {"Content-Type": "application/json"},
@@ -78,7 +77,6 @@ class DetailControllerIT extends AbstractPostgresIT {
"startTime": "2026-03-10T10:00:00.100Z",
"endTime": "2026-03-10T10:00:00.200Z",
"durationMs": 100,
"diagramNodeId": "node-child1",
"inputBody": "child1-input",
"outputBody": "child1-output",
"inputHeaders": {},
@@ -91,7 +89,6 @@ class DetailControllerIT extends AbstractPostgresIT {
"startTime": "2026-03-10T10:00:00.200Z",
"endTime": "2026-03-10T10:00:00.800Z",
"durationMs": 600,
"diagramNodeId": "node-child2",
"inputBody": "child2-input",
"outputBody": "child2-output",
"inputHeaders": {},
@@ -104,7 +101,6 @@ class DetailControllerIT extends AbstractPostgresIT {
"startTime": "2026-03-10T10:00:00.300Z",
"endTime": "2026-03-10T10:00:00.700Z",
"durationMs": 400,
"diagramNodeId": "node-gc",
"inputBody": "gc-input",
"outputBody": "gc-output",
"inputHeaders": {"X-GC": "true"},


@@ -39,8 +39,7 @@ class DiagramControllerIT extends AbstractPostgresIT {
"description": "Test route",
"version": 1,
"nodes": [],
"edges": [],
"processorNodeMapping": {}
"edges": []
}
""";
@@ -60,8 +59,7 @@ class DiagramControllerIT extends AbstractPostgresIT {
"description": "Flush test",
"version": 1,
"nodes": [],
"edges": [],
"processorNodeMapping": {}
"edges": []
}
""";


@@ -53,8 +53,7 @@ class DiagramRenderControllerIT extends AbstractPostgresIT {
"edges": [
{"source": "n1", "target": "n2", "edgeType": "FLOW"},
{"source": "n2", "target": "n3", "edgeType": "FLOW"}
],
"processorNodeMapping": {}
]
}
""";


@@ -35,7 +35,8 @@ class OpenSearchIndexIT extends AbstractPostgresIT {
now, now.plusMillis(100), 100L,
"OrderNotFoundException: order-12345 not found", null,
List.of(new ProcessorDoc("proc-1", "log", "COMPLETED",
null, null, "request body with customer-99", null, null, null)));
null, null, "request body with customer-99", null, null, null, null)),
null);
searchIndex.index(doc);
refreshOpenSearchIndices();
@@ -60,7 +61,8 @@ class OpenSearchIndexIT extends AbstractPostgresIT {
"COMPLETED", null, null,
now, now.plusMillis(50), 50L, null, null,
List.of(new ProcessorDoc("proc-1", "bean", "COMPLETED",
null, null, "UniquePayloadIdentifier12345", null, null, null)));
null, null, "UniquePayloadIdentifier12345", null, null, null, null)),
null);
searchIndex.index(doc);
refreshOpenSearchIndices();


@@ -46,8 +46,7 @@ class DiagramLinkingIT extends AbstractPostgresIT {
],
"edges": [
{"source": "n1", "target": "n2", "edgeType": "FLOW"}
],
"processorNodeMapping": {}
]
}
""";


@@ -55,8 +55,7 @@ class IngestionSchemaIT extends AbstractPostgresIT {
"startTime": "2026-03-11T10:00:00Z",
"endTime": "2026-03-11T10:00:00.500Z",
"durationMs": 500,
"diagramNodeId": "node-root",
"inputBody": "root-input",
"inputBody": "root-input",
"outputBody": "root-output",
"inputHeaders": {"Content-Type": "application/json"},
"outputHeaders": {"X-Result": "ok"},
@@ -68,8 +67,7 @@ class IngestionSchemaIT extends AbstractPostgresIT {
"startTime": "2026-03-11T10:00:00.100Z",
"endTime": "2026-03-11T10:00:00.400Z",
"durationMs": 300,
"diagramNodeId": "node-child",
"inputBody": "child-input",
"inputBody": "child-input",
"outputBody": "child-output",
"children": [
{
@@ -79,8 +77,7 @@ class IngestionSchemaIT extends AbstractPostgresIT {
"startTime": "2026-03-11T10:00:00.200Z",
"endTime": "2026-03-11T10:00:00.300Z",
"durationMs": 100,
"diagramNodeId": "node-grandchild",
"children": []
"children": []
}
]
}
@@ -101,7 +98,7 @@ class IngestionSchemaIT extends AbstractPostgresIT {
// Verify processors were flattened into processor_executions
List<Map<String, Object>> processors = jdbcTemplate.queryForList(
"SELECT processor_id, processor_type, depth, parent_processor_id, " +
"diagram_node_id, input_body, output_body, input_headers " +
"input_body, output_body, input_headers " +
"FROM processor_executions WHERE execution_id = 'ex-tree-1' " +
"ORDER BY depth, processor_id");
assertThat(processors).hasSize(3);
@@ -110,7 +107,6 @@ class IngestionSchemaIT extends AbstractPostgresIT {
assertThat(processors.get(0).get("processor_id")).isEqualTo("root-proc");
assertThat(((Number) processors.get(0).get("depth")).intValue()).isEqualTo(0);
assertThat(processors.get(0).get("parent_processor_id")).isNull();
assertThat(processors.get(0).get("diagram_node_id")).isEqualTo("node-root");
assertThat(processors.get(0).get("input_body")).isEqualTo("root-input");
assertThat(processors.get(0).get("output_body")).isEqualTo("root-output");
assertThat(processors.get(0).get("input_headers").toString()).contains("Content-Type");
@@ -119,7 +115,6 @@ class IngestionSchemaIT extends AbstractPostgresIT {
assertThat(processors.get(1).get("processor_id")).isEqualTo("child-proc");
assertThat(((Number) processors.get(1).get("depth")).intValue()).isEqualTo(1);
assertThat(processors.get(1).get("parent_processor_id")).isEqualTo("root-proc");
assertThat(processors.get(1).get("diagram_node_id")).isEqualTo("node-child");
assertThat(processors.get(1).get("input_body")).isEqualTo("child-input");
assertThat(processors.get(1).get("output_body")).isEqualTo("child-output");
@@ -127,7 +122,6 @@ class IngestionSchemaIT extends AbstractPostgresIT {
assertThat(processors.get(2).get("processor_id")).isEqualTo("grandchild-proc");
assertThat(((Number) processors.get(2).get("depth")).intValue()).isEqualTo(2);
assertThat(processors.get(2).get("parent_processor_id")).isEqualTo("child-proc");
assertThat(processors.get(2).get("diagram_node_id")).isEqualTo("node-grandchild");
}
@Test


@@ -25,7 +25,8 @@ class PostgresExecutionStoreIT extends AbstractPostgresIT {
"exec-1", "route-a", "agent-1", "app-1",
"COMPLETED", "corr-1", "exchange-1",
now, now.plusMillis(100), 100L,
null, null, null);
null, null, null,
"REGULAR", null, null, null, null, null);
executionStore.upsert(record);
Optional<ExecutionRecord> found = executionStore.findById("exec-1");
@@ -33,6 +34,7 @@ class PostgresExecutionStoreIT extends AbstractPostgresIT {
assertTrue(found.isPresent());
assertEquals("exec-1", found.get().executionId());
assertEquals("COMPLETED", found.get().status());
assertEquals("REGULAR", found.get().engineLevel());
}
@Test
@@ -40,10 +42,12 @@ class PostgresExecutionStoreIT extends AbstractPostgresIT {
Instant now = Instant.now();
ExecutionRecord first = new ExecutionRecord(
"exec-dup", "route-a", "agent-1", "app-1",
"RUNNING", null, null, now, null, null, null, null, null);
"RUNNING", null, null, now, null, null, null, null, null,
null, null, null, null, null, null);
ExecutionRecord second = new ExecutionRecord(
"exec-dup", "route-a", "agent-1", "app-1",
"COMPLETED", null, null, now, now.plusMillis(200), 200L, null, null, null);
"COMPLETED", null, null, now, now.plusMillis(200), 200L, null, null, null,
"COMPLETE", null, null, null, null, null);
executionStore.upsert(first);
executionStore.upsert(second);
@@ -59,18 +63,19 @@ class PostgresExecutionStoreIT extends AbstractPostgresIT {
Instant now = Instant.now();
ExecutionRecord exec = new ExecutionRecord(
"exec-proc", "route-a", "agent-1", "app-1",
"COMPLETED", null, null, now, now.plusMillis(50), 50L, null, null, null);
"COMPLETED", null, null, now, now.plusMillis(50), 50L, null, null, null,
"COMPLETE", null, null, null, null, null);
executionStore.upsert(exec);
List<ProcessorRecord> processors = List.of(
new ProcessorRecord("exec-proc", "proc-1", "log", null,
new ProcessorRecord("exec-proc", "proc-1", "log",
"app-1", "route-a", 0, null, "COMPLETED",
now, now.plusMillis(10), 10L, null, null,
"input body", "output body", null, null),
new ProcessorRecord("exec-proc", "proc-2", "to", null,
"input body", "output body", null, null, null),
new ProcessorRecord("exec-proc", "proc-2", "to",
"app-1", "route-a", 1, "proc-1", "COMPLETED",
now.plusMillis(10), now.plusMillis(30), 20L, null, null,
null, null, null, null)
null, null, null, null, null)
);
executionStore.upsertProcessors("exec-proc", now, "app-1", "route-a", processors);


@@ -59,6 +59,7 @@ class PostgresStatsStoreIT extends AbstractPostgresIT {
executionStore.upsert(new ExecutionRecord(
id, routeId, "agent-1", applicationName, status, null, null,
startTime, startTime.plusMillis(durationMs), durationMs,
status.equals("FAILED") ? "error" : null, null, null));
status.equals("FAILED") ? "error" : null, null, null,
null, null, null, null, null, null));
}
}


@@ -1,5 +1,5 @@
package com.cameleer3.server.core.admin;
public enum AuditCategory {
INFRA, AUTH, USER_MGMT, CONFIG, RBAC
INFRA, AUTH, USER_MGMT, CONFIG, RBAC, AGENT
}


@@ -34,6 +34,10 @@ public class AuditService {
repository.insert(record);
if (request != null) {
request.setAttribute("audit.logged", true);
}
log.info("AUDIT: user={} action={} category={} target={} result={}",
username, action, category, target, result);
}


@@ -9,6 +9,7 @@ import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.stream.Collectors;
@@ -30,6 +31,7 @@ public class AgentRegistryService {
private final ConcurrentHashMap<String, AgentInfo> agents = new ConcurrentHashMap<>();
private final ConcurrentHashMap<String, ConcurrentLinkedQueue<AgentCommand>> commands = new ConcurrentHashMap<>();
private final ConcurrentHashMap<String, CompletableFuture<CommandReply>> pendingReplies = new ConcurrentHashMap<>();
private volatile AgentEventListener eventListener;
@@ -279,6 +281,31 @@ public class AgentRegistryService {
}
}
/**
* Register a command that expects a synchronous reply from the agent.
* Returns a CompletableFuture that will be completed when the agent ACKs the command.
* Auto-cleans up from the pending map on completion or timeout.
*/
public CompletableFuture<CommandReply> addCommandWithReply(String agentId, CommandType type, String payload) {
AgentCommand command = addCommand(agentId, type, payload);
CompletableFuture<CommandReply> future = new CompletableFuture<>();
pendingReplies.put(command.id(), future);
future.whenComplete((result, ex) -> pendingReplies.remove(command.id()));
return future;
}
/**
* Complete a pending reply future for a command.
* Called when an agent ACKs a command that was registered via {@link #addCommandWithReply}.
* No-op if no pending future exists for the given command ID.
*/
public void completeReply(String commandId, String status, String message, String data) {
CompletableFuture<CommandReply> future = pendingReplies.remove(commandId);
if (future != null) {
future.complete(new CommandReply(status, message, data));
}
}
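The two methods above form a request-reply handshake over the pending-futures map: `addCommandWithReply` parks a future keyed by command ID, and the ACK path calls `completeReply` to resolve it. A self-contained sketch of that pattern (the registry stand-in and class names here are simplified illustrations, not the real AgentRegistryService API):

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;

public class ReplyPatternSketch {
    // Mirrors the CommandReply record added in this change.
    record CommandReply(String status, String message, String data) {}

    static final Map<String, CompletableFuture<CommandReply>> pending = new ConcurrentHashMap<>();

    static CompletableFuture<CommandReply> addCommandWithReply(String commandId) {
        CompletableFuture<CommandReply> future = new CompletableFuture<>();
        pending.put(commandId, future);
        // Auto-clean the map whether the future completes normally or exceptionally.
        future.whenComplete((result, ex) -> pending.remove(commandId));
        return future;
    }

    static void completeReply(String commandId, String status, String message, String data) {
        CompletableFuture<CommandReply> future = pending.remove(commandId);
        if (future != null) {
            future.complete(new CommandReply(status, message, data));
        }
    }

    public static void main(String[] args) {
        String id = UUID.randomUUID().toString();
        CompletableFuture<CommandReply> future = addCommandWithReply(id);
        // Later, the agent ACKs the command and the waiting caller unblocks:
        completeReply(id, "SUCCESS", "expression evaluated", "{\"result\":true}");
        System.out.println(future.join().status()); // SUCCESS
        System.out.println(pending.isEmpty());      // true
    }
}
```

A real caller would likely bound the wait (e.g. `future.orTimeout(...)`) so a silent agent cannot block the HTTP thread forever.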
/**
* Set the event listener for command notifications.
* The SSE layer in the app module implements this interface.


@@ -0,0 +1,11 @@
package com.cameleer3.server.core.agent;
/**
* Represents the reply data from an agent command acknowledgment.
* Used for synchronous request-reply command patterns (e.g. TEST_EXPRESSION).
*
* @param status "SUCCESS" or "FAILURE"
* @param message human-readable description of the result
* @param data optional structured JSON data returned by the agent
*/
public record CommandReply(String status, String message, String data) {}


@@ -6,5 +6,7 @@ package com.cameleer3.server.core.agent;
public enum CommandType {
CONFIG_UPDATE,
DEEP_TRACE,
REPLAY
REPLAY,
SET_TRACED_PROCESSORS,
TEST_EXPRESSION
}


@@ -2,11 +2,16 @@ package com.cameleer3.server.core.detail;
import com.cameleer3.server.core.storage.ExecutionStore;
import com.cameleer3.server.core.storage.ExecutionStore.ProcessorRecord;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.*;
public class DetailService {
private static final ObjectMapper JSON = new ObjectMapper();
private static final TypeReference<Map<String, String>> STR_MAP = new TypeReference<>() {};
private final ExecutionStore executionStore;
public DetailService(ExecutionStore executionStore) {
@@ -25,7 +30,10 @@ public class DetailService {
exec.durationMs() != null ? exec.durationMs() : 0L,
exec.correlationId(), exec.exchangeId(),
exec.errorMessage(), exec.errorStacktrace(),
exec.diagramContentHash(), roots
exec.diagramContentHash(), roots,
exec.inputBody(), exec.outputBody(),
exec.inputHeaders(), exec.outputHeaders(),
parseAttributes(exec.attributes())
);
});
}
@@ -39,7 +47,8 @@ public class DetailService {
p.processorId(), p.processorType(), p.status(),
p.startTime(), p.endTime(),
p.durationMs() != null ? p.durationMs() : 0L,
p.diagramNodeId(), p.errorMessage(), p.errorStacktrace()
p.errorMessage(), p.errorStacktrace(),
parseAttributes(p.attributes())
));
}
@@ -59,4 +68,13 @@ public class DetailService {
}
return roots;
}
private static Map<String, String> parseAttributes(String json) {
if (json == null || json.isBlank()) return null;
try {
return JSON.readValue(json, STR_MAP);
} catch (Exception e) {
return null;
}
}
}


@@ -2,6 +2,7 @@ package com.cameleer3.server.core.detail;
import java.time.Instant;
import java.util.List;
import java.util.Map;
/**
* Full detail of a route execution, including the nested processor tree.
@@ -22,6 +23,10 @@ import java.util.List;
* @param errorStackTrace error stack trace (empty string if no error)
* @param diagramContentHash content hash linking to the active route diagram version
* @param processors nested processor execution tree (root nodes)
* @param inputBody exchange input body at route entry (null if not captured)
* @param outputBody exchange output body at route exit (null if not captured)
* @param inputHeaders exchange input headers at route entry (null if not captured)
* @param outputHeaders exchange output headers at route exit (null if not captured)
*/
public record ExecutionDetail(
String executionId,
@@ -37,6 +42,11 @@ public record ExecutionDetail(
String errorMessage,
String errorStackTrace,
String diagramContentHash,
List<ProcessorNode> processors
List<ProcessorNode> processors,
String inputBody,
String outputBody,
String inputHeaders,
String outputHeaders,
Map<String, String> attributes
) {
}


@@ -3,6 +3,7 @@ package com.cameleer3.server.core.detail;
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
/**
* Nested tree node representing a single processor execution within a route.
@@ -18,23 +19,24 @@ public final class ProcessorNode {
private final Instant startTime;
private final Instant endTime;
private final long durationMs;
private final String diagramNodeId;
private final String errorMessage;
private final String errorStackTrace;
private final Map<String, String> attributes;
private final List<ProcessorNode> children;
public ProcessorNode(String processorId, String processorType, String status,
Instant startTime, Instant endTime, long durationMs,
String diagramNodeId, String errorMessage, String errorStackTrace) {
String errorMessage, String errorStackTrace,
Map<String, String> attributes) {
this.processorId = processorId;
this.processorType = processorType;
this.status = status;
this.startTime = startTime;
this.endTime = endTime;
this.durationMs = durationMs;
this.diagramNodeId = diagramNodeId;
this.errorMessage = errorMessage;
this.errorStackTrace = errorStackTrace;
this.attributes = attributes;
this.children = new ArrayList<>();
}
@@ -48,8 +50,8 @@ public final class ProcessorNode {
public Instant getStartTime() { return startTime; }
public Instant getEndTime() { return endTime; }
public long getDurationMs() { return durationMs; }
public String getDiagramNodeId() { return diagramNodeId; }
public String getErrorMessage() { return errorMessage; }
public String getErrorStackTrace() { return errorStackTrace; }
public Map<String, String> getAttributes() { return attributes; }
public List<ProcessorNode> getChildren() { return List.copyOf(children); }
}


@@ -19,4 +19,14 @@ public interface DiagramRenderer {
* Compute a positioned JSON layout for the route graph.
*/
DiagramLayout layoutJson(RouteGraph graph);
/**
* Compute a positioned JSON layout with a specific flow direction.
*
* @param graph the route graph
* @param direction "LR" for left-to-right, "TB" for top-to-bottom
*/
default DiagramLayout layoutJson(RouteGraph graph, String direction) {
return layoutJson(graph);
}
}
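The default method above means renderers that do not understand a flow direction silently fall back to their direction-agnostic layout, so callers can always pass a direction. A minimal standalone illustration of that fallback (the interface and names here are illustrative, not the real DiagramRenderer):

```java
public class DefaultMethodSketch {
    interface Renderer {
        String layout(String graph);

        // Direction-aware overload; implementations that ignore it
        // inherit this fallback to the plain layout.
        default String layout(String graph, String direction) {
            return layout(graph);
        }
    }

    public static void main(String[] args) {
        Renderer r = g -> "layout(" + g + ")";
        // The "TB" hint is dropped by the default implementation.
        System.out.println(r.layout("route-1", "TB")); // layout(route-1)
    }
}
```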


@@ -70,14 +70,16 @@ public class SearchIndexer implements SearchIndexerStats {
p.processorId(), p.processorType(), p.status(),
p.errorMessage(), p.errorStacktrace(),
p.inputBody(), p.outputBody(),
p.inputHeaders(), p.outputHeaders()))
p.inputHeaders(), p.outputHeaders(),
p.attributes()))
.toList();
searchIndex.index(new ExecutionDocument(
exec.executionId(), exec.routeId(), exec.agentId(), exec.applicationName(),
exec.status(), exec.correlationId(), exec.exchangeId(),
exec.startTime(), exec.endTime(), exec.durationMs(),
exec.errorMessage(), exec.errorStacktrace(), processorDocs));
exec.errorMessage(), exec.errorStacktrace(), processorDocs,
exec.attributes()));
indexedCount.incrementAndGet();
lastIndexedAt = Instant.now();


@@ -1,5 +1,6 @@
package com.cameleer3.server.core.ingestion;
import com.cameleer3.common.model.ExchangeSnapshot;
import com.cameleer3.common.model.ProcessorExecution;
import com.cameleer3.common.model.RouteExecution;
import com.cameleer3.server.core.indexing.ExecutionUpdatedEvent;
@@ -77,6 +78,25 @@ public class IngestionService {
String diagramHash = diagramStore
.findContentHashForRoute(exec.getRouteId(), agentId)
.orElse("");
// Extract route-level snapshots (critical for REGULAR mode where no processors are recorded)
String inputBody = null;
String outputBody = null;
String inputHeaders = null;
String outputHeaders = null;
ExchangeSnapshot inputSnapshot = exec.getInputSnapshot();
if (inputSnapshot != null) {
inputBody = truncateBody(inputSnapshot.getBody());
inputHeaders = toJson(inputSnapshot.getHeaders());
}
ExchangeSnapshot outputSnapshot = exec.getOutputSnapshot();
if (outputSnapshot != null) {
outputBody = truncateBody(outputSnapshot.getBody());
outputHeaders = toJson(outputSnapshot.getHeaders());
}
return new ExecutionRecord(
exec.getExchangeId(), exec.getRouteId(), agentId, applicationName,
exec.getStatus() != null ? exec.getStatus().name() : "RUNNING",
@@ -84,7 +104,10 @@ public class IngestionService {
exec.getStartTime(), exec.getEndTime(),
exec.getDurationMs(),
exec.getErrorMessage(), exec.getErrorStackTrace(),
diagramHash
diagramHash,
exec.getEngineLevel(),
inputBody, outputBody, inputHeaders, outputHeaders,
toJson(exec.getAttributes())
);
}
@@ -96,7 +119,7 @@ public class IngestionService {
for (ProcessorExecution p : processors) {
flat.add(new ProcessorRecord(
executionId, p.getProcessorId(), p.getProcessorType(),
p.getDiagramNodeId(), applicationName, routeId,
applicationName, routeId,
depth, parentProcessorId,
p.getStatus() != null ? p.getStatus().name() : "RUNNING",
p.getStartTime() != null ? p.getStartTime() : execStartTime,
@@ -104,7 +127,8 @@ public class IngestionService {
p.getDurationMs(),
p.getErrorMessage(), p.getErrorStackTrace(),
truncateBody(p.getInputBody()), truncateBody(p.getOutputBody()),
toJson(p.getInputHeaders()), toJson(p.getOutputHeaders())
toJson(p.getInputHeaders()), toJson(p.getOutputHeaders()),
toJson(p.getAttributes())
));
if (p.getChildren() != null) {
flat.addAll(flattenProcessors(
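The recursion being modified here flattens the nested processor tree into per-row records carrying `depth` and `parent_processor_id`, which is what the tree reconstruction in DetailService later inverts. A simplified, self-contained sketch of that depth-first flattening (the record names are stand-ins for ProcessorExecution/ProcessorRecord):

```java
import java.util.ArrayList;
import java.util.List;

public class FlattenSketch {
    record Proc(String id, List<Proc> children) {}
    record Flat(String id, int depth, String parentId) {}

    // Depth-first walk: each node becomes one flat row, children recurse
    // with depth + 1 and the current node as parent.
    static List<Flat> flatten(List<Proc> procs, int depth, String parentId) {
        List<Flat> out = new ArrayList<>();
        for (Proc p : procs) {
            out.add(new Flat(p.id(), depth, parentId));
            if (p.children() != null) {
                out.addAll(flatten(p.children(), depth + 1, p.id()));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Proc tree = new Proc("root",
            List.of(new Proc("child", List.of(new Proc("gc", List.of())))));
        for (Flat f : flatten(List.of(tree), 0, null)) {
            System.out.println(f.id() + " depth=" + f.depth() + " parent=" + f.parentId());
        }
    }
}
```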


@@ -8,4 +8,4 @@ import com.cameleer3.common.graph.RouteGraph;
* The agent ID is extracted from the SecurityContext in the controller layer
* and carried through the write buffer so the flush scheduler can persist it.
*/
public record TaggedDiagram(String agentId, RouteGraph graph) {}
public record TaggedDiagram(String agentId, String applicationName, RouteGraph graph) {}


@@ -1,6 +1,7 @@
package com.cameleer3.server.core.search;
import java.time.Instant;
import java.util.Map;
/**
* Lightweight summary of a route execution for search result listings.
@@ -30,6 +31,8 @@ public record ExecutionSummary(
long durationMs,
String correlationId,
String errorMessage,
String diagramContentHash
String diagramContentHash,
String highlight,
Map<String, String> attributes
) {
}


@@ -55,16 +55,21 @@ public record SearchRequest(
private static final int MAX_LIMIT = 500;
private static final java.util.Set<String> ALLOWED_SORT_FIELDS = java.util.Set.of(
"startTime", "status", "agentId", "routeId", "correlationId", "durationMs"
"startTime", "status", "agentId", "routeId", "correlationId",
"durationMs", "executionId", "applicationName"
);
private static final java.util.Map<String, String> SORT_FIELD_TO_COLUMN = java.util.Map.of(
"startTime", "start_time",
"status", "status",
"agentId", "agent_id",
"routeId", "route_id",
"correlationId", "correlation_id",
"durationMs", "duration_ms"
/** Maps camelCase API sort field names to OpenSearch field names.
* Text fields use .keyword subfield; date/numeric fields are used directly. */
private static final java.util.Map<String, String> SORT_FIELD_TO_COLUMN = java.util.Map.ofEntries(
java.util.Map.entry("startTime", "start_time"),
java.util.Map.entry("durationMs", "duration_ms"),
java.util.Map.entry("status", "status.keyword"),
java.util.Map.entry("agentId", "agent_id.keyword"),
java.util.Map.entry("routeId", "route_id.keyword"),
java.util.Map.entry("correlationId", "correlation_id.keyword"),
java.util.Map.entry("executionId", "execution_id.keyword"),
java.util.Map.entry("applicationName", "application_name.keyword")
);
public SearchRequest {
@@ -75,7 +80,7 @@ public record SearchRequest(
if (!"asc".equalsIgnoreCase(sortDir)) sortDir = "desc";
}
/** Returns the validated database column name for ORDER BY. */
/** Returns the snake_case column name for OpenSearch/DB ORDER BY. */
public String sortColumn() {
return SORT_FIELD_TO_COLUMN.getOrDefault(sortField, "start_time");
}
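The whitelist-with-fallback behavior of `sortColumn()` (unknown fields collapse to `start_time` rather than reaching the query) can be shown standalone with a subset of the mapping above:

```java
import java.util.Map;

public class SortMappingSketch {
    // Subset of the camelCase -> OpenSearch field mapping; text fields
    // use the .keyword subfield, date/numeric fields are used directly.
    static final Map<String, String> SORT = Map.of(
        "startTime", "start_time",
        "status", "status.keyword",
        "agentId", "agent_id.keyword");

    static String sortColumn(String field) {
        // Anything outside the whitelist falls back to start_time,
        // so user input never reaches the sort clause verbatim.
        return SORT.getOrDefault(field, "start_time");
    }

    public static void main(String[] args) {
        System.out.println(sortColumn("status")); // status.keyword
        System.out.println(sortColumn("bogus"));  // start_time
    }
}
```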


@@ -4,6 +4,7 @@ import com.cameleer3.common.graph.RouteGraph;
import com.cameleer3.server.core.ingestion.TaggedDiagram;
import java.util.List;
import java.util.Map;
import java.util.Optional;
public interface DiagramStore {
@@ -15,4 +16,6 @@ public interface DiagramStore {
Optional<String> findContentHashForRoute(String routeId, String agentId);
Optional<String> findContentHashForRouteByAgents(String routeId, List<String> agentIds);
Map<String, String> findProcessorRouteMapping(String applicationName);
}


@@ -20,15 +20,19 @@ public interface ExecutionStore {
String executionId, String routeId, String agentId, String applicationName,
String status, String correlationId, String exchangeId,
Instant startTime, Instant endTime, Long durationMs,
String errorMessage, String errorStacktrace, String diagramContentHash
String errorMessage, String errorStacktrace, String diagramContentHash,
String engineLevel,
String inputBody, String outputBody, String inputHeaders, String outputHeaders,
String attributes
) {}
record ProcessorRecord(
String executionId, String processorId, String processorType,
String diagramNodeId, String applicationName, String routeId,
String applicationName, String routeId,
int depth, String parentProcessorId, String status,
Instant startTime, Instant endTime, Long durationMs,
String errorMessage, String errorStacktrace,
String inputBody, String outputBody, String inputHeaders, String outputHeaders
String inputBody, String outputBody, String inputHeaders, String outputHeaders,
String attributes
) {}
}


@@ -8,12 +8,14 @@ public record ExecutionDocument(
String status, String correlationId, String exchangeId,
Instant startTime, Instant endTime, Long durationMs,
String errorMessage, String errorStacktrace,
List<ProcessorDoc> processors
List<ProcessorDoc> processors,
String attributes
) {
public record ProcessorDoc(
String processorId, String processorType, String status,
String errorMessage, String errorStacktrace,
String inputBody, String outputBody,
String inputHeaders, String outputHeaders
String inputHeaders, String outputHeaders,
String attributes
) {}
}


@@ -24,10 +24,10 @@ class TreeReconstructionTest {
private ProcessorRecord proc(String id, String type, String status,
int depth, String parentId) {
return new ProcessorRecord(
"exec-1", id, type, "node-" + id,
"exec-1", id, type,
"default", "route1", depth, parentId,
status, NOW, NOW, 10L,
null, null, null, null, null, null
null, null, null, null, null, null, null
);
}


@@ -0,0 +1,858 @@
# Taps, Business Attributes & Enhanced Replay — Implementation Plan
> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.
**Goal:** Add UI and backend support for tap management, business attribute display, enhanced replay, per-route recording toggles, and success compression.
**Architecture:** Backend-first approach — add attributes to the execution pipeline, then build the command infrastructure for test-expression and replay, then layer on the frontend features page by page. Each task produces a self-contained, committable unit.
**Tech Stack:** Java 17 / Spring Boot 3.4 (backend), React 18 / TypeScript / TanStack Query (frontend), @cameleer/design-system components, PostgreSQL (JSONB), OpenSearch.
**Spec:** `docs/superpowers/specs/2026-03-26-taps-attributes-replay-ui-design.md`
---
## File Map
### Backend — New Files
- `cameleer3-server-app/src/main/resources/db/migration/V5__attributes.sql` — Flyway migration adding `attributes JSONB` to executions and processor_executions tables
- `cameleer3-server-app/src/main/java/com/cameleer3/server/app/dto/TestExpressionRequest.java` — Request DTO for test-expression endpoint
- `cameleer3-server-app/src/main/java/com/cameleer3/server/app/dto/TestExpressionResponse.java` — Response DTO for test-expression endpoint
### Backend — Modified Files
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/agent/CommandType.java` — add TEST_EXPRESSION
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/storage/ExecutionStore.java` — add attributes to ExecutionRecord and ProcessorRecord
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/detail/ExecutionDetail.java` — add attributes field
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/detail/ProcessorNode.java` — add attributes field
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/detail/DetailService.java` — pass attributes through tree reconstruction
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/search/ExecutionSummary.java` — add attributes field
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/ingestion/IngestionService.java` — extract attributes from RouteExecution/ProcessorExecution
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/storage/model/ExecutionDocument.java` — add attributes to ProcessorDoc
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/indexing/SearchIndexer.java` — include attributes in indexing
- `cameleer3-server-core/src/main/java/com/cameleer3/server/core/agent/AgentRegistryService.java` — add CompletableFuture-based command reply support
- `cameleer3-server-app/src/main/java/com/cameleer3/server/app/storage/PostgresExecutionStore.java` — add attributes to INSERT/UPDATE queries
- `cameleer3-server-app/src/main/java/com/cameleer3/server/app/search/OpenSearchIndex.java` — add attributes to toMap() and fromSearchHit()
- `cameleer3-server-app/src/main/java/com/cameleer3/server/app/controller/ApplicationConfigController.java` — add test-expression endpoint
- `cameleer3-server-app/src/main/java/com/cameleer3/server/app/controller/AgentCommandController.java` — add test-expression mapping, complete futures on ACK
- `cameleer3-server-app/src/main/java/com/cameleer3/server/app/dto/CommandAckRequest.java` — add optional data field
### Frontend — Modified Files
- `ui/src/api/schema.d.ts` — add attributes to ExecutionDetail, ProcessorNode, ExecutionSummary
- `ui/src/api/queries/commands.ts` — add TapDefinition type, extend ApplicationConfig, add test-expression mutation, add replay mutation
- `ui/src/pages/ExchangeDetail/ExchangeDetail.tsx` — attributes strip, per-processor attributes, replay modal
- `ui/src/pages/ExchangeDetail/ExchangeDetail.module.css` — attributes strip and replay styles
- `ui/src/pages/Dashboard/Dashboard.tsx` — attributes column in exchanges table
- `ui/src/pages/Routes/RouteDetail.tsx` — recording toggle, taps tab, tap modal with test
- `ui/src/pages/Routes/RouteDetail.module.css` — taps and recording styles
- `ui/src/pages/Admin/AppConfigDetailPage.tsx` — restructure to 3 sections
- `ui/src/pages/Admin/AppConfigDetailPage.module.css` — updated styles
---
## Task 1: Verify Prerequisites and Database Migration
**Files:**
- Create: `cameleer3-server-app/src/main/resources/db/migration/V5__attributes.sql`
- [ ] **Step 1: Verify cameleer3-common has attributes support**
Confirm the `cameleer3-common` SNAPSHOT dependency includes `RouteExecution.getAttributes()` and `ProcessorExecution.getAttributes()`. Run:
```bash
mvn dependency:sources -pl cameleer3-server-core -q
```
Then inspect the source jar for `RouteExecution.java` to confirm the `attributes` field exists. If it does not, the dependency must be updated first.
- [ ] **Step 2: Write migration SQL**
```sql
-- V5__attributes.sql
ALTER TABLE executions ADD COLUMN IF NOT EXISTS attributes JSONB;
ALTER TABLE processor_executions ADD COLUMN IF NOT EXISTS attributes JSONB;
```
- [ ] **Step 3: Verify migration compiles**
Run: `cd cameleer3-server-app && mvn compile -pl . -q`
Expected: BUILD SUCCESS
- [ ] **Step 4: Commit**
```bash
git add cameleer3-server-app/src/main/resources/db/migration/V5__attributes.sql
git commit -m "feat: add attributes JSONB columns to executions and processor_executions"
```
---
## Task 2: Backend — Add Attributes to Storage Records and Detail Models
**Files:**
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/storage/ExecutionStore.java`
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/detail/ExecutionDetail.java`
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/detail/ProcessorNode.java`
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/search/ExecutionSummary.java`
- [ ] **Step 1: Add `attributes` field to `ExecutionRecord`**
In `ExecutionStore.java`, add `String attributes` (JSONB as string) as the last parameter of the `ExecutionRecord` record. This is a serialized `Map<String, String>`.
- [ ] **Step 2: Add `attributes` field to `ProcessorRecord`**
In `ExecutionStore.java`, add `String attributes` (JSONB as string) as the last parameter of the `ProcessorRecord` record.
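Taken together with Step 1, the extended record shape can be sketched as follows — `ProcessorRecord` shown here, with `attributes` (a serialized `Map<String, String>` as a JSON string) appended as the trailing component:

```java
import java.time.Instant;

// Sketch of ProcessorRecord after Steps 1-2: attributes is the last component,
// carrying the JSONB column value as a raw JSON string (null when absent).
record ProcessorRecord(
        String executionId, String processorId, String processorType,
        String applicationName, String routeId,
        int depth, String parentProcessorId, String status,
        Instant startTime, Instant endTime, Long durationMs,
        String errorMessage, String errorStacktrace,
        String inputBody, String outputBody, String inputHeaders, String outputHeaders,
        String attributes
) {}
```

Keeping `attributes` last means existing positional constructor calls fail to compile until they pass the extra argument, which is exactly the signal Step 6 expects.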
- [ ] **Step 3: Add `attributes` field to `ExecutionDetail`**
Add `Map<String, String> attributes` as the last parameter of the `ExecutionDetail` record (after `outputHeaders`).
- [ ] **Step 4: Add `attributes` field to `ProcessorNode`**
`ProcessorNode` is a mutable class with a constructor. Add a `Map<String, String> attributes` field with getter. Add it to the constructor. Update the existing `ProcessorNode` constructor calls in `DetailService.java` to pass `null` or the attributes map.
- [ ] **Step 5: Add `attributes` field to `ExecutionSummary`**
Add `Map<String, String> attributes` as the last parameter (after `highlight`).
- [ ] **Step 6: Verify compilation**
Run: `mvn compile -q`
Expected: Compilation errors in files that construct these records — these will be fixed in the next tasks.
- [ ] **Step 7: Commit**
```bash
git add cameleer3-server-core/
git commit -m "feat: add attributes field to ExecutionRecord, ProcessorRecord, ExecutionDetail, ProcessorNode, ExecutionSummary"
```
---
## Task 3: Backend — Attributes Ingestion Pipeline
**Files:**
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/ingestion/IngestionService.java`
- Modify: `cameleer3-server-app/src/main/java/com/cameleer3/server/app/storage/PostgresExecutionStore.java`
- [ ] **Step 1: Extract attributes in `IngestionService.toExecutionRecord()`**
In the `toExecutionRecord()` method (~line 76-111), serialize `execution.getAttributes()` to JSON string using Jackson `ObjectMapper`. Pass it as the new `attributes` parameter to `ExecutionRecord`. If attributes is null or empty, pass `null`.
```java
String attributes = null;
if (execution.getAttributes() != null && !execution.getAttributes().isEmpty()) {
attributes = JSON.writeValueAsString(execution.getAttributes());
}
```
Note: `IngestionService` has a static `private static final ObjectMapper JSON` field (line 22). Use `JSON.writeValueAsString()`.
- [ ] **Step 2: Extract attributes in `IngestionService.flattenProcessors()`**
In the `flattenProcessors()` method (~line 113-138), serialize each `ProcessorExecution.getAttributes()` to JSON string. Pass as the new `attributes` parameter to `ProcessorRecord`.
- [ ] **Step 3: Update `PostgresExecutionStore.upsert()`**
Add `attributes` to the INSERT statement and bind parameters. The column is JSONB, so use `PGobject` with type "jsonb" or cast `?::jsonb` in the SQL.
In the INSERT (~line 26-32): add `attributes` column and `?::jsonb` placeholder.
In the ON CONFLICT UPDATE (~line 33-51): add `attributes = COALESCE(EXCLUDED.attributes, executions.attributes)` merge (follows the existing pattern, e.g., `input_body = COALESCE(EXCLUDED.input_body, executions.input_body)`).
In the bind parameters (~line 53-62): bind `record.attributes()`.
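A sketch of the resulting upsert, abbreviated to the columns this step touches (the conflict target `execution_id` and the elided columns are assumptions — follow the existing statement):

```sql
INSERT INTO executions (execution_id, /* ...existing columns..., */ attributes)
VALUES (?, /* ... */ ?::jsonb)
ON CONFLICT (execution_id) DO UPDATE SET
    -- keep the stored value when the incoming record carries no attributes,
    -- mirroring the existing input_body COALESCE pattern
    attributes = COALESCE(EXCLUDED.attributes, executions.attributes);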
- [ ] **Step 4: Update `PostgresExecutionStore.upsertProcessors()`**
Same pattern: add `attributes` column, `?::jsonb` placeholder, bind parameter.
- [ ] **Step 5: Verify compilation**
Run: `mvn compile -q`
Expected: BUILD SUCCESS (or remaining errors from DetailService/SearchIndexer which are next tasks)
- [ ] **Step 6: Commit**
```bash
git add cameleer3-server-core/src/main/java/com/cameleer3/server/core/ingestion/IngestionService.java
git add cameleer3-server-app/src/main/java/com/cameleer3/server/app/storage/PostgresExecutionStore.java
git commit -m "feat: store execution and processor attributes from agent data"
```
---
## Task 4: Backend — Attributes in Detail Service and OpenSearch Indexing
**Files:**
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/detail/DetailService.java`
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/storage/model/ExecutionDocument.java`
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/indexing/SearchIndexer.java`
- Modify: `cameleer3-server-app/src/main/java/com/cameleer3/server/app/search/OpenSearchIndex.java`
- [ ] **Step 1: Pass attributes through `DetailService.buildTree()`**
In `buildTree()` (~line 35-63), when constructing `ProcessorNode` from `ProcessorRecord`, deserialize the `attributes` JSON string back to `Map<String, String>` and pass it to the constructor.
In `getDetail()` (~line 16-33), when constructing `ExecutionDetail`, deserialize the `ExecutionRecord.attributes()` JSON and pass it as the `attributes` parameter.
- [ ] **Step 2: Update `PostgresExecutionStore.findById()` and `findProcessors()` queries**
These SELECT queries need to include the new `attributes` column and map it into `ExecutionRecord` / `ProcessorRecord` via the row mapper.
- [ ] **Step 3: Add attributes to `ExecutionDocument.ProcessorDoc`**
Add `String attributes` field to the `ProcessorDoc` record in `ExecutionDocument.java`. Also add `String attributes` to `ExecutionDocument` itself for route-level attributes.
- [ ] **Step 4: Update `SearchIndexer.indexExecution()`**
When constructing `ProcessorDoc` objects (~line 68-74), pass `processor.attributes()`. When constructing `ExecutionDocument` (~line 76-80), pass the execution record's attributes.
- [ ] **Step 5: Update `OpenSearchIndex.toMap()`**
In the `toMap()` method (~line 303-333), add `"attributes"` to the document map and to each processor sub-document map.
- [ ] **Step 6: Update `OpenSearchIndex.fromSearchHit()` (or equivalent)**
When parsing search results back into `ExecutionSummary`, extract the `attributes` field from the OpenSearch hit source and deserialize it into `Map<String, String>`.
- [ ] **Step 7: Verify compilation**
Run: `mvn compile -q`
Expected: BUILD SUCCESS
- [ ] **Step 8: Commit**
```bash
git add cameleer3-server-core/ cameleer3-server-app/
git commit -m "feat: thread attributes through detail service and OpenSearch indexing"
```
---
## Task 5: Backend — TEST_EXPRESSION Command and Request-Reply Infrastructure
**Files:**
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/agent/CommandType.java`
- Modify: `cameleer3-server-core/src/main/java/com/cameleer3/server/core/agent/AgentRegistryService.java`
- Modify: `cameleer3-server-app/src/main/java/com/cameleer3/server/app/dto/CommandAckRequest.java`
- Modify: `cameleer3-server-app/src/main/java/com/cameleer3/server/app/controller/AgentCommandController.java`
- Create: `cameleer3-server-app/src/main/java/com/cameleer3/server/app/dto/TestExpressionRequest.java`
- Create: `cameleer3-server-app/src/main/java/com/cameleer3/server/app/dto/TestExpressionResponse.java`
- Modify: `cameleer3-server-app/src/main/java/com/cameleer3/server/app/controller/ApplicationConfigController.java`
- [ ] **Step 1: Add TEST_EXPRESSION to CommandType enum**
```java
public enum CommandType {
CONFIG_UPDATE,
DEEP_TRACE,
REPLAY,
SET_TRACED_PROCESSORS,
TEST_EXPRESSION
}
```
- [ ] **Step 2: Add `data` field to `CommandAckRequest`**
```java
public record CommandAckRequest(String status, String message, String data) {}
```
The `data` field carries structured JSON results (e.g., expression test result). Existing ACKs that don't send data will deserialize `data` as `null`.
- [ ] **Step 3: Add CompletableFuture map to AgentRegistryService**
Add a `ConcurrentHashMap<String, CompletableFuture<CommandAckRequest>>` for pending request-reply commands. Add methods:
```java
public CompletableFuture<CommandAckRequest> addCommandWithReply(String agentId, CommandType type, String payload) {
AgentCommand command = addCommand(agentId, type, payload);
CompletableFuture<CommandAckRequest> future = new CompletableFuture<>();
pendingReplies.put(command.id(), future);
return future;
}
public void completeReply(String commandId, CommandAckRequest ack) {
CompletableFuture<CommandAckRequest> future = pendingReplies.remove(commandId);
if (future != null) {
future.complete(ack);
}
}
```
Note: Use `future.orTimeout(5, TimeUnit.SECONDS)` in the caller. The future auto-completes exceptionally on timeout. Add a `whenComplete` handler that removes the entry from `pendingReplies` to prevent leaks:
```java
future.whenComplete((result, ex) -> pendingReplies.remove(command.id()));
```
- [ ] **Step 4: Complete futures in AgentCommandController.acknowledgeCommand()**
In the ACK endpoint (~line 156-179), after `registryService.acknowledgeCommand()`, call `registryService.completeReply(commandId, ack)`.
- [ ] **Step 5: Add test-expression mapping to mapCommandType()**
```java
case "test-expression" -> CommandType.TEST_EXPRESSION;
```
- [ ] **Step 6: Create TestExpressionRequest and TestExpressionResponse DTOs**
```java
// TestExpressionRequest.java
public record TestExpressionRequest(String expression, String language, String body, String target) {}
// TestExpressionResponse.java
public record TestExpressionResponse(String result, String error) {}
```
- [ ] **Step 7: Add test-expression endpoint to ApplicationConfigController**
Note: `ApplicationConfigController` does not use `@PreAuthorize` — security is handled at the URL pattern level in the security config. The test-expression endpoint inherits the same access rules as other config endpoints. No `@PreAuthorize` annotation needed.
```java
@PostMapping("/{application}/test-expression")
@Operation(summary = "Test a tap expression against sample data via a live agent")
public ResponseEntity<TestExpressionResponse> testExpression(
@PathVariable String application,
@RequestBody TestExpressionRequest request) {
// 1. Find a LIVE agent for this application via registryService
// 2. Send TEST_EXPRESSION command with addCommandWithReply()
// 3. Await CompletableFuture with 5s timeout via future.orTimeout(5, TimeUnit.SECONDS)
// 4. Parse ACK data as result/error, return TestExpressionResponse
// Handle: no live agent (404), timeout (504), parse error (500)
// Clean up: future.whenComplete removes from pendingReplies map on timeout
}
```
- [ ] **Step 8: Verify compilation**
Run: `mvn compile -q`
Expected: BUILD SUCCESS
- [ ] **Step 9: Commit**
```bash
git add cameleer3-server-core/ cameleer3-server-app/
git commit -m "feat: add TEST_EXPRESSION command with request-reply infrastructure"
```
---
## Task 6: Backend — Regenerate OpenAPI and Schema
**Files:**
- Modify: `openapi.json` (regenerated)
- Modify: `ui/src/api/schema.d.ts` (regenerated)
- [ ] **Step 1: Build the server to generate updated OpenAPI spec**
Run: `mvn clean compile -q`
- [ ] **Step 2: Start the server temporarily to extract OpenAPI JSON**
Run the server, fetch `http://localhost:8080/v3/api-docs`, save to `openapi.json`. Alternatively, if the project has an automated OpenAPI generation step, use that.
- [ ] **Step 3: Regenerate schema.d.ts from openapi.json**
Run the existing schema generation command (check package.json scripts in ui/).
- [ ] **Step 4: Verify the new types include `attributes` on ExecutionDetail, ProcessorNode, ExecutionSummary**
Read `ui/src/api/schema.d.ts` and confirm the fields are present. Note: the OpenAPI generator may strip nullable fields (e.g., `highlight` exists on Java `ExecutionSummary` but not in the current schema). If `attributes` is missing, add `@Schema(nullable = true)` or `@JsonInclude(JsonInclude.Include.ALWAYS)` annotation on the Java DTO and regenerate. Alternatively, manually add the field to `schema.d.ts`.
- [ ] **Step 5: Commit**
```bash
git add openapi.json ui/src/api/schema.d.ts
git commit -m "chore: regenerate openapi.json and schema.d.ts with attributes and test-expression"
```
---
## Task 7: Frontend — TypeScript Types and API Hooks
**Files:**
- Modify: `ui/src/api/queries/commands.ts`
- [ ] **Step 1: Add TapDefinition interface**
```typescript
export interface TapDefinition {
tapId: string;
processorId: string;
target: 'INPUT' | 'OUTPUT' | 'BOTH';
expression: string;
language: string;
attributeName: string;
attributeType: 'BUSINESS_OBJECT' | 'CORRELATION' | 'EVENT' | 'CUSTOM';
enabled: boolean;
version: number;
}
```
- [ ] **Step 2: Extend ApplicationConfig interface**
Add to the existing `ApplicationConfig` interface:
```typescript
taps: TapDefinition[];
tapVersion: number;
routeRecording: Record<string, boolean>;
compressSuccess: boolean;
```
- [ ] **Step 3: Add useTestExpression mutation hook**
```typescript
export function useTestExpression() {
return useMutation({
mutationFn: async ({ application, expression, language, body, target }: {
application: string;
expression: string;
language: string;
body: string;
target: string;
}) => {
const { data, error } = await api.POST('/config/{application}/test-expression', {
params: { path: { application } },
body: { expression, language, body, target },
});
if (error) throw new Error('Failed to test expression');
return data!;
},
});
}
```
- [ ] **Step 4: Add useReplayExchange mutation hook**
```typescript
export function useReplayExchange() {
return useMutation({
mutationFn: async ({ agentId, headers, body }: {
agentId: string;
headers: Record<string, string>;
body: string;
}) => {
const { data, error } = await api.POST('/agents/{id}/commands', {
params: { path: { id: agentId } },
body: { type: 'replay', payload: { headers, body } } as any,
});
if (error) throw new Error('Failed to send replay command');
return data!;
},
});
}
```
- [ ] **Step 5: Verify build**
Run: `cd ui && npm run build`
Expected: BUILD SUCCESS (or type errors in pages that now receive new fields — those pages are updated in later tasks)
- [ ] **Step 6: Commit**
```bash
git add ui/src/api/queries/commands.ts
git commit -m "feat: add TapDefinition type, extend ApplicationConfig, add test-expression and replay hooks"
```
---
## Task 8: Frontend — Business Attributes on ExchangeDetail
**Files:**
- Modify: `ui/src/pages/ExchangeDetail/ExchangeDetail.tsx`
- Modify: `ui/src/pages/ExchangeDetail/ExchangeDetail.module.css`
- [ ] **Step 1: Add attributes strip to exchange header**
After the header info row and before the stat boxes, render the route-level attributes:
```tsx
{detail.attributes && Object.keys(detail.attributes).length > 0 && (
<div className={styles.attributesStrip}>
<span className={styles.attributesLabel}>Attributes</span>
{Object.entries(detail.attributes).map(([key, value]) => (
<Badge key={key} label={`${key}: ${value}`} color="auto" variant="filled" />
))}
</div>
)}
```
- [ ] **Step 2: Add per-processor attributes in processor detail panel**
In the processor detail section (where the selected processor's message IN/OUT is shown), add attributes badges if the selected processor has them. Access via `detail.processors` tree — traverse the nested tree to find the processor at the selected index and read its `attributes` map. Note: body/headers data comes from a separate `useProcessorSnapshot` call, but `attributes` is inline on the `ProcessorNode` in the detail response — no additional API call needed.
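The tree lookup in Step 2 can be sketched as a small recursive helper (`ProcessorNodeLite` is a hypothetical minimal subset of the generated schema type):

```typescript
interface ProcessorNodeLite {
  processorId: string;
  attributes?: Record<string, string>;
  children?: ProcessorNodeLite[];
}

// Depth-first search for a processor anywhere in the nested detail tree
function findProcessor(
  nodes: ProcessorNodeLite[],
  processorId: string,
): ProcessorNodeLite | undefined {
  for (const node of nodes) {
    if (node.processorId === processorId) return node;
    const hit = node.children ? findProcessor(node.children, processorId) : undefined;
    if (hit) return hit;
  }
  return undefined;
}
```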
- [ ] **Step 3: Add CSS for attributes strip**
```css
.attributesStrip {
display: flex;
gap: 8px;
flex-wrap: wrap;
align-items: center;
padding: 10px 14px;
background: var(--bg-surface);
border: 1px solid var(--border-subtle);
border-radius: var(--radius-lg);
margin-bottom: 16px;
}
.attributesLabel {
font-size: 11px;
color: var(--text-muted);
margin-right: 4px;
}
```
- [ ] **Step 4: Verify build**
Run: `cd ui && npm run build`
Expected: BUILD SUCCESS
- [ ] **Step 5: Commit**
```bash
git add ui/src/pages/ExchangeDetail/
git commit -m "feat: display business attributes on ExchangeDetail page"
```
---
## Task 9: Frontend — Replay Modal on ExchangeDetail
**Files:**
- Modify: `ui/src/pages/ExchangeDetail/ExchangeDetail.tsx`
- Modify: `ui/src/pages/ExchangeDetail/ExchangeDetail.module.css`
- [ ] **Step 1: Add replay button to exchange header**
Add a "Replay" button (primary variant) in the header action area. Only render for OPERATOR/ADMIN roles (check with `useAuthStore()`).
```tsx
<Button variant="primary" size="sm" onClick={() => setReplayOpen(true)}>
Replay
</Button>
```
- [ ] **Step 2: Build the replay modal component**
Add state: `replayOpen`, `replayHeaders` (key-value array), `replayBody` (string), `replayAgent` (string), `replayTab` ('headers' | 'body').
Pre-populate from `detail.inputHeaders` (parse JSON string to object) and `detail.inputBody`.
Use Modal (size="lg"), Tabs for Headers/Body, and the `useReplayExchange` mutation hook.
Headers tab: render editable rows with Input fields for key and value, remove button per row, "Add header" link at bottom.
Body tab: Textarea with monospace font, pre-populated with `detail.inputBody`.
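Pre-populating the headers tab means turning the stored `inputHeaders` JSON string into editable rows; a minimal sketch, tolerating missing or malformed input:

```typescript
interface HeaderRow { key: string; value: string; }

// Parse the serialized headers map into editable key/value rows;
// fall back to an empty list when the string is missing or not valid JSON.
function toHeaderRows(inputHeaders: string | null | undefined): HeaderRow[] {
  if (!inputHeaders) return [];
  try {
    const parsed = JSON.parse(inputHeaders) as Record<string, unknown>;
    return Object.entries(parsed).map(([key, value]) => ({ key, value: String(value) }));
  } catch {
    return [];
  }
}
```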
- [ ] **Step 3: Wire up agent selector**
Use `useAgents('LIVE', detail.applicationName)` to populate a Select dropdown. Default to the agent that originally processed this exchange (`detail.agentId`) if it's still LIVE.
- [ ] **Step 4: Wire up replay submission**
On "Replay" click: call `replayExchange.mutate({ agentId, headers, body })`. Show loading spinner on button. On success: `toast('Replay command sent')`, close modal. On error: `toast('Replay failed: ...')`.
- [ ] **Step 5: Add CSS for replay modal elements**
Style the warning banner, header table, body textarea, and agent selector.
- [ ] **Step 6: Verify build**
Run: `cd ui && npm run build`
Expected: BUILD SUCCESS
- [ ] **Step 7: Commit**
```bash
git add ui/src/pages/ExchangeDetail/
git commit -m "feat: add replay modal with editable headers and body on ExchangeDetail"
```
---
## Task 10: Frontend — Attributes Column on Dashboard
**Files:**
- Modify: `ui/src/pages/Dashboard/Dashboard.tsx`
- [ ] **Step 1: Add attributes column to the exchanges table**
In `buildBaseColumns()` (~line 97-163), add a new column after the `applicationName` column. Use CSS module classes (not inline styles — per project convention in `feedback_css_modules_not_inline.md`):
```typescript
{
key: 'attributes',
header: 'Attributes',
render: (_, row) => {
const attrs = row.attributes;
if (!attrs || Object.keys(attrs).length === 0) return <span className={styles.muted}>–</span>;
const entries = Object.entries(attrs);
const shown = entries.slice(0, 2);
const overflow = entries.length - 2;
return (
<div className={styles.attrCell}>
{shown.map(([k, v]) => (
<Badge key={k} label={String(v)} color="auto" title={k} />
))}
{overflow > 0 && <span className={styles.attrOverflow}>+{overflow}</span>}
</div>
);
},
},
```
Add corresponding CSS classes to `Dashboard.module.css`:
```css
.attrCell { display: flex; gap: 4px; align-items: center; }
.attrOverflow { font-size: 10px; color: var(--text-muted); }
```
- [ ] **Step 2: Verify build**
Run: `cd ui && npm run build`
Expected: BUILD SUCCESS
- [ ] **Step 3: Commit**
```bash
git add ui/src/pages/Dashboard/Dashboard.tsx
git commit -m "feat: show business attributes as compact badges in dashboard exchanges table"
```
---
## Task 11: Frontend — RouteDetail Recording Toggle and Taps KPI
**Files:**
- Modify: `ui/src/pages/Routes/RouteDetail.tsx`
- Modify: `ui/src/pages/Routes/RouteDetail.module.css`
- [ ] **Step 1: Add recording toggle to route header**
Add imports: `import { useApplicationConfig, useUpdateApplicationConfig } from '../../api/queries/commands'` and `Toggle` from `@cameleer/design-system`.
In the route header section, add a pill-styled container with a Toggle component:
```tsx
const config = useApplicationConfig(appId);
const updateConfig = useUpdateApplicationConfig();
const isRecording = config.data?.routeRecording?.[routeId] !== false; // default true
function toggleRecording() {
if (!config.data) return;
const routeRecording = { ...config.data.routeRecording, [routeId]: !isRecording };
updateConfig.mutate({ ...config.data, routeRecording });
}
```
Render:
```tsx
<div className={styles.recordingPill}>
<span className={styles.recordingLabel}>Recording</span>
<Toggle checked={isRecording} onChange={toggleRecording} />
</div>
```
- [ ] **Step 2: Add "Active Taps" to KPI strip**
Count enabled taps for this route's processors (cross-reference tap processorIds with this route's processor list from diagram data). Add to `kpiItems` array.
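The KPI count is a filter over the config's taps, sketched here with a hypothetical minimal tap shape:

```typescript
interface TapLite { processorId: string; enabled: boolean; }

// Count taps that are enabled and target a processor belonging to this route
function countActiveTaps(taps: TapLite[], routeProcessorIds: Set<string>): number {
  return taps.filter((t) => t.enabled && routeProcessorIds.has(t.processorId)).length;
}
```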
- [ ] **Step 3: Add "Taps" tab to tabs array**
```typescript
const tapCount = 0; // replace with the count of enabled taps targeting this route's processors
const tabs = [
{ label: 'Performance', value: 'performance' },
{ label: 'Recent Executions', value: 'executions', count: exchangeRows.length },
{ label: 'Error Patterns', value: 'errors', count: errorPatterns.length },
{ label: 'Taps', value: 'taps', count: tapCount },
];
```
- [ ] **Step 4: Add CSS for recording pill**
```css
.recordingPill {
display: flex;
align-items: center;
gap: 8px;
background: var(--bg-surface);
border: 1px solid var(--border-subtle);
border-radius: var(--radius-lg);
padding: 6px 12px;
}
.recordingLabel {
font-size: 11px;
color: var(--text-muted);
}
```
- [ ] **Step 5: Verify build**
Run: `cd ui && npm run build`
Expected: BUILD SUCCESS
- [ ] **Step 6: Commit**
```bash
git add ui/src/pages/Routes/
git commit -m "feat: add recording toggle and taps KPI to RouteDetail header"
```
---
## Task 12: Frontend — RouteDetail Taps Tab and Tap Modal
**Files:**
- Modify: `ui/src/pages/Routes/RouteDetail.tsx`
- Modify: `ui/src/pages/Routes/RouteDetail.module.css`
- [ ] **Step 1: Render taps DataTable when "Taps" tab is active**
Filter `config.data.taps` to only taps whose `processorId` exists in this route's diagram. Display in a DataTable with columns: Attribute, Processor, Expression, Language, Target, Type, Enabled (Toggle), Actions.
Empty state: "No taps configured for this route. Add a tap to extract business attributes from exchange data."
- [ ] **Step 2: Build the Add/Edit Tap modal**
State: `tapModalOpen`, `editingTap` (null for new, TapDefinition for edit), form fields.
Modal contents:
- FormField + Input for Attribute Name
- FormField + Select for Processor (options from `useDiagramLayout` node list)
- Two FormFields side-by-side: Select for Language (simple, jsonpath, xpath, jq, groovy) and Select for Target (INPUT, OUTPUT, BOTH)
- FormField + Textarea for Expression (monospace)
- Attribute Type pill selector (4 options, styled as button group)
- Toggle for Enabled
- [ ] **Step 3: Add Test Expression section to tap modal**
Collapsible section (default expanded) with two tabs: "Recent Exchange" and "Custom Payload".
Recent Exchange tab:
- Use `useSearchExecutions` with this route's filter to get recent exchanges as summaries
- Auto-select most recent exchange, then fetch its detail via `useExecutionDetail` to get the `inputBody` for the test payload
- Select dropdown to change exchange
- "Test" button calls `useTestExpression` mutation with the exchange's body
Custom Payload tab:
- Textarea pre-populated from the most recent exchange's body (fetched via detail endpoint)
- Switching from Recent Exchange tab carries the payload over
- "Test" button calls `useTestExpression` mutation
Result display: green box for success, red box for error.
- [ ] **Step 4: Wire up tap save**
On save: update the `taps` array in ApplicationConfig (add new or replace existing by tapId), then call `updateConfig.mutate()`. Generate `tapId` as UUID for new taps.
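The add-or-replace step can be sketched as a pure helper over the taps array (tap shape abbreviated to the identifying field):

```typescript
interface TapRef { tapId: string; attributeName: string; }

// Replace an existing tap by tapId, or append when it is new
function upsertTap<T extends TapRef>(taps: T[], tap: T): T[] {
  const index = taps.findIndex((t) => t.tapId === tap.tapId);
  if (index === -1) return [...taps, tap];
  return taps.map((t, i) => (i === index ? tap : t));
}
```

Returning a new array (rather than mutating) keeps the config object referentially fresh, which is what TanStack Query's cache comparison expects.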
- [ ] **Step 5: Wire up tap delete**
On delete: remove tap from array, call `updateConfig.mutate()`. Import and use `ConfirmDialog` from `@cameleer/design-system` before deleting.
- [ ] **Step 6: Wire up enabled toggle inline**
Toggle in the DataTable row directly calls config update (toggle the specific tap's `enabled` field).
- [ ] **Step 7: Add CSS for taps tab content**
Style the taps header (title + button), tap modal form layout, test expression section, result boxes.
- [ ] **Step 8: Verify build**
Run: `cd ui && npm run build`
Expected: BUILD SUCCESS
- [ ] **Step 9: Commit**
```bash
git add ui/src/pages/Routes/
git commit -m "feat: add taps management tab with CRUD modal and expression testing on RouteDetail"
```
---
## Task 13: Frontend — AppConfigDetailPage Restructure
**Files:**
- Modify: `ui/src/pages/Admin/AppConfigDetailPage.tsx`
- Modify: `ui/src/pages/Admin/AppConfigDetailPage.module.css`
- [ ] **Step 1: Merge Logging + Observability into "Settings" section**
Replace the two separate `SectionHeader` sections with a single "Settings" section. Render all setting badges in a single flex row: Log Forwarding, Engine Level, Payload Capture, Metrics, Sampling Rate, Compress Success (new field).
Edit mode: all badges become dropdowns/toggles as before, plus a new Toggle for `compressSuccess`.
- [ ] **Step 2: Merge Traced Processors + Taps into "Traces & Taps" section**
Build a merged data structure: for each processor that has either a trace override or taps, create a row with Route, Processor, Capture badge, Taps badges.
To resolve processor-to-route mapping: fetch route catalog for this application, then for each route fetch its diagram. Build a `Map<processorId, routeId>` by iterating diagram nodes. For processors not found, show "unknown".
Table columns: Route, Processor, Capture (badge/select in edit mode), Taps (attribute badges with enabled indicators, read-only).
Summary: "N traced · M taps · manage taps on route pages".
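The processor-to-route resolution above can be sketched as follows (the diagram shape is a hypothetical minimal projection of the diagram response):

```typescript
interface DiagramLite { routeId: string; nodeIds: string[]; }

// Build a lookup from processor id to owning route across all fetched diagrams
function buildProcessorRouteMap(diagrams: DiagramLite[]): Map<string, string> {
  const map = new Map<string, string>();
  for (const diagram of diagrams) {
    for (const nodeId of diagram.nodeIds) map.set(nodeId, diagram.routeId);
  }
  return map;
}

// Fall back to "unknown" for processors not present in any diagram
function resolveRoute(map: Map<string, string>, processorId: string): string {
  return map.get(processorId) ?? 'unknown';
}
```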
- [ ] **Step 3: Add "Route Recording" section**
Fetch route list from `useRouteCatalog` filtered by application. Render table with Route name and Toggle.
In view mode: toggles show current state (disabled).
In edit mode: toggles are interactive.
Default for routes not in `routeRecording` map: recording enabled (true).
Summary: "N of M routes recording".
- [ ] **Step 4: Update form state for new fields**
Add `compressSuccess` and `routeRecording` to the form state object and `updateField` handler. Ensure save sends the complete config including new fields.
- [ ] **Step 5: Update CSS for restructured sections**
Adjust section spacing, flex row for merged settings badges.
- [ ] **Step 6: Verify build**
Run: `cd ui && npm run build`
Expected: BUILD SUCCESS
- [ ] **Step 7: Commit**
```bash
git add ui/src/pages/Admin/
git commit -m "feat: restructure AppConfigDetailPage to Settings, Traces & Taps, Route Recording sections"
```
---
## Task 14: Final Build Verification and Push
- [ ] **Step 1: Run full backend build**
Run: `mvn clean compile -q`
Expected: BUILD SUCCESS
- [ ] **Step 2: Run full frontend build**
Run: `cd ui && npm run build`
Expected: BUILD SUCCESS
- [ ] **Step 3: Manual smoke test checklist**
Verify in browser:
- ExchangeDetail shows attributes strip when attributes exist
- ExchangeDetail replay button opens modal, can send replay
- Dashboard table shows attributes column
- RouteDetail shows recording toggle, taps tab with CRUD
- Tap modal test expression section works (if live agent available)
- AppConfigDetailPage shows 3 merged sections
- AppConfigDetailPage edit mode works for compress success and route recording
- [ ] **Step 4: Push to remote**
```bash
git push origin main
```


@@ -0,0 +1,247 @@
# Taps, Business Attributes & Enhanced Replay — UI Design
## Context
The Cameleer3 agent now supports Camel-native data extraction taps, business attributes on executions, enhanced replay with editable payloads, per-route recording toggles, and success compression. The agent-side implementation is deployed and live.
The shared models (`TapDefinition`, extended `ApplicationConfig` with `taps`, `tapVersion`, `routeRecording`, `compressSuccess`) exist in `cameleer3-common` (agent repo). The server already depends on this library and persists `ApplicationConfig` as JSONB in the `application_config` table. However, the server-side execution DTOs (`ExecutionDetail`, `ExecutionSummary`, `ProcessorNode`) do not yet carry `attributes` fields, and the `CommandType` enum lacks `TEST_EXPRESSION`.
This spec covers all UI surfaces and the backend changes needed to support them.
## Design Decisions
| Decision | Choice | Rationale |
|----------|--------|-----------|
| Tap management location | RouteDetail contextual + AppConfigDetail overview | Taps target processors; processor list is contextual to a route. Admin overview for cross-route visibility. |
| Business attributes display | Header badges + per-processor + dashboard table | Primary value of taps — must be front-and-center for quick identification |
| Replay trigger | Button in ExchangeDetail header | Route-level action, clear and discoverable |
| Route recording location | RouteDetail toggle + AppConfigDetail bulk table | Contextual single-route control + centralized bulk management |
| Compress success | Badge in AppConfigDetail Settings section | Simple boolean toggle, admin-level concern |
| Expression testing | Agent-side evaluation via TEST_EXPRESSION command | Only the agent has the Camel expression engine; works for all languages |
| AppConfigDetail layout | 3 sections: Settings, Traces & Taps, Route Recording | Collapsed from 4 sections; Logging+Observability merged, TracedProcessors+Taps merged |
## Prerequisites
Before UI work can begin, the following backend changes are required:
1. **Update `cameleer3-common` dependency** — ensure the server pulls a version that includes `TapDefinition`, and `ApplicationConfig` with `taps`, `tapVersion`, `routeRecording`, `compressSuccess` fields.
2. **Add `attributes` to execution DTOs** — `ExecutionDetail`, `ProcessorNode`, and `ExecutionSummary` need a `Map<String, String> attributes` field. This requires changes to the PostgreSQL ingestion pipeline (store attributes from agent-submitted `RouteExecution`/`ProcessorExecution`), the detail service (reconstruct attributes), and the OpenSearch indexing (index attributes for search results).
3. **Add `TEST_EXPRESSION` to `CommandType`** enum.
4. **Enhance `CommandAckRequest`** — add an optional `data` field (`String`, JSON) to carry structured results (currently only `status` + `message`). The test-expression endpoint needs the result value from the ACK.
5. **Regenerate `openapi.json`** after all backend REST API changes.
## Page Changes
### 1. ExchangeDetail
**Business attributes strip** between header info and stat boxes:
- Route-level attributes as auto-colored badges (`key: value`, monospace)
- Wraps on overflow
- Empty state: section not rendered when no attributes exist
**Per-processor attributes** in processor detail panel:
- Badges below processor info, before message IN/OUT sections
- Shows attributes extracted at that specific processor
**Replay button** in header action area (top-right), primary blue. Requires OPERATOR or ADMIN role:
- Opens large Modal with:
- Warning banner ("This will re-execute the exchange on the selected agent")
- Target Agent select — uses `useAgents(application, 'LIVE')` to populate. Disabled with message when no LIVE agents available.
- Tabs: Headers (editable key-value table with add/remove) | Body (editable monospace textarea, JSON indicator)
- Pre-populated from original exchange's `inputHeaders` and `inputBody` (already available on `ExecutionDetail`)
- Cancel / Replay footer
- Sends REPLAY command via `POST /api/v1/agents/{agentId}/commands`
- Payload: `{ "type": "replay", "payload": { "headers": {...}, "body": "..." } }`
- Success: toast with confirmation message from ACK
- Failure: toast with error message
- Loading state: Replay button shows spinner while awaiting ACK
### 2. Dashboard Exchanges Table
**New "Attributes" column** between App and Exchange ID:
- First 2 attribute values as compact auto-colored badges (value only; key shown via native `title` attribute on hover)
- "+N" overflow indicator when more than 2
- Em-dash when no attributes
### 3. RouteDetail
**Recording toggle** in route header (top-right):
- Toggle in pill container with "Recording" label
- Updates `routeRecording` map in ApplicationConfig via PUT
- Requires OPERATOR or ADMIN role
**"Active Taps" KPI card** added to KPI strip.
**New "Taps" tab** (fourth tab alongside Performance, Recent Executions, Error Patterns):
- Header: "Data Extraction Taps" + "Add Tap" button (OPERATOR or ADMIN only)
- DataTable columns: Attribute, Processor, Expression, Language, Target, Type, Enabled (toggle), Actions (edit/delete)
- Add/edit opens tap modal
- Empty state: "No taps configured for this route. Add a tap to extract business attributes from exchange data."
**Add/Edit Tap modal** (Modal size="md"):
- Fields: Attribute Name (input), Processor (select from route diagram via `useDiagramLayout`), Language + Target (side-by-side selects), Expression (monospace textarea), Attribute Type (pill selector: BUSINESS_OBJECT / CORRELATION / EVENT / CUSTOM), Enabled toggle
- **Test Expression section** (collapsible, default expanded):
- Tabs: "Recent Exchange" | "Custom Payload"
- Recent Exchange: auto-selects most recent exchange with captured data at selected processor. Dropdown to change. Test button sends expression to live agent. Result display.
- Custom Payload: editable textarea pre-populated from most recent exchange body. Switching from Recent Exchange carries the payload over. Test button → result display.
- Result: green success box with extracted value, or red error box with message
- Loading state: spinner on Test button while awaiting agent response
- No agents state: "No LIVE agents available to test expression" with Test button disabled
- Note showing which agent evaluated and which language was used
- Save / Cancel footer
- Save writes the tap to the `taps` array in ApplicationConfig via existing `PUT /api/v1/config/{application}`
### 4. AppConfigDetailPage
Restructured to **3 sections** (from 4):
**Section 1 — Settings:** Merged Logging + Observability. All settings as badges in flex row: Log Forwarding, Engine Level, Payload Capture, Metrics, Sampling Rate, Compress Success (new). Edit mode: badges become dropdowns/toggles.
**Section 2 — Traces & Taps:** Merged Traced Processors + Data Extraction Taps. Table columns: Route, Processor, Capture (badge or em-dash), Taps (attribute name badges with enabled/disabled indicator). Sorted by route. Capture editable in edit mode; taps read-only with "manage taps on route pages" hint. Summary: "N traced · M taps".
Processor-to-route mapping: Taps carry a `processorId` that belongs to a specific route. The route association is derived by cross-referencing with route diagram data (via `useDiagramLayout` per route from the route catalog). If a processor cannot be mapped to a route (e.g., route no longer active), show "unknown" in the Route column.
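The cross-referencing step above can be sketched as a recursive walk over each route's diagram nodes. The node shape and helper names here are illustrative assumptions, not the actual hook return types:

```typescript
// Sketch of the processor-to-route cross-reference. Assumes each route's
// diagram exposes nodes with an `id` and optional nested `children`
// (compound nodes); these names are illustrative.
interface DiagramNodeLike {
  id: string;
  children?: DiagramNodeLike[];
}

function buildProcessorRouteMap(
  diagrams: { routeId: string; nodes: DiagramNodeLike[] }[],
): Map<string, string> {
  const map = new Map<string, string>();
  const visit = (routeId: string, nodes: DiagramNodeLike[]) => {
    for (const node of nodes) {
      map.set(node.id, routeId);
      if (node.children) visit(routeId, node.children); // compounds nest
    }
  };
  for (const { routeId, nodes } of diagrams) visit(routeId, nodes);
  return map;
}

// Lookup with the documented fallback for unmapped processors:
const routeFor = (map: Map<string, string>, processorId: string) =>
  map.get(processorId) ?? 'unknown';
```

Fetching one diagram per catalog route is O(routes) requests; React Query caching of `useDiagramLayout` keeps repeat visits cheap.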
**Section 3 — Route Recording:** Table: Route + Recording toggle. Summary: "N of M routes recording". Toggles editable in edit mode. Route list from `useRouteCatalog` filtered by application. Routes not present in the `routeRecording` map default to recording enabled (consistent with agent behavior where absence = enabled).
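The default-enabled lookup and the summary line reduce to a few lines. `routeRecording` mirrors the `Record<string, boolean>` field on `ApplicationConfig`; the helper names are illustrative:

```typescript
// Absence in the map means recording is on, matching agent behavior.
function isRecording(
  routeRecording: Record<string, boolean>,
  routeId: string,
): boolean {
  return routeRecording[routeId] ?? true;
}

// Produces the "N of M routes recording" summary shown in the section.
function recordingSummary(
  routeRecording: Record<string, boolean>,
  routeIds: string[],
): string {
  const n = routeIds.filter((id) => isRecording(routeRecording, id)).length;
  return `${n} of ${routeIds.length} routes recording`;
}
```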
### 5. AgentHealth Config Bar
No changes. New features managed at AppConfig level, not per-agent.
## RBAC Permissions
| Action | Minimum Role |
|--------|-------------|
| View business attributes | VIEWER |
| View taps / traces / recording state | VIEWER |
| Create / edit / delete taps | OPERATOR |
| Toggle route recording | OPERATOR |
| Edit app config settings | OPERATOR |
| Replay exchange | OPERATOR |
| Test expression | OPERATOR |
These align with the existing pattern where VIEWER sees data and OPERATOR can modify configuration.
## TypeScript Interface Changes
```typescript
// Add to ApplicationConfig in commands.ts
interface ApplicationConfig {
// ... existing fields ...
taps: TapDefinition[]
tapVersion: number
routeRecording: Record<string, boolean>
compressSuccess: boolean
}
interface TapDefinition {
tapId: string
processorId: string
target: 'INPUT' | 'OUTPUT' | 'BOTH'
expression: string
language: string
attributeName: string
attributeType: 'BUSINESS_OBJECT' | 'CORRELATION' | 'EVENT' | 'CUSTOM'
enabled: boolean
version: number
}
```
## Backend Changes
### New Endpoint: Test Expression
`POST /api/v1/config/{application}/test-expression`
Request:
```json
{
"expression": "${body.orderId}",
"language": "simple",
"body": "{\"orderId\": \"ORD-123\"}",
"target": "OUTPUT"
}
```
Response (success):
```json
{ "result": "ORD-123" }
```
Response (failure):
```json
{ "error": "Expression evaluation timed out (50ms limit)" }
```
**Request-reply mechanism:** The server selects a LIVE agent for the application, sends a `TEST_EXPRESSION` command via SSE, then awaits the ACK with a `CompletableFuture` (timeout 5s). The `CommandAckRequest` record is extended with an optional `data` field (JSON string) to carry the evaluation result. The controller completes the future when the ACK arrives, returning the result to the HTTP caller. If no LIVE agent is available or the timeout expires, the endpoint returns an appropriate error response.
### Replay Command Payload
The REPLAY command (already exists in `CommandType`) is sent via `POST /api/v1/agents/{agentId}/commands`:
```json
{
"type": "replay",
"payload": {
"headers": {
"Content-Type": "application/json",
"X-Correlation-Id": "corr-abc123"
},
"body": "{\"orderId\": \"ORD-2024-78542\", ...}"
}
}
```
The agent uses `ProducerTemplate.send()` to replay the exchange on the original route with the provided headers and body.
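A minimal sketch of assembling that command body from the modal's editable header table (the helper is illustrative, not an existing API):

```typescript
// Builds the documented replay command body from the edited modal state.
interface ReplayCommand {
  type: 'replay';
  payload: { headers: Record<string, string>; body: string };
}

function buildReplayCommand(
  headerRows: { key: string; value: string }[],
  body: string,
): ReplayCommand {
  const headers: Record<string, string> = {};
  for (const { key, value } of headerRows) {
    if (key.trim() !== '') headers[key] = value; // skip blank rows from the table
  }
  return { type: 'replay', payload: { headers, body } };
}
```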
### Execution DTO Changes
- **`ExecutionDetail`** — add `Map<String, String> attributes` (route-level aggregated)
- **`ProcessorNode`** — add `Map<String, String> attributes` (per-processor)
- **`ExecutionSummary`** — add `Map<String, String> attributes` (route-level, for dashboard table)
These require:
- PostgreSQL ingestion: store attributes from incoming `RouteExecution` and `ProcessorExecution` (the agent already sends them)
- Detail service: include attributes when reconstructing the execution tree
- OpenSearch indexing: index route-level attributes for search result enrichment
### CommandType Addition
Add `TEST_EXPRESSION` to the `CommandType` enum.
### CommandAckRequest Enhancement
Extend from `(String status, String message)` to `(String status, String message, String data)` where `data` is an optional JSON string for structured results.
## Design System Impact
No new components required. Uses existing: Modal, DataTable, Badge, Toggle, Select, Input, Textarea, FormField, Tabs, Button, CodeBlock, Collapsible.
## Files Touched
### Frontend (ui/src/)
- `api/queries/commands.ts` — TapDefinition interface, extend ApplicationConfig, add test-expression mutation, add replay mutation
- `pages/ExchangeDetail/ExchangeDetail.tsx` — attributes strip, per-processor attributes, replay button + modal
- `pages/ExchangeDetail/ExchangeDetail.module.css` — attributes strip styles, replay modal styles
- `pages/Dashboard/Dashboard.tsx` — attributes column in exchanges table
- `pages/Routes/RouteDetail.tsx` — recording toggle, active taps KPI, taps tab, tap modal with test section
- `pages/Routes/RouteDetail.module.css` — taps tab, recording toggle, tap modal styles
- `pages/Admin/AppConfigDetailPage.tsx` — restructure to 3 sections, traces & taps merged table, route recording table, compress success badge
- `pages/Admin/AppConfigDetailPage.module.css` — updated section styles
### Backend (cameleer3-server-app/)
- `controller/ApplicationConfigController.java` — add test-expression endpoint
- `dto/CommandAckRequest.java` — add optional `data` field
- `controller/AgentCommandController.java` — support CompletableFuture-based ACK for test-expression
### Backend (cameleer3-server-core/)
- `agent/CommandType.java` — add TEST_EXPRESSION
- `detail/ExecutionDetail.java` — add attributes field
- `detail/ProcessorNode.java` — add attributes field
- `search/ExecutionSummary.java` — add attributes field
- `detail/DetailService.java` — include attributes in reconstruction
- `storage/` — store attributes from ingested executions
- `search/SearchService.java` — include attributes in search results
### Generated
- `ui/src/api/schema.d.ts` — regenerate from openapi.json
- `openapi.json` — regenerate after backend changes


@@ -0,0 +1,335 @@
# Interactive Process Diagram — Design Spec
**Sub-project:** 1 of 3 (Component → Execution Overlay → Page Integration)
**Scope:** Interactive SVG diagram component with zoom/pan, node interactions, config badges, and a configurable layout direction. Does NOT include execution overlay or page replacement — those are sub-projects 2 and 3.
---
## Problem
The current RouteFlow component renders Camel routes as a flat vertical list of nodes. It cannot show compound structures (choice branches, split fan-out, try-catch nesting), does not support zoom/pan, and has no interactive controls beyond click-to-select. Routes with 10+ processors become hard to follow, and the relationship between processors is not visually clear.
## Goal
Build an interactive process diagram component styled after MuleSoft / TIBCO BusinessWorks 5, rendering Camel routes as left-to-right flow diagrams using server-computed ELK layout coordinates. The component supports zoom/pan, node hover toolbars for tracing/tap configuration, config badge indicators, and a collapsible detail side-panel.
---
## Decisions
| Decision | Choice | Rationale |
|----------|--------|-----------|
| Rendering | SVG + custom React | Full control over styling, no heavy deps. Server owns layout. |
| Node style | Top-Bar Cards | TIBCO BW5-inspired white cards with colored top accent bar. Professional, clean. |
| Flow direction | Left-to-right (default) | Matches MuleSoft/BW5 conventions. Query param for flexibility. |
| Component location | `ui/src/components/ProcessDiagram/` | Tightly coupled to Cameleer data model, no design-system abstraction needed. |
| Interactions | Hover floating toolbar + click-to-select | Discoverable, no right-click dependency. |
| Error handlers | Below main flow | Clear visual separation, labeled divider. |
| Selection behavior | Side panel with config info; execution data only with overlay | Keeps base diagram focused on topology. |
---
## 1. Backend: Layout Direction Parameter
### Change
Add optional `direction` query parameter to diagram render endpoints.
### Files
- `cameleer3-server-app/.../diagram/ElkDiagramRenderer.java` — accept direction param, map to ELK `Direction.RIGHT` (LR) or `Direction.DOWN` (TB)
- `cameleer3-server-core/.../diagram/DiagramRenderer.java` — update interface to accept direction
- `cameleer3-server-app/.../controller/DiagramRenderController.java` — add `@RequestParam(defaultValue = "LR") String direction` to render endpoints
- `ui/src/api/queries/diagrams.ts` — pass `direction` query param to API calls; also update `DiagramLayout` edge type to match backend `PositionedEdge` serialization: `{ sourceId, targetId, label?, points: number[][] }` (currently defines `{ from?, to? }` which is missing `points` and `label`)
### Behavior
- `GET /diagrams/{contentHash}/render?direction=LR` → left-to-right layout (default)
- `GET /diagrams/{contentHash}/render?direction=TB` → top-to-bottom layout
- `GET /diagrams?application=X&routeId=Y&direction=LR` → same for by-route endpoint
### Compound Node Direction
The direction parameter applies to the **root** layout only. Compound nodes (CHOICE, SPLIT, TRY_CATCH, etc.) keep their internal layout direction as **top-to-bottom** regardless of the root direction. This matches how MuleSoft/BW5 render branching patterns: the main flow goes left-to-right, but branches within a choice or split fan out vertically inside their container.
---
## 2. Frontend: ProcessDiagram Component
### File Structure
```
ui/src/components/ProcessDiagram/
├── ProcessDiagram.tsx # Root: SVG container, zoom/pan, section layout
├── ProcessDiagram.module.css # Styles using design system tokens
├── DiagramNode.tsx # Individual node: top-bar card rendering
├── DiagramEdge.tsx # Edge: cubic Bezier path with arrowhead
├── CompoundNode.tsx # Container for compound types (choice, split)
├── NodeToolbar.tsx # Floating action toolbar on hover
├── ConfigBadge.tsx # Indicator badges (TRACE, TAP) on nodes
├── ErrorSection.tsx # Visual separator + error handler flow section
├── ZoomControls.tsx # HTML overlay: zoom in/out/fit buttons
├── useZoomPan.ts # Hook: viewBox transform, wheel zoom, drag pan
├── useDiagramData.ts # Hook: fetch + separate layout into sections
├── node-colors.ts # NodeType → design system color token mapping
├── types.ts # Shared TypeScript interfaces
└── index.ts # Public exports
```
### Props API
```typescript
interface ProcessDiagramProps {
application: string;
routeId: string;
direction?: 'LR' | 'TB'; // default 'LR'
selectedNodeId?: string; // controlled selection
onNodeSelect?: (nodeId: string) => void;
onNodeAction?: (nodeId: string, action: NodeAction) => void;
nodeConfigs?: Map<string, NodeConfig>; // active taps/tracing per processor
className?: string;
}
type NodeAction = 'inspect' | 'toggle-trace' | 'configure-tap' | 'copy-id';
interface NodeConfig {
traceEnabled?: boolean;
tapExpression?: string;
}
// ExecutionOverlay types will be added in sub-project 2 when needed.
// No forward-declared types here to avoid drift.
```
### SVG Structure
```
<div class="process-diagram">
<svg viewBox="..."> // zoom = viewBox transform
<defs> // arrowhead markers, filters
<marker id="arrow">...</marker>
</defs>
<g class="diagram-content"> // pan offset transform
<!-- Main Route section -->
<g class="section section--main">
<g class="edges"> // rendered first (behind nodes)
<path d="M ... C ..." /> // cubic bezier from ELK waypoints
</g>
<g class="nodes">
<g transform="translate(x,y)"> // ELK-computed position
<!-- DiagramNode: top-bar card -->
<!-- ConfigBadge: top-right corner pills -->
<!-- NodeToolbar: foreignObject on hover -->
</g>
<g class="compound"> // CompoundNode: dashed border container
<g transform="translate(...)"> <!-- children inside -->
</g>
</g>
</g>
<!-- Error Handler section(s) -->
<g class="section section--error"
transform="translate(0, mainHeight + gap)">
<text>onException: java.lang.Exception</text>
<line ... /> // divider
<g class="edges">...</g>
<g class="nodes">...</g>
</g>
</g>
</svg>
<div class="zoom-controls">...</div> // HTML overlay, bottom-right
</div>
```
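Converting ELK edge waypoints into the `<path d>` attribute might look like this (one plausible mapping, not the component's actual implementation):

```typescript
// Converts ELK edge waypoints ([[x, y], ...]) into an SVG path string.
// A single cubic segment is emitted when exactly four points are present
// (start, two control points, end); otherwise the waypoints are joined
// as a polyline.
function edgePath(points: number[][]): string {
  if (points.length === 0) return '';
  const [first, ...rest] = points;
  let d = `M ${first[0]} ${first[1]}`;
  if (points.length === 4) {
    const [c1, c2, end] = rest;
    d += ` C ${c1[0]} ${c1[1]}, ${c2[0]} ${c2[1]}, ${end[0]} ${end[1]}`;
  } else {
    for (const [x, y] of rest) d += ` L ${x} ${y}`;
  }
  return d;
}
```

The resulting string feeds directly into `<path d={...} markerEnd="url(#arrow)" />` in `DiagramEdge`.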
---
## 3. Node Visual States
### Base States
| State | Visual |
|-------|--------|
| Normal | White card, `--border` (#E4DFD8), colored top bar per type |
| Hovered | Warm tint background (`--bg-hover` / #F5F0EA), stronger border, floating toolbar appears above |
| Selected | Amber selection ring (2.5px solid `--amber`), side panel opens |
### Config Badges
Small colored pill badges positioned at the top-right corner of the node card, always visible:
- **TRACE** — teal (`--running`) pill, shown when tracing is enabled
- **TAP** — purple (`--purple`) pill, shown when a tap expression is configured
### Execution Overlay States (sub-project 2 — node must support these props)
| State | Visual |
|-------|--------|
| Executed (OK) | Green left border or subtle green tint |
| Failed (caused error handler) | Red border (2px `--error`), red marker icon |
| Not executed | Dimmed (reduced opacity) |
| Has trace data | Small "data available" indicator icon |
| No trace data | No indicator (or grayed-out data icon) |
### Node Type Colors
| Category | Token | Hex | Types |
|----------|-------|-----|-------|
| Endpoints | `--running` | #1A7F8E teal | ENDPOINT |
| Processors | `--amber` | #C6820E | PROCESSOR, BEAN, LOG, SET_HEADER, SET_BODY, TRANSFORM, MARSHAL, UNMARSHAL |
| Targets | `--success` | #3D7C47 green | TO, TO_DYNAMIC, DIRECT, SEDA |
| EIP Patterns | `--purple` | #7C3AED | EIP_CHOICE, EIP_WHEN, EIP_OTHERWISE, EIP_SPLIT, EIP_MULTICAST, EIP_LOOP, EIP_AGGREGATE, EIP_FILTER, etc. |
| Error Handling | `--error` | #C0392B red | ERROR_HANDLER, ON_EXCEPTION, TRY_CATCH, DO_TRY, DO_CATCH, DO_FINALLY |
| Cross-Route | (hardcoded) | #06B6D4 cyan | EIP_WIRE_TAP, EIP_ENRICH, EIP_POLL_ENRICH |
Note: This frontend color mapping intentionally differs from the backend `ElkDiagramRenderer` SVG colors (which use blue for endpoints, green for processors). The frontend uses design system tokens for consistency with the rest of the UI. The backend SVG renderer is not changed.
### Compound Node Rendering
Compound types (CHOICE, SPLIT, TRY_CATCH, LOOP, etc.) render as:
- Full-width colored header bar with white text label (type name)
- White body area with subtle border matching the type color
- Children rendered inside at their ELK-relative positions
- Children have their own hover/select/badge behavior
---
## 4. Interactions
### Hover Floating Toolbar
On mouse enter over a node, a dark floating toolbar appears above the node (centered). Uses `<foreignObject>` for HTML accessibility.
| Icon | Action | Callback |
|------|--------|----------|
| Search | Inspect | `onNodeAction(id, 'inspect')` — selects node, opens side panel |
| T | Toggle Trace | `onNodeAction(id, 'toggle-trace')` — enables/disables tracing |
| Pencil | Configure Tap | `onNodeAction(id, 'configure-tap')` — opens tap config |
| ... | More | `onNodeAction(id, 'copy-id')` — copies processor ID |
Toolbar hides on mouse leave after a short delay (150ms) to prevent flicker when moving between node and toolbar.
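The 150ms grace period can be sketched framework-agnostically (a React version would hold the timer in a ref; the names here are illustrative):

```typescript
// Hover-intent controller: hide only if the pointer does not re-enter
// the node or toolbar within delayMs.
function createHoverIntent(delayMs = 150) {
  let timer: ReturnType<typeof setTimeout> | null = null;
  return {
    // Entering node or toolbar cancels any pending hide.
    enter(show: () => void) {
      if (timer !== null) {
        clearTimeout(timer);
        timer = null;
      }
      show();
    },
    // Leaving schedules the hide; re-entering within delayMs cancels it.
    leave(hide: () => void) {
      timer = setTimeout(() => {
        timer = null;
        hide();
      }, delayMs);
    },
    pending: () => timer !== null,
  };
}
```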
### Click-to-Select
Click on a node → calls `onNodeSelect(nodeId)`. Parent controls `selectedNodeId` prop. Selected node shows amber ring.
### Zoom & Pan
**`useZoomPan` hook manages:**
- Mouse wheel → zoom centered on cursor
- Click+drag on background → pan
- Pinch gesture → zoom (trackpad/touch)
- State: `{ scale, translateX, translateY }`
- Applied to SVG `viewBox` attribute
**`ZoomControls` component:**
- Three buttons: `+` (zoom in), `-` (zoom out), fit-to-view icon
- Positioned as HTML overlay at bottom-right of diagram container
- Fit-to-view calculates viewBox to show entire diagram with 40px padding
**Zoom limits:** 25% to 400%.
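The cursor-anchored zoom with the 25%–400% clamp reduces to a small state transform (a sketch; the real hook also handles pan and pinch):

```typescript
// Zoom state as described for useZoomPan; the function name is illustrative.
interface ZoomState {
  scale: number;
  translateX: number;
  translateY: number;
}

const MIN_SCALE = 0.25;
const MAX_SCALE = 4;

function zoomAt(
  state: ZoomState,
  cursorX: number,
  cursorY: number,
  factor: number,
): ZoomState {
  const scale = Math.min(MAX_SCALE, Math.max(MIN_SCALE, state.scale * factor));
  // Keep the diagram point under the cursor fixed while scaling.
  const worldX = (cursorX - state.translateX) / state.scale;
  const worldY = (cursorY - state.translateY) / state.scale;
  return {
    scale,
    translateX: cursorX - worldX * scale,
    translateY: cursorY - worldY * scale,
  };
}
```

A wheel handler would typically derive `factor` from `Math.exp(-event.deltaY * k)` for smooth trackpad behavior.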
### Keyboard Navigation
**Required:**
| Key | Action |
|-----|--------|
| Escape | Deselect / close panel |
| +/- | Zoom in/out |
| 0 | Fit to view |
**Stretch (implement if time permits):**
| Key | Action |
|-----|--------|
| Arrow keys | Move selection between connected nodes |
| Tab | Cycle through nodes in flow order |
| Enter | Open detail panel for selected node |
---
## 5. Error Handler Sections
Error handler compounds (ON_EXCEPTION, ERROR_HANDLER) render as separate sections below the main flow:
1. **Divider:** Horizontal line with label text (e.g., "onException: java.lang.Exception")
2. **Gap:** 40px vertical gap between main section and error section
3. **Layout:** Error section gets its own ELK-computed layout (compound node children already have relative coordinates)
4. **Styling:** Same node rendering as main section, but the section background has a subtle red tint
5. **Multiple handlers:** Each ON_EXCEPTION becomes its own section, stacked vertically
The `useDiagramData` hook separates top-level compound error nodes from regular nodes, computing the Y offset for each error section based on accumulated heights.
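A sketch of that separation, under the assumption that top-level layout nodes carry a `type` and an ELK-computed `height` (names and the error-type predicate are illustrative, adapted from the `buildFlowSegments` idea):

```typescript
// Splits top-level nodes into the main flow and stacked error sections,
// accumulating a Y offset per error handler.
interface LayoutNodeLike {
  id: string;
  type: string;
  height: number;
  label?: string;
}

const ERROR_TYPES = new Set(['ON_EXCEPTION', 'ERROR_HANDLER']);
const SECTION_GAP = 40;

function splitSections(nodes: LayoutNodeLike[], mainHeight: number) {
  const mainNodes = nodes.filter((n) => !ERROR_TYPES.has(n.type));
  let offsetY = mainHeight + SECTION_GAP;
  const errorSections = nodes
    .filter((n) => ERROR_TYPES.has(n.type))
    .map((n) => {
      const section = { node: n, offsetY };
      offsetY += n.height + SECTION_GAP; // stack handlers vertically
      return section;
    });
  return { mainNodes, errorSections, totalHeight: offsetY - SECTION_GAP };
}
```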
---
## 6. Data Flow
```
useDiagramByRoute(app, routeId)
→ contentHash
→ useDiagramLayout(contentHash, direction)
→ DiagramLayout { nodes[], edges[], width, height }
useDiagramData hook:
1. Separate nodes into mainNodes[] and errorSections[]
(reuses logic from buildFlowSegments: error-handler compounds with children → error sections)
2. Filter edges: mainEdges (between main nodes), errorEdges (within each error section)
3. Compute total SVG dimensions: max(mainWidth, errorWidths) × (mainHeight + gap + errorHeights)
4. Return { mainNodes, mainEdges, errorSections, totalWidth, totalHeight }
```
The existing `diagram-mapping.ts` `buildFlowSegments` function handles the separation logic. The new `useDiagramData` hook adapts this for SVG coordinate-based rendering instead of RouteFlow's FlowSegment format.
---
## 7. Side Panel (Detail Panel)
When a node is selected, a collapsible side panel slides in from the right of the diagram container.
**Base mode (no execution overlay):**
- Processor ID
- Processor type
- Endpoint URI (if applicable)
- Active configuration: tracing status, tap expression
- Node metadata from the diagram
**With execution overlay (sub-project 2):**
- Execution status + duration
- Input/output body (if trace data captured)
- Input/output headers
- Error message + stack trace (if failed)
- Loop iteration selector (if inside a loop)
For sub-project 1, the side panel shows config info only. The component accepts an `onNodeSelect` callback — the parent page controls what appears in the panel.
The side panel is NOT part of the ProcessDiagram component itself. It is rendered by the parent page and controlled via the `selectedNodeId` / `onNodeSelect` props. This keeps the diagram component focused on visualization.
**Dev test page (`/dev/diagram`):** In sub-project 1, the test page renders the ProcessDiagram with a simple stub side panel that shows the selected node's ID, type, label, and any `nodeConfigs` entry. This validates the selection interaction without needing full page integration.
---
## 8. Non-Goals (Sub-project 2 & 3)
These are explicitly out of scope for sub-project 1:
- **Execution overlay rendering** — animated flow, per-node status/duration, dimming non-executed nodes
- **Loop/split iteration stepping** — "debugger" UI with iteration tabs
- **Page integration** — replacing RouteFlow on RouteDetail, ExchangeDetail, Dashboard
- **Minimap** — small overview for large diagrams (stretch goal, not v1)
- **Drag to rearrange** — nodes are server-positioned, not user-movable
---
## Verification
1. **Backend:** `mvn clean verify -DskipITs` passes after direction param addition
2. **Frontend types:** `npx tsc -p tsconfig.app.json --noEmit` passes
3. **Manual test:** Create a temporary test page or Storybook-like route (`/dev/diagram`) that renders the ProcessDiagram component with a known route
4. **Zoom/pan:** Mouse wheel zooms, drag pans, fit-to-view works
5. **Node interaction:** Hover shows toolbar, click selects with amber ring
6. **Config badges:** Pass mock `nodeConfigs` and verify TRACE/TAP pills render
7. **Error sections:** Route with ON_EXCEPTION renders error handler below main flow
8. **Compound nodes:** Route with CHOICE renders children inside dashed container
9. **Keyboard (required):** Escape deselects, +/- zooms, 0 fits to view
10. **Direction:** `?direction=TB` renders top-to-bottom layout


@@ -58,6 +58,9 @@
<repository>
<id>gitea</id>
<url>https://gitea.siegeln.net/api/packages/cameleer/maven</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>


@@ -4,11 +4,15 @@ WORKDIR /app
ARG REGISTRY_TOKEN
COPY package.json package-lock.json .npmrc ./
RUN echo "//gitea.siegeln.net/api/packages/cameleer/npm/:_authToken=${REGISTRY_TOKEN}" >> .npmrc && \
npm ci && \
rm -f .npmrc
npm ci
COPY . .
# Upgrade design system to latest dev snapshot (after COPY to bust Docker cache)
RUN echo "//gitea.siegeln.net/api/packages/cameleer/npm/:_authToken=${REGISTRY_TOKEN}" >> .npmrc && \
npm install @cameleer/design-system@dev && \
rm -f .npmrc
ARG VITE_ENV_NAME=PRODUCTION
ENV VITE_ENV_NAME=$VITE_ENV_NAME
RUN npm run build

ui/package-lock.json generated

@@ -8,7 +8,7 @@
"name": "ui",
"version": "0.0.0",
"dependencies": {
"@cameleer/design-system": "^0.0.3",
"@cameleer/design-system": "^0.1.17",
"@tanstack/react-query": "^5.90.21",
"openapi-fetch": "^0.17.0",
"react": "^19.2.4",
@@ -276,9 +276,9 @@
}
},
"node_modules/@cameleer/design-system": {
"version": "0.0.3",
"resolved": "https://gitea.siegeln.net/api/packages/cameleer/npm/%40cameleer%2Fdesign-system/-/0.0.3/design-system-0.0.3.tgz",
"integrity": "sha512-x1mZvgYz7j57xFB26pMh9hn5waSJA1CcRWTgkzleLfaO/CmhekLup1HHlbh0b9SxVci6g2HzbcJldr4kvM1yzg==",
"version": "0.1.17",
"resolved": "https://gitea.siegeln.net/api/packages/cameleer/npm/%40cameleer%2Fdesign-system/-/0.1.17/design-system-0.1.17.tgz",
"integrity": "sha512-THK6yN+xSrxEJadEQ4AZiVhPvoI2rq6gvmMonpxVhUw93dOPO5p06pRS5csJc1miFD1thOrazsoDzSTAbNaELw==",
"dependencies": {
"react": "^19.0.0",
"react-dom": "^19.0.0",
@@ -2934,9 +2934,9 @@
}
},
"node_modules/react-router": {
"version": "7.13.1",
"resolved": "https://registry.npmjs.org/react-router/-/react-router-7.13.1.tgz",
"integrity": "sha512-td+xP4X2/6BJvZoX6xw++A2DdEi++YypA69bJUV5oVvqf6/9/9nNlD70YO1e9d3MyamJEBQFEzk6mbfDYbqrSA==",
"version": "7.13.2",
"resolved": "https://registry.npmjs.org/react-router/-/react-router-7.13.2.tgz",
"integrity": "sha512-tX1Aee+ArlKQP+NIUd7SE6Li+CiGKwQtbS+FfRxPX6Pe4vHOo6nr9d++u5cwg+Z8K/x8tP+7qLmujDtfrAoUJA==",
"license": "MIT",
"dependencies": {
"cookie": "^1.0.1",
@@ -2956,12 +2956,12 @@
}
},
"node_modules/react-router-dom": {
"version": "7.13.1",
"resolved": "https://registry.npmjs.org/react-router-dom/-/react-router-dom-7.13.1.tgz",
"integrity": "sha512-UJnV3Rxc5TgUPJt2KJpo1Jpy0OKQr0AjgbZzBFjaPJcFOb2Y8jA5H3LT8HUJAiRLlWrEXWHbF1Z4SCZaQjWDHw==",
"version": "7.13.2",
"resolved": "https://registry.npmjs.org/react-router-dom/-/react-router-dom-7.13.2.tgz",
"integrity": "sha512-aR7SUORwTqAW0JDeiWF07e9SBE9qGpByR9I8kJT5h/FrBKxPMS6TiC7rmVO+gC0q52Bx7JnjWe8Z1sR9faN4YA==",
"license": "MIT",
"dependencies": {
"react-router": "7.13.1"
"react-router": "7.13.2"
},
"engines": {
"node": ">=20.0.0"


@@ -14,7 +14,7 @@
"generate-api:live": "curl -s http://localhost:8081/api/v1/api-docs -o src/api/openapi.json && openapi-typescript src/api/openapi.json -o src/api/schema.d.ts"
},
"dependencies": {
"@cameleer/design-system": "^0.0.3",
"@cameleer/design-system": "^0.1.17",
"@tanstack/react-query": "^5.90.21",
"openapi-fetch": "^0.17.0",
"react": "^19.2.4",

File diff suppressed because one or more lines are too long


@@ -1,5 +1,6 @@
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import { adminFetch } from './admin-api';
import { useRefreshInterval } from '../use-refresh-interval';
// ── Types ──────────────────────────────────────────────────────────────
@@ -38,34 +39,38 @@ export interface ActiveQuery {
// ── Query Hooks ────────────────────────────────────────────────────────
export function useDatabaseStatus() {
const refetchInterval = useRefreshInterval(30_000);
return useQuery({
queryKey: ['admin', 'database', 'status'],
queryFn: () => adminFetch<DatabaseStatus>('/database/status'),
refetchInterval: 30_000,
refetchInterval,
});
}
export function useConnectionPool() {
const refetchInterval = useRefreshInterval(10_000);
return useQuery({
queryKey: ['admin', 'database', 'pool'],
queryFn: () => adminFetch<PoolStats>('/database/pool'),
refetchInterval: 10_000,
refetchInterval,
});
}
export function useDatabaseTables() {
const refetchInterval = useRefreshInterval(60_000);
return useQuery({
queryKey: ['admin', 'database', 'tables'],
queryFn: () => adminFetch<TableInfo[]>('/database/tables'),
refetchInterval: 60_000,
refetchInterval,
});
}
export function useActiveQueries() {
const refetchInterval = useRefreshInterval(5_000);
return useQuery({
queryKey: ['admin', 'database', 'queries'],
queryFn: () => adminFetch<ActiveQuery[]>('/database/queries'),
refetchInterval: 5_000,
refetchInterval,
});
}

View File

@@ -1,14 +1,15 @@
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
import { adminFetch } from './admin-api';
import { useRefreshInterval } from '../use-refresh-interval';
// ── Types ──────────────────────────────────────────────────────────────
export interface OpenSearchStatus {
connected: boolean;
reachable: boolean;
clusterHealth: string;
version: string | null;
numberOfNodes: number;
url: string;
nodeCount: number;
host: string;
}
export interface PipelineStats {
@@ -53,28 +54,31 @@ export interface PerformanceStats {
// ── Query Hooks ────────────────────────────────────────────────────────
export function useOpenSearchStatus() {
const refetchInterval = useRefreshInterval(30_000);
return useQuery({
queryKey: ['admin', 'opensearch', 'status'],
queryFn: () => adminFetch<OpenSearchStatus>('/opensearch/status'),
refetchInterval: 30_000,
refetchInterval,
});
}
export function usePipelineStats() {
const refetchInterval = useRefreshInterval(10_000);
return useQuery({
queryKey: ['admin', 'opensearch', 'pipeline'],
queryFn: () => adminFetch<PipelineStats>('/opensearch/pipeline'),
refetchInterval: 10_000,
refetchInterval,
});
}
export function useOpenSearchIndices(page = 0, size = 20, search = '') {
export function useOpenSearchIndices(page = 0, size = 20, search = '', prefix = 'executions') {
return useQuery({
queryKey: ['admin', 'opensearch', 'indices', page, size, search],
queryKey: ['admin', 'opensearch', 'indices', prefix, page, size, search],
queryFn: () => {
const params = new URLSearchParams();
params.set('page', String(page));
params.set('size', String(size));
params.set('prefix', prefix);
if (search) params.set('search', search);
return adminFetch<IndicesPage>(`/opensearch/indices?${params}`);
},
@@ -83,10 +87,11 @@ export function useOpenSearchIndices(page = 0, size = 20, search = '') {
}
export function useOpenSearchPerformance() {
const refetchInterval = useRefreshInterval(30_000);
return useQuery({
queryKey: ['admin', 'opensearch', 'performance'],
queryFn: () => adminFetch<PerformanceStats>('/opensearch/performance'),
refetchInterval: 30_000,
refetchInterval,
});
}

View File

@@ -1,8 +1,10 @@
import { useQuery } from '@tanstack/react-query';
import { config } from '../../config';
import { useAuthStore } from '../../auth/auth-store';
import { useRefreshInterval } from './use-refresh-interval';
export function useAgentMetrics(agentId: string | null, names: string[], buckets = 60) {
const refetchInterval = useRefreshInterval(30_000);
return useQuery({
queryKey: ['agent-metrics', agentId, names.join(','), buckets],
queryFn: async () => {
@@ -21,6 +23,6 @@ export function useAgentMetrics(agentId: string | null, names: string[], buckets
return res.json() as Promise<{ metrics: Record<string, Array<{ time: string; value: number }>> }>;
},
enabled: !!agentId && names.length > 0,
refetchInterval: 30_000,
refetchInterval,
});
}

View File

@@ -2,8 +2,10 @@ import { useQuery } from '@tanstack/react-query';
import { api } from '../client';
import { config } from '../../config';
import { useAuthStore } from '../../auth/auth-store';
import { useRefreshInterval } from './use-refresh-interval';
export function useAgents(status?: string, application?: string) {
const refetchInterval = useRefreshInterval(10_000);
return useQuery({
queryKey: ['agents', status, application],
queryFn: async () => {
@@ -13,18 +15,20 @@ export function useAgents(status?: string, application?: string) {
if (error) throw new Error('Failed to load agents');
return data!;
},
refetchInterval: 10_000,
refetchInterval,
});
}
export function useAgentEvents(appId?: string, agentId?: string, limit = 50) {
export function useAgentEvents(appId?: string, agentId?: string, limit = 50, toOverride?: string) {
const refetchInterval = useRefreshInterval(15_000);
return useQuery({
queryKey: ['agents', 'events', appId, agentId, limit],
queryKey: ['agents', 'events', appId, agentId, limit, toOverride],
queryFn: async () => {
const token = useAuthStore.getState().accessToken;
const params = new URLSearchParams();
if (appId) params.set('appId', appId);
if (agentId) params.set('agentId', agentId);
if (toOverride) params.set('to', toOverride);
params.set('limit', String(limit));
const res = await fetch(`${config.apiBaseUrl}/agents/events-log?${params}`, {
headers: {
@@ -35,6 +39,6 @@ export function useAgentEvents(appId?: string, agentId?: string, limit = 50) {
if (!res.ok) throw new Error('Failed to load agent events');
return res.json();
},
refetchInterval: 15_000,
refetchInterval,
});
}

View File

@@ -1,13 +1,19 @@
import { useQuery } from '@tanstack/react-query';
import { config } from '../../config';
import { useAuthStore } from '../../auth/auth-store';
import { useRefreshInterval } from './use-refresh-interval';
export function useRouteCatalog() {
export function useRouteCatalog(from?: string, to?: string) {
const refetchInterval = useRefreshInterval(15_000);
return useQuery({
queryKey: ['routes', 'catalog'],
queryKey: ['routes', 'catalog', from, to],
queryFn: async () => {
const token = useAuthStore.getState().accessToken;
const res = await fetch(`${config.apiBaseUrl}/routes/catalog`, {
const params = new URLSearchParams();
if (from) params.set('from', from);
if (to) params.set('to', to);
const qs = params.toString();
const res = await fetch(`${config.apiBaseUrl}/routes/catalog${qs ? `?${qs}` : ''}`, {
headers: {
Authorization: `Bearer ${token}`,
'X-Cameleer-Protocol-Version': '1',
@@ -16,11 +22,13 @@ export function useRouteCatalog() {
if (!res.ok) throw new Error('Failed to load route catalog');
return res.json();
},
refetchInterval: 15_000,
placeholderData: (prev) => prev,
refetchInterval,
});
}
export function useRouteMetrics(from?: string, to?: string, appId?: string) {
const refetchInterval = useRefreshInterval(30_000);
return useQuery({
queryKey: ['routes', 'metrics', from, to, appId],
queryFn: async () => {
@@ -38,6 +46,6 @@ export function useRouteMetrics(from?: string, to?: string, appId?: string) {
if (!res.ok) throw new Error('Failed to load route metrics');
return res.json();
},
refetchInterval: 30_000,
refetchInterval,
});
}
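The conditional query-string construction in `useRouteCatalog` above can be sketched as a small pure helper; `catalogUrl` is an illustrative name, not part of the codebase:

```typescript
// Builds the catalog URL, appending `from`/`to` only when provided —
// mirrors the URLSearchParams pattern used in useRouteCatalog above.
function catalogUrl(baseUrl: string, from?: string, to?: string): string {
  const params = new URLSearchParams();
  if (from) params.set('from', from);
  if (to) params.set('to', to);
  const qs = params.toString();
  return `${baseUrl}/routes/catalog${qs ? `?${qs}` : ''}`;
}
```

Omitting the `?` when no filters are set keeps the unfiltered URL identical to the pre-change endpoint.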

View File

@@ -0,0 +1,178 @@
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query'
import { api } from '../client'
import { useAuthStore } from '../../auth/auth-store'
// ── Application Config ────────────────────────────────────────────────────
export interface TapDefinition {
tapId: string
processorId: string
target: 'INPUT' | 'OUTPUT' | 'BOTH'
expression: string
language: string
attributeName: string
attributeType: 'BUSINESS_OBJECT' | 'CORRELATION' | 'EVENT' | 'CUSTOM'
enabled: boolean
version: number
}
export interface ApplicationConfig {
application: string
version: number
updatedAt?: string
engineLevel?: string
payloadCaptureMode?: string
applicationLogLevel?: string
agentLogLevel?: string
metricsEnabled: boolean
samplingRate: number
tracedProcessors: Record<string, string>
taps: TapDefinition[]
tapVersion: number
routeRecording: Record<string, boolean>
compressSuccess: boolean
}
/** Authenticated fetch using the JWT from auth store */
function authFetch(url: string, init?: RequestInit): Promise<Response> {
const token = useAuthStore.getState().accessToken
const headers = new Headers(init?.headers)
if (token) headers.set('Authorization', `Bearer ${token}`)
headers.set('X-Cameleer-Protocol-Version', '1')
return fetch(url, { ...init, headers })
}
export function useAllApplicationConfigs() {
return useQuery({
queryKey: ['applicationConfig', 'all'],
queryFn: async () => {
const res = await authFetch('/api/v1/config')
if (!res.ok) throw new Error('Failed to fetch configs')
return res.json() as Promise<ApplicationConfig[]>
},
})
}
export function useApplicationConfig(application: string | undefined) {
return useQuery({
queryKey: ['applicationConfig', application],
queryFn: async () => {
const res = await authFetch(`/api/v1/config/${application}`)
if (!res.ok) throw new Error('Failed to fetch config')
return res.json() as Promise<ApplicationConfig>
},
enabled: !!application,
})
}
export function useUpdateApplicationConfig() {
const queryClient = useQueryClient()
return useMutation({
mutationFn: async (config: ApplicationConfig) => {
const res = await authFetch(`/api/v1/config/${config.application}`, {
method: 'PUT',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(config),
})
if (!res.ok) throw new Error('Failed to update config')
return res.json() as Promise<ApplicationConfig>
},
onSuccess: (saved) => {
queryClient.setQueryData(['applicationConfig', saved.application], saved)
queryClient.invalidateQueries({ queryKey: ['applicationConfig', 'all'] })
},
})
}
// ── Processor → Route Mapping ─────────────────────────────────────────────
export function useProcessorRouteMapping(application?: string) {
return useQuery({
queryKey: ['config', application, 'processor-routes'],
queryFn: async () => {
const res = await authFetch(`/api/v1/config/${application}/processor-routes`)
if (!res.ok) throw new Error('Failed to fetch processor-route mapping')
return res.json() as Promise<Record<string, string>>
},
enabled: !!application,
})
}
// ── Generic Group Command (kept for non-config commands) ──────────────────
interface SendGroupCommandParams {
group: string
type: string
payload: Record<string, unknown>
}
export function useSendGroupCommand() {
return useMutation({
mutationFn: async ({ group, type, payload }: SendGroupCommandParams) => {
const { data, error } = await api.POST('/agents/groups/{group}/commands', {
params: { path: { group } },
body: { type, payload } as any,
})
if (error) throw new Error('Failed to send command')
return data!
},
})
}
// ── Test Expression ───────────────────────────────────────────────────────
export function useTestExpression() {
return useMutation({
mutationFn: async ({
application,
expression,
language,
body,
target,
}: {
application: string
expression: string
language: string
body: string
target: string
}) => {
const res = await authFetch(
`/api/v1/config/${encodeURIComponent(application)}/test-expression`,
{
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({ expression, language, body, target }),
},
)
if (!res.ok) {
if (res.status === 404) throw new Error('No live agent available')
if (res.status === 504) throw new Error('Expression test timed out')
throw new Error('Failed to test expression')
}
return res.json() as Promise<{ result?: string; error?: string }>
},
})
}
// ── Replay Exchange ───────────────────────────────────────────────────────
export function useReplayExchange() {
return useMutation({
mutationFn: async ({
agentId,
headers,
body,
}: {
agentId: string
headers: Record<string, string>
body: string
}) => {
const { data, error } = await api.POST('/agents/{id}/commands', {
params: { path: { id: agentId } },
body: { type: 'replay', payload: { headers, body } } as any,
})
if (error) throw new Error('Failed to send replay command')
return data!
},
})
}
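The header layering done by the `authFetch` helper above can be isolated as a pure function; a sketch (`buildHeaders` is illustrative, not from the codebase):

```typescript
// Layers the bearer token (when present) and the protocol-version header
// on top of any caller-supplied headers, as authFetch does above.
function buildHeaders(token: string | null, init?: HeadersInit): Headers {
  const headers = new Headers(init);
  if (token) headers.set('Authorization', `Bearer ${token}`);
  headers.set('X-Cameleer-Protocol-Version', '1');
  return headers;
}
```

Caller-supplied headers survive the merge, so a `Content-Type` set for a PUT body is not clobbered.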

View File

@@ -1,19 +1,43 @@
import { useQuery } from '@tanstack/react-query';
import { api } from '../client';
interface DiagramLayout {
export interface DiagramNode {
id?: string;
label?: string;
type?: string;
x?: number;
y?: number;
width?: number;
height?: number;
nodes?: Array<{ id?: string; label?: string; type?: string; x?: number; y?: number; width?: number; height?: number }>;
edges?: Array<{ from?: string; to?: string }>;
children?: DiagramNode[];
}
export function useDiagramLayout(contentHash: string | null) {
export interface DiagramEdge {
sourceId: string;
targetId: string;
label?: string;
points: number[][];
}
export interface DiagramLayout {
width?: number;
height?: number;
nodes?: DiagramNode[];
edges?: DiagramEdge[];
}
export function useDiagramLayout(
contentHash: string | null,
direction: 'LR' | 'TB' = 'LR',
) {
return useQuery({
queryKey: ['diagrams', 'layout', contentHash],
queryKey: ['diagrams', 'layout', contentHash, direction],
queryFn: async () => {
const { data, error } = await api.GET('/diagrams/{contentHash}/render', {
params: { path: { contentHash: contentHash! } },
params: {
path: { contentHash: contentHash! },
query: { direction },
},
headers: { Accept: 'application/json' },
});
if (error) throw new Error('Failed to load diagram layout');
@@ -23,15 +47,19 @@ export function useDiagramLayout(contentHash: string | null) {
});
}
export function useDiagramByRoute(application: string | undefined, routeId: string | undefined) {
export function useDiagramByRoute(
application: string | undefined,
routeId: string | undefined,
direction: 'LR' | 'TB' = 'LR',
) {
return useQuery({
queryKey: ['diagrams', 'byRoute', application, routeId],
queryKey: ['diagrams', 'byRoute', application, routeId, direction],
queryFn: async () => {
const { data, error } = await api.GET('/diagrams', {
params: { query: { application: application!, routeId: routeId! } },
params: { query: { application: application!, routeId: routeId!, direction } },
});
if (error) throw new Error('Failed to load diagram for route');
return data!;
return data as DiagramLayout;
},
enabled: !!application && !!routeId,
});

View File

@@ -1,6 +1,7 @@
import { useQuery } from '@tanstack/react-query';
import { api } from '../client';
import type { SearchRequest } from '../types';
import { useLiveQuery } from './use-refresh-interval';
export function useExecutionStats(
timeFrom: string | undefined,
@@ -8,6 +9,7 @@ export function useExecutionStats(
routeId?: string,
application?: string,
) {
const live = useLiveQuery(10_000);
return useQuery({
queryKey: ['executions', 'stats', timeFrom, timeTo, routeId, application],
queryFn: async () => {
@@ -24,13 +26,14 @@ export function useExecutionStats(
if (error) throw new Error('Failed to load stats');
return data!;
},
enabled: !!timeFrom,
enabled: !!timeFrom && live.enabled,
placeholderData: (prev) => prev,
refetchInterval: 10_000,
refetchInterval: live.refetchInterval,
});
}
export function useSearchExecutions(filters: SearchRequest, live = false) {
const liveQuery = useLiveQuery(5_000);
return useQuery({
queryKey: ['executions', 'search', filters],
queryFn: async () => {
@@ -41,7 +44,8 @@ export function useSearchExecutions(filters: SearchRequest, live = false) {
return data!;
},
placeholderData: (prev) => prev,
refetchInterval: live ? 5_000 : false,
enabled: live ? liveQuery.enabled : true,
refetchInterval: live ? liveQuery.refetchInterval : false,
});
}
@@ -51,6 +55,7 @@ export function useStatsTimeseries(
routeId?: string,
application?: string,
) {
const live = useLiveQuery(30_000);
return useQuery({
queryKey: ['executions', 'timeseries', timeFrom, timeTo, routeId, application],
queryFn: async () => {
@@ -68,9 +73,9 @@ export function useStatsTimeseries(
if (error) throw new Error('Failed to load timeseries');
return data!;
},
enabled: !!timeFrom,
enabled: !!timeFrom && live.enabled,
placeholderData: (prev) => prev,
refetchInterval: 30_000,
refetchInterval: live.refetchInterval,
});
}

View File

@@ -0,0 +1,56 @@
import { useQuery } from '@tanstack/react-query';
import { config } from '../../config';
import { useAuthStore } from '../../auth/auth-store';
import { useRefreshInterval } from './use-refresh-interval';
import { useGlobalFilters } from '@cameleer/design-system';
export interface LogEntryResponse {
timestamp: string;
level: string;
loggerName: string | null;
message: string;
threadName: string | null;
stackTrace: string | null;
}
export function useApplicationLogs(
application?: string,
agentId?: string,
options?: { limit?: number; toOverride?: string; exchangeId?: string },
) {
const refetchInterval = useRefreshInterval(15_000);
const { timeRange } = useGlobalFilters();
const to = options?.toOverride ?? timeRange.end.toISOString();
// When filtering by exchangeId, skip the global time range — exchange logs are historical
const useTimeRange = !options?.exchangeId;
return useQuery({
queryKey: ['logs', application, agentId,
useTimeRange ? timeRange.start.toISOString() : null,
useTimeRange ? to : null,
options?.limit, options?.exchangeId],
queryFn: async () => {
const token = useAuthStore.getState().accessToken;
const params = new URLSearchParams();
params.set('application', application!);
if (agentId) params.set('agentId', agentId);
if (options?.exchangeId) params.set('exchangeId', options.exchangeId);
if (useTimeRange) {
params.set('from', timeRange.start.toISOString());
params.set('to', to);
}
if (options?.limit) params.set('limit', String(options.limit));
const res = await fetch(`${config.apiBaseUrl}/logs?${params}`, {
headers: {
Authorization: `Bearer ${token}`,
'X-Cameleer-Protocol-Version': '1',
},
});
if (!res.ok) throw new Error('Failed to load application logs');
return res.json() as Promise<LogEntryResponse[]>;
},
enabled: !!application,
placeholderData: (prev) => prev,
refetchInterval,
});
}

View File

@@ -1,8 +1,10 @@
import { useQuery } from '@tanstack/react-query';
import { config } from '../../config';
import { useAuthStore } from '../../auth/auth-store';
import { useRefreshInterval } from './use-refresh-interval';
export function useProcessorMetrics(routeId: string | null, appId?: string) {
const refetchInterval = useRefreshInterval(30_000);
return useQuery({
queryKey: ['processor-metrics', routeId, appId],
queryFn: async () => {
@@ -20,6 +22,6 @@ export function useProcessorMetrics(routeId: string | null, appId?: string) {
return res.json();
},
enabled: !!routeId,
refetchInterval: 30_000,
refetchInterval,
});
}

View File

@@ -0,0 +1,23 @@
import { useGlobalFilters } from '@cameleer/design-system';
/**
* Returns the given interval when auto-refresh is enabled, or `false` when paused.
* Use as `refetchInterval` in React Query hooks.
*/
export function useRefreshInterval(intervalMs: number): number | false {
const { autoRefresh } = useGlobalFilters();
return autoRefresh ? intervalMs : false;
}
/**
* Returns `enabled` and `refetchInterval` tied to the LIVE/PAUSED toggle.
* - LIVE: enabled=true, refetchInterval=intervalMs (fetch + poll)
* - PAUSED: enabled=false, refetchInterval=false (no fetch at all)
*/
export function useLiveQuery(intervalMs: number) {
const { autoRefresh } = useGlobalFilters();
return {
enabled: autoRefresh,
refetchInterval: (autoRefresh ? intervalMs : false) as number | false,
};
}
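The LIVE/PAUSED contract documented above reduces to a pure function of the toggle; a sketch (`liveQueryOptions` is an illustrative name, not part of the codebase):

```typescript
// LIVE: fetch once and poll at intervalMs; PAUSED: no fetch at all —
// the same pair of options useLiveQuery derives from the autoRefresh toggle.
function liveQueryOptions(autoRefresh: boolean, intervalMs: number) {
  return {
    enabled: autoRefresh,
    refetchInterval: autoRefresh ? intervalMs : (false as const),
  };
}
```

Disabling `enabled` (not just the interval) is what makes PAUSED skip even the initial fetch.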

605 ui/src/api/schema.d.ts vendored
View File

@@ -4,6 +4,30 @@
*/
export interface paths {
"/config/{application}": {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
/**
* Get application config
* @description Returns the current configuration for an application. Returns defaults if none stored.
*/
get: operations["getConfig"];
/**
* Update application config
* @description Saves config and pushes CONFIG_UPDATE to all LIVE agents of this application
*/
put: operations["updateConfig"];
post?: never;
delete?: never;
options?: never;
head?: never;
patch?: never;
trace?: never;
};
"/admin/users/{userId}": {
parameters: {
query?: never;
@@ -68,7 +92,7 @@ export interface paths {
cookie?: never;
};
/** Get OIDC configuration */
get: operations["getConfig"];
get: operations["getConfig_1"];
/** Save OIDC configuration */
put: operations["saveConfig"];
post?: never;
@@ -136,6 +160,26 @@ export interface paths {
patch?: never;
trace?: never;
};
"/data/logs": {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
get?: never;
put?: never;
/**
* Ingest application log entries
* @description Accepts a batch of log entries from an agent. Entries are indexed in OpenSearch.
*/
post: operations["ingestLogs"];
delete?: never;
options?: never;
head?: never;
patch?: never;
trace?: never;
};
"/data/executions": {
parameters: {
query?: never;
@@ -176,6 +220,23 @@ export interface paths {
patch?: never;
trace?: never;
};
"/config/{application}/test-expression": {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
get?: never;
put?: never;
/** Test a tap expression against sample data via a live agent */
post: operations["testExpression"];
delete?: never;
options?: never;
head?: never;
patch?: never;
trace?: never;
};
"/auth/refresh": {
parameters: {
query?: never;
@@ -278,7 +339,7 @@ export interface paths {
put?: never;
/**
* Send command to a specific agent
* @description Sends a config-update, deep-trace, or replay command to the specified agent
* @description Sends a command to the specified agent via SSE
*/
post: operations["sendCommand"];
delete?: never;
@@ -298,7 +359,7 @@ export interface paths {
put?: never;
/**
* Acknowledge command receipt
* @description Agent acknowledges that it has received and processed a command
* @description Agent acknowledges that it has received and processed a command, with result status and message
*/
post: operations["acknowledgeCommand"];
delete?: never;
@@ -403,6 +464,23 @@ export interface paths {
patch?: never;
trace?: never;
};
"/admin/users/{userId}/password": {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
get?: never;
put?: never;
/** Reset user password */
post: operations["resetPassword"];
delete?: never;
options?: never;
head?: never;
patch?: never;
trace?: never;
};
"/admin/users/{userId}/groups/{groupId}": {
parameters: {
query?: never;
@@ -563,6 +641,26 @@ export interface paths {
patch?: never;
trace?: never;
};
"/routes/metrics/processors": {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
/**
* Get processor metrics
* @description Returns aggregated performance metrics per processor for the given route and time window
*/
get: operations["getProcessorMetrics"];
put?: never;
post?: never;
delete?: never;
options?: never;
head?: never;
patch?: never;
trace?: never;
};
"/routes/catalog": {
parameters: {
query?: never;
@@ -583,6 +681,26 @@ export interface paths {
patch?: never;
trace?: never;
};
"/logs": {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
/**
* Search application log entries
* @description Returns log entries for a given application, optionally filtered by agent, level, time range, and text query
*/
get: operations["searchLogs"];
put?: never;
post?: never;
delete?: never;
options?: never;
head?: never;
patch?: never;
trace?: never;
};
"/executions/{executionId}": {
parameters: {
query?: never;
@@ -657,6 +775,26 @@ export interface paths {
patch?: never;
trace?: never;
};
"/config": {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
/**
* List all application configs
* @description Returns stored configurations for all applications
*/
get: operations["listConfigs"];
put?: never;
post?: never;
delete?: never;
options?: never;
head?: never;
patch?: never;
trace?: never;
};
"/auth/oidc/config": {
parameters: {
query?: never;
@@ -665,7 +803,7 @@ export interface paths {
cookie?: never;
};
/** Get OIDC config for SPA login flow */
get: operations["getConfig_1"];
get: operations["getConfig_2"];
put?: never;
post?: never;
delete?: never;
@@ -714,6 +852,22 @@ export interface paths {
patch?: never;
trace?: never;
};
"/agents/{agentId}/metrics": {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
get: operations["getMetrics_1"];
put?: never;
post?: never;
delete?: never;
options?: never;
head?: never;
patch?: never;
trace?: never;
};
"/agents/events-log": {
parameters: {
query?: never;
@@ -887,6 +1041,23 @@ export interface paths {
patch?: never;
trace?: never;
};
"/admin/database/metrics-pipeline": {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
/** Get metrics ingestion pipeline diagnostics */
get: operations["getMetricsPipeline"];
put?: never;
post?: never;
delete?: never;
options?: never;
head?: never;
patch?: never;
trace?: never;
};
"/admin/audit": {
parameters: {
query?: never;
@@ -925,6 +1096,41 @@ export interface paths {
export type webhooks = Record<string, never>;
export interface components {
schemas: {
ApplicationConfig: {
application?: string;
/** Format: int32 */
version?: number;
/** Format: date-time */
updatedAt?: string;
engineLevel?: string;
payloadCaptureMode?: string;
metricsEnabled?: boolean;
/** Format: double */
samplingRate?: number;
tracedProcessors?: {
[key: string]: string;
};
logForwardingLevel?: string;
taps?: components["schemas"]["TapDefinition"][];
/** Format: int32 */
tapVersion?: number;
routeRecording?: {
[key: string]: boolean;
};
compressSuccess?: boolean;
};
TapDefinition: {
tapId?: string;
processorId?: string;
target?: string;
expression?: string;
language?: string;
attributeName?: string;
attributeType?: string;
enabled?: boolean;
/** Format: int32 */
version?: number;
};
UpdateUserRequest: {
displayName?: string;
email?: string;
@@ -1103,6 +1309,10 @@ export interface components {
correlationId: string;
errorMessage: string;
diagramContentHash: string;
highlight: string;
attributes: {
[key: string]: string;
};
};
SearchResultExecutionSummary: {
data: components["schemas"]["ExecutionSummary"][];
@@ -1113,6 +1323,31 @@ export interface components {
/** Format: int32 */
limit: number;
};
LogBatch: {
entries?: components["schemas"]["LogEntry"][];
};
LogEntry: {
/** Format: date-time */
timestamp?: string;
level?: string;
loggerName?: string;
message?: string;
threadName?: string;
stackTrace?: string;
mdc?: {
[key: string]: string;
};
};
TestExpressionRequest: {
expression?: string;
language?: string;
body?: string;
target?: string;
};
TestExpressionResponse: {
result?: string;
error?: string;
};
RefreshRequest: {
refreshToken?: string;
};
@@ -1153,6 +1388,11 @@ export interface components {
commandId: string;
status: string;
};
CommandAckRequest: {
status?: string;
message?: string;
data?: string;
};
/** @description Agent registration payload */
AgentRegistrationRequest: {
agentId: string;
@@ -1211,6 +1451,9 @@ export interface components {
effectiveRoles?: components["schemas"]["RoleSummary"][];
effectiveGroups?: components["schemas"]["GroupSummary"][];
};
SetPasswordRequest: {
password?: string;
};
CreateRoleRequest: {
name?: string;
description?: string;
@@ -1283,6 +1526,22 @@ export interface components {
throughputPerSec: number;
sparkline: number[];
};
ProcessorMetrics: {
processorId: string;
processorType: string;
routeId: string;
appId: string;
/** Format: int64 */
totalCount: number;
/** Format: int64 */
failedCount: number;
/** Format: double */
avgDurationMs: number;
/** Format: double */
p99DurationMs: number;
/** Format: double */
errorRate: number;
};
/** @description Summary of an agent instance for sidebar display */
AgentSummary: {
id: string;
@@ -1310,10 +1569,26 @@ export interface components {
/** Format: date-time */
lastSeen: string;
};
/** @description Application log entry from OpenSearch */
LogEntryResponse: {
/** @description Log timestamp (ISO-8601) */
timestamp?: string;
/** @description Log level (INFO, WARN, ERROR, DEBUG) */
level?: string;
/** @description Logger name */
loggerName?: string;
/** @description Log message */
message?: string;
/** @description Thread name */
threadName?: string;
/** @description Stack trace (if present) */
stackTrace?: string;
};
ExecutionDetail: {
executionId: string;
routeId: string;
agentId: string;
applicationName: string;
status: string;
/** Format: date-time */
startTime: string;
@@ -1327,8 +1602,13 @@ export interface components {
errorStackTrace: string;
diagramContentHash: string;
processors: components["schemas"]["ProcessorNode"][];
applicationName?: string;
children?: components["schemas"]["ProcessorNode"][];
inputBody: string;
outputBody: string;
inputHeaders: string;
outputHeaders: string;
attributes: {
[key: string]: string;
};
};
ProcessorNode: {
processorId: string;
@@ -1343,6 +1623,9 @@ export interface components {
diagramNodeId: string;
errorMessage: string;
errorStackTrace: string;
attributes: {
[key: string]: string;
};
children: components["schemas"]["ProcessorNode"][];
};
DiagramLayout: {
@@ -1391,6 +1674,10 @@ export interface components {
registeredAt: string;
/** Format: date-time */
lastHeartbeat: string;
version: string;
capabilities: {
[key: string]: Record<string, never>;
};
/** Format: double */
tps: number;
/** Format: double */
@@ -1406,6 +1693,17 @@ export interface components {
/** Format: int64 */
timeout?: number;
};
AgentMetricsResponse: {
metrics: {
[key: string]: components["schemas"]["MetricBucket"][];
};
};
MetricBucket: {
/** Format: date-time */
time: string;
/** Format: double */
value: number;
};
/** @description Agent lifecycle event */
AgentEventResponse: {
/** Format: int64 */
@@ -1723,7 +2021,7 @@ export interface components {
username?: string;
action?: string;
/** @enum {string} */
category?: "INFRA" | "AUTH" | "USER_MGMT" | "CONFIG" | "RBAC";
category?: "INFRA" | "AUTH" | "USER_MGMT" | "CONFIG" | "RBAC" | "AGENT";
target?: string;
detail?: {
[key: string]: Record<string, never>;
@@ -1742,6 +2040,54 @@ export interface components {
}
export type $defs = Record<string, never>;
export interface operations {
getConfig: {
parameters: {
query?: never;
header?: never;
path: {
application: string;
};
cookie?: never;
};
requestBody?: never;
responses: {
/** @description Config returned */
200: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": components["schemas"]["ApplicationConfig"];
};
};
};
};
updateConfig: {
parameters: {
query?: never;
header?: never;
path: {
application: string;
};
cookie?: never;
};
requestBody: {
content: {
"application/json": components["schemas"]["ApplicationConfig"];
};
};
responses: {
/** @description Config saved and pushed */
200: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": components["schemas"]["ApplicationConfig"];
};
};
};
};
getUser: {
parameters: {
query?: never;
@@ -1971,7 +2317,7 @@ export interface operations {
};
};
};
getConfig: {
getConfig_1: {
parameters: {
query?: never;
header?: never;
@@ -2149,7 +2495,7 @@ export interface operations {
routeId?: string;
agentId?: string;
processorType?: string;
group?: string;
application?: string;
offset?: number;
limit?: number;
sortField?: string;
@@ -2216,6 +2562,13 @@ export interface operations {
};
content?: never;
};
/** @description Invalid payload */
400: {
headers: {
[name: string]: unknown;
};
content?: never;
};
/** @description Buffer full, retry later */
503: {
headers: {
@@ -2225,6 +2578,28 @@ export interface operations {
};
};
};
ingestLogs: {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
requestBody: {
content: {
"application/json": components["schemas"]["LogBatch"];
};
};
responses: {
/** @description Logs accepted for indexing */
202: {
headers: {
[name: string]: unknown;
};
content?: never;
};
};
};
ingestExecutions: {
parameters: {
query?: never;
@@ -2269,6 +2644,50 @@ export interface operations {
};
};
};
testExpression: {
parameters: {
query?: never;
header?: never;
path: {
application: string;
};
cookie?: never;
};
requestBody: {
content: {
"application/json": components["schemas"]["TestExpressionRequest"];
};
};
responses: {
/** @description Expression evaluated successfully */
200: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": components["schemas"]["TestExpressionResponse"];
};
};
/** @description No live agent available for this application */
404: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": components["schemas"]["TestExpressionResponse"];
};
};
/** @description Agent did not respond in time */
504: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": components["schemas"]["TestExpressionResponse"];
};
};
};
};
refresh: {
parameters: {
query?: never;
@@ -2511,7 +2930,11 @@ export interface operations {
};
cookie?: never;
};
requestBody?: never;
requestBody?: {
content: {
"application/json": components["schemas"]["CommandAckRequest"];
};
};
responses: {
/** @description Command acknowledged */
200: {
@@ -2732,6 +3155,30 @@ export interface operations {
};
};
};
resetPassword: {
parameters: {
query?: never;
header?: never;
path: {
userId: string;
};
cookie?: never;
};
requestBody: {
content: {
"application/json": components["schemas"]["SetPasswordRequest"];
};
};
responses: {
/** @description Password reset */
204: {
headers: {
[name: string]: unknown;
};
content?: never;
};
};
};
addUserToGroup: {
parameters: {
query?: never;
@@ -3046,9 +3493,37 @@ export interface operations {
};
};
};
getProcessorMetrics: {
parameters: {
query: {
routeId: string;
appId?: string;
from?: string;
to?: string;
};
header?: never;
path?: never;
cookie?: never;
};
requestBody?: never;
responses: {
/** @description Metrics returned */
200: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": components["schemas"]["ProcessorMetrics"][];
};
};
};
};
getCatalog: {
parameters: {
query?: never;
query?: {
from?: string;
to?: string;
};
header?: never;
path?: never;
cookie?: never;
@@ -3066,6 +3541,35 @@ export interface operations {
};
};
};
searchLogs: {
parameters: {
query: {
application: string;
agentId?: string;
level?: string;
query?: string;
exchangeId?: string;
from?: string;
to?: string;
limit?: number;
};
header?: never;
path?: never;
cookie?: never;
};
requestBody?: never;
responses: {
/** @description OK */
200: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": components["schemas"]["LogEntryResponse"][];
};
};
};
};
getDetail: {
parameters: {
query?: never;
@@ -3138,6 +3642,8 @@ export interface operations {
query: {
application: string;
routeId: string;
/** @description Layout direction: LR (left-to-right) or TB (top-to-bottom) */
direction?: "LR" | "TB";
};
header?: never;
path?: never;
@@ -3167,7 +3673,10 @@ export interface operations {
};
renderDiagram: {
parameters: {
query?: never;
query?: {
/** @description Layout direction: LR (left-to-right) or TB (top-to-bottom) */
direction?: "LR" | "TB";
};
header?: never;
path: {
contentHash: string;
@@ -3197,7 +3706,27 @@ export interface operations {
};
};
};
getConfig_1: {
listConfigs: {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
requestBody?: never;
responses: {
/** @description Configs returned */
200: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": components["schemas"]["ApplicationConfig"][];
};
};
};
};
getConfig_2: {
parameters: {
query?: never;
header?: never;
@@ -3301,6 +3830,33 @@ export interface operations {
};
};
};
getMetrics_1: {
parameters: {
query: {
names: string;
from?: string;
to?: string;
buckets?: number;
};
header?: never;
path: {
agentId: string;
};
cookie?: never;
};
requestBody?: never;
responses: {
/** @description OK */
200: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": components["schemas"]["AgentMetricsResponse"];
};
};
};
};
getEvents: {
parameters: {
query?: {
@@ -3413,6 +3969,7 @@ export interface operations {
page?: number;
size?: number;
search?: string;
prefix?: string;
};
header?: never;
path?: never;
@@ -3511,6 +4068,28 @@ export interface operations {
};
};
};
getMetricsPipeline: {
parameters: {
query?: never;
header?: never;
path?: never;
cookie?: never;
};
requestBody?: never;
responses: {
/** @description OK */
200: {
headers: {
[name: string]: unknown;
};
content: {
"*/*": {
[key: string]: Record<string, never>;
};
};
};
};
};
getAuditLog: {
parameters: {
query?: {

View File

@@ -0,0 +1,85 @@
.page {
display: flex;
align-items: center;
justify-content: center;
min-height: 100vh;
background: var(--bg-base);
}
.card {
width: 100%;
max-width: 400px;
padding: 32px;
}
.loginForm {
display: flex;
flex-direction: column;
align-items: center;
font-family: var(--font-body);
width: 100%;
}
.logo {
margin-bottom: 8px;
font-size: 24px;
font-weight: 700;
color: var(--text-primary);
}
.subtitle {
font-size: 13px;
color: var(--text-muted);
margin: 0 0 24px;
}
.error {
width: 100%;
margin-bottom: 16px;
}
.socialSection {
display: flex;
flex-direction: column;
gap: 8px;
width: 100%;
margin-bottom: 20px;
}
.divider {
display: flex;
align-items: center;
gap: 12px;
width: 100%;
margin-bottom: 20px;
}
.dividerLine {
flex: 1;
height: 1px;
background: var(--border);
}
.dividerText {
color: var(--text-muted);
font-size: 11px;
text-transform: uppercase;
letter-spacing: 0.5px;
font-weight: 500;
}
.fields {
display: flex;
flex-direction: column;
gap: 14px;
width: 100%;
}
.submitButton {
width: 100%;
}
.ssoButton {
width: 100%;
justify-content: center;
}

View File

@@ -3,6 +3,7 @@ import { Navigate } from 'react-router';
import { useAuthStore } from './auth-store';
import { api } from '../api/client';
import { Card, Input, Button, Alert, FormField } from '@cameleer/design-system';
import styles from './LoginPage.module.css';
interface OidcInfo {
clientId: string;
@@ -50,53 +51,75 @@ export function LoginPage() {
};
return (
<div style={{ display: 'flex', alignItems: 'center', justifyContent: 'center', minHeight: '100vh', background: 'var(--surface-ground)' }}>
<Card>
<form onSubmit={handleSubmit} style={{ padding: '2rem', minWidth: 360 }}>
<div style={{ textAlign: 'center', marginBottom: '1.5rem' }}>
<h1 style={{ fontSize: '1.5rem', fontWeight: 600 }}>cameleer3</h1>
<p style={{ color: 'var(--text-secondary)', marginTop: '0.25rem', fontSize: '0.875rem' }}>
Sign in to access the observability dashboard
</p>
</div>
<div className={styles.page}>
<Card className={styles.card}>
<div className={styles.loginForm}>
<div className={styles.logo}>cameleer3</div>
<p className={styles.subtitle}>Sign in to access the observability dashboard</p>
{error && (
<div className={styles.error}>
<Alert variant="error">{error}</Alert>
</div>
)}
{oidc && (
<>
<Button variant="secondary" onClick={handleOidcLogin} disabled={oidcLoading} style={{ width: '100%', marginBottom: '1rem' }}>
{oidcLoading ? 'Redirecting...' : 'Sign in with SSO'}
</Button>
<div style={{ display: 'flex', alignItems: 'center', gap: '0.75rem', margin: '1rem 0' }}>
<hr style={{ flex: 1, border: 'none', borderTop: '1px solid var(--border)' }} />
<span style={{ color: 'var(--text-tertiary)', fontSize: '0.75rem' }}>or</span>
<hr style={{ flex: 1, border: 'none', borderTop: '1px solid var(--border)' }} />
<div className={styles.socialSection}>
<Button
variant="secondary"
className={styles.ssoButton}
onClick={handleOidcLogin}
disabled={oidcLoading}
type="button"
>
{oidcLoading ? 'Redirecting...' : 'Sign in with SSO'}
</Button>
</div>
<div className={styles.divider}>
<div className={styles.dividerLine} />
<span className={styles.dividerText}>or</span>
<div className={styles.dividerLine} />
</div>
</>
)}
<FormField label="Username">
<Input
value={username}
onChange={(e) => setUsername(e.target.value)}
autoFocus
autoComplete="username"
/>
</FormField>
<form className={styles.fields} onSubmit={handleSubmit} aria-label="Sign in" noValidate>
<FormField label="Username" htmlFor="login-username">
<Input
id="login-username"
value={username}
onChange={(e) => setUsername(e.target.value)}
placeholder="Enter your username"
autoFocus
autoComplete="username"
disabled={loading}
/>
</FormField>
<FormField label="Password">
<Input
type="password"
value={password}
onChange={(e) => setPassword(e.target.value)}
autoComplete="current-password"
/>
</FormField>
<FormField label="Password" htmlFor="login-password">
<Input
id="login-password"
type="password"
value={password}
onChange={(e) => setPassword(e.target.value)}
placeholder="••••••••"
autoComplete="current-password"
disabled={loading}
/>
</FormField>
<Button variant="primary" disabled={loading || !username || !password} style={{ width: '100%', marginTop: '0.5rem' }}>
{loading ? 'Signing in...' : 'Sign In'}
</Button>
{error && <div style={{ marginTop: '1rem' }}><Alert variant="error">{error}</Alert></div>}
</form>
<Button
variant="primary"
type="submit"
loading={loading}
disabled={loading || !username || !password}
className={styles.submitButton}
>
Sign in
</Button>
</form>
</div>
</Card>
</div>
);

View File

@@ -1,17 +1,108 @@
import { Outlet, useNavigate, useLocation } from 'react-router';
import { AppShell, Sidebar, TopBar, CommandPalette, CommandPaletteProvider, GlobalFilterProvider, ToastProvider, useCommandPalette } from '@cameleer/design-system';
import { AppShell, Sidebar, TopBar, CommandPalette, CommandPaletteProvider, GlobalFilterProvider, ToastProvider, BreadcrumbProvider, useCommandPalette, useGlobalFilters } from '@cameleer/design-system';
import type { SidebarApp, SearchResult } from '@cameleer/design-system';
import { useRouteCatalog } from '../api/queries/catalog';
import { useAgents } from '../api/queries/agents';
import { useSearchExecutions } from '../api/queries/executions';
import { useAuthStore } from '../auth/auth-store';
import { useMemo, useCallback } from 'react';
import type { SidebarApp } from '@cameleer/design-system';
import { useState, useMemo, useCallback, useEffect } from 'react';
function healthToColor(health: string): string {
switch (health) {
case 'live': return 'success';
case 'stale': return 'warning';
case 'dead': return 'error';
default: return 'auto';
}
}
function buildSearchData(
catalog: any[] | undefined,
agents: any[] | undefined,
): SearchResult[] {
if (!catalog) return [];
const results: SearchResult[] = [];
for (const app of catalog) {
const liveAgents = (app.agents || []).filter((a: any) => a.status === 'live').length;
results.push({
id: app.appId,
category: 'application',
title: app.appId,
badges: [{ label: (app.health || 'unknown').toUpperCase(), color: healthToColor(app.health) }],
meta: `${(app.routes || []).length} routes · ${(app.agents || []).length} agents (${liveAgents} live) · ${(app.exchangeCount ?? 0).toLocaleString()} exchanges`,
path: `/apps/${app.appId}`,
});
for (const route of (app.routes || [])) {
results.push({
id: route.routeId,
category: 'route',
title: route.routeId,
badges: [{ label: app.appId }],
meta: `${(route.exchangeCount ?? 0).toLocaleString()} exchanges`,
path: `/apps/${app.appId}/${route.routeId}`,
});
}
}
if (agents) {
for (const agent of agents) {
results.push({
id: agent.id,
category: 'agent',
title: agent.name,
badges: [{ label: (agent.state || 'unknown').toUpperCase(), color: healthToColor((agent.state || '').toLowerCase()) }],
meta: `${agent.application} · ${agent.version || ''}${agent.agentTps != null ? ` · ${agent.agentTps.toFixed(1)} msg/s` : ''}`,
path: `/agents/${agent.application}/${agent.id}`,
});
}
}
return results;
}
function formatDuration(ms: number): string {
if (ms >= 60_000) return `${(ms / 1000).toFixed(0)}s`;
if (ms >= 1000) return `${(ms / 1000).toFixed(2)}s`;
return `${ms}ms`;
}
function statusToColor(status: string): string {
switch (status) {
case 'COMPLETED': return 'success';
case 'FAILED': return 'error';
case 'RUNNING': return 'running';
default: return 'warning';
}
}
function useDebouncedValue<T>(value: T, delayMs: number): T {
const [debounced, setDebounced] = useState(value);
useEffect(() => {
const timer = setTimeout(() => setDebounced(value), delayMs);
return () => clearTimeout(timer);
}, [value, delayMs]);
return debounced;
}
function LayoutContent() {
const navigate = useNavigate();
const location = useLocation();
const { data: catalog } = useRouteCatalog();
const { username, roles, logout } = useAuthStore();
const { timeRange } = useGlobalFilters();
const { data: catalog } = useRouteCatalog(timeRange.start.toISOString(), timeRange.end.toISOString());
const { data: agents } = useAgents();
const { username, logout } = useAuthStore();
const { open: paletteOpen, setOpen: setPaletteOpen } = useCommandPalette();
// Exchange full-text search via command palette
const [paletteQuery, setPaletteQuery] = useState('');
const debouncedQuery = useDebouncedValue(paletteQuery, 300);
const { data: exchangeResults } = useSearchExecutions(
{ text: debouncedQuery || undefined, offset: 0, limit: 10 },
false,
);
const sidebarApps: SidebarApp[] = useMemo(() => {
if (!catalog) return [];
return catalog.map((app: any) => ({
@@ -33,11 +124,66 @@ function LayoutContent() {
}));
}, [catalog]);
const catalogData = useMemo(
() => buildSearchData(catalog, agents as any[]),
[catalog, agents],
);
const searchData: SearchResult[] = useMemo(() => {
const exchangeItems: SearchResult[] = (exchangeResults?.data || []).map((e: any) => ({
id: e.executionId,
category: 'exchange' as const,
title: e.executionId,
badges: [{ label: e.status, color: statusToColor(e.status) }],
meta: `${e.routeId} · ${e.applicationName ?? ''} · ${formatDuration(e.durationMs)}`,
path: `/exchanges/${e.executionId}`,
serverFiltered: true,
matchContext: e.highlight ?? undefined,
}));
const attributeItems: SearchResult[] = [];
if (debouncedQuery) {
const q = debouncedQuery.toLowerCase();
for (const e of exchangeResults?.data || []) {
if (!e.attributes) continue;
for (const [key, value] of Object.entries(e.attributes as Record<string, string>)) {
if (key.toLowerCase().includes(q) || String(value).toLowerCase().includes(q)) {
attributeItems.push({
id: `${e.executionId}-attr-${key}`,
category: 'attribute' as const,
title: `${key} = "${value}"`,
badges: [{ label: e.status, color: statusToColor(e.status) }],
meta: `${e.executionId} · ${e.routeId} · ${e.applicationName ?? ''}`,
path: `/exchanges/${e.executionId}`,
serverFiltered: true,
});
}
}
}
}
return [...catalogData, ...exchangeItems, ...attributeItems];
}, [catalogData, exchangeResults, debouncedQuery]);
const breadcrumb = useMemo(() => {
const LABELS: Record<string, string> = {
apps: 'Applications',
agents: 'Agents',
exchanges: 'Exchanges',
routes: 'Routes',
admin: 'Admin',
'api-docs': 'API Docs',
rbac: 'Users & Roles',
audit: 'Audit Log',
oidc: 'OIDC',
database: 'Database',
opensearch: 'OpenSearch',
appconfig: 'App Config',
};
const parts = location.pathname.split('/').filter(Boolean);
return parts.map((part, i) => ({
label: part,
href: '/' + parts.slice(0, i + 1).join('/'),
label: LABELS[part] ?? part,
...(i < parts.length - 1 ? { href: '/' + parts.slice(0, i + 1).join('/') } : {}),
}));
}, [location.pathname]);
@@ -47,12 +193,12 @@ function LayoutContent() {
}, [logout, navigate]);
const handlePaletteSelect = useCallback((result: any) => {
if (result.path) navigate(result.path);
if (result.path) {
navigate(result.path, { state: { sidebarReveal: result.path } });
}
setPaletteOpen(false);
}, [navigate, setPaletteOpen]);
const isAdmin = roles.includes('ADMIN');
return (
<AppShell
sidebar={
@@ -69,8 +215,10 @@ function LayoutContent() {
<CommandPalette
open={paletteOpen}
onClose={() => setPaletteOpen(false)}
onOpen={() => setPaletteOpen(true)}
onSelect={handlePaletteSelect}
data={[]}
onQueryChange={setPaletteQuery}
data={searchData}
/>
<main style={{ flex: 1, overflow: 'auto', padding: '1.5rem' }}>
<Outlet />
@@ -84,7 +232,9 @@ export function LayoutShell() {
<ToastProvider>
<CommandPaletteProvider>
<GlobalFilterProvider>
<LayoutContent />
<BreadcrumbProvider>
<LayoutContent />
</BreadcrumbProvider>
</GlobalFilterProvider>
</CommandPaletteProvider>
</ToastProvider>
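The breadcrumb mapping in `LayoutContent` above can be isolated as a pure function: path segments become labelled crumbs, and only intermediate crumbs carry an `href`. A minimal standalone sketch (the `LABELS` table is abbreviated and the crumb shape is assumed from the diff):

```typescript
// Standalone sketch of the breadcrumb logic in LayoutContent.
const LABELS: Record<string, string> = {
  apps: 'Applications',
  agents: 'Agents',
  rbac: 'Users & Roles',
};

function buildBreadcrumb(pathname: string): { label: string; href?: string }[] {
  const parts = pathname.split('/').filter(Boolean);
  return parts.map((part, i) => ({
    label: LABELS[part] ?? part,
    // only intermediate crumbs link back; the last crumb is the current page
    ...(i < parts.length - 1 ? { href: '/' + parts.slice(0, i + 1).join('/') } : {}),
  }));
}
```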

View File

@@ -0,0 +1,136 @@
import type { DiagramNode as DiagramNodeType, DiagramEdge as DiagramEdgeType } from '../../api/queries/diagrams';
import type { NodeConfig } from './types';
import { colorForType, isCompoundType } from './node-colors';
import { DiagramNode } from './DiagramNode';
import { DiagramEdge } from './DiagramEdge';
const HEADER_HEIGHT = 22;
const CORNER_RADIUS = 4;
interface CompoundNodeProps {
node: DiagramNodeType;
/** All edges for this section — compound filters to its own internal edges */
edges: DiagramEdgeType[];
/** Absolute offset of the nearest compound ancestor (for coordinate adjustment) */
parentX?: number;
parentY?: number;
selectedNodeId?: string;
hoveredNodeId: string | null;
nodeConfigs?: Map<string, NodeConfig>;
onNodeClick: (nodeId: string) => void;
onNodeEnter: (nodeId: string) => void;
onNodeLeave: () => void;
}
export function CompoundNode({
node, edges, parentX = 0, parentY = 0,
selectedNodeId, hoveredNodeId, nodeConfigs,
onNodeClick, onNodeEnter, onNodeLeave,
}: CompoundNodeProps) {
const x = (node.x ?? 0) - parentX;
const y = (node.y ?? 0) - parentY;
const absX = node.x ?? 0;
const absY = node.y ?? 0;
const w = node.width ?? 200;
const h = node.height ?? 100;
const color = colorForType(node.type);
const typeName = node.type?.replace(/^EIP_/, '').replace(/_/g, ' ') ?? '';
const label = node.label ? `${typeName}: ${node.label}` : typeName;
// Collect all descendant node IDs to filter edges that belong inside this compound
const descendantIds = new Set<string>();
collectIds(node.children ?? [], descendantIds);
const internalEdges = edges.filter(
e => descendantIds.has(e.sourceId) && descendantIds.has(e.targetId),
);
return (
<g data-node-id={node.id} transform={`translate(${x}, ${y})`}>
{/* Container body */}
<rect
x={0}
y={0}
width={w}
height={h}
rx={CORNER_RADIUS}
fill="white"
stroke={color}
strokeWidth={1.5}
/>
{/* Colored header bar */}
<rect x={0} y={0} width={w} height={HEADER_HEIGHT} rx={CORNER_RADIUS} fill={color} />
<rect x={CORNER_RADIUS} y={CORNER_RADIUS} width={w - CORNER_RADIUS * 2} height={HEADER_HEIGHT - CORNER_RADIUS} fill={color} />
{/* Header label */}
<text
x={w / 2}
y={HEADER_HEIGHT / 2 + 4}
fill="white"
fontSize={10}
fontWeight={600}
textAnchor="middle"
>
{label}
</text>
{/* Internal edges (rendered after background, before children) */}
<g className="edges">
{internalEdges.map((edge, i) => (
<DiagramEdge
key={`${edge.sourceId}-${edge.targetId}-${i}`}
edge={{
...edge,
points: edge.points.map(p => [p[0] - absX, p[1] - absY]),
}}
/>
))}
</g>
{/* Children — recurse into compound children, render leaves as DiagramNode */}
{node.children?.map(child => {
if (isCompoundType(child.type) && child.children && child.children.length > 0) {
return (
<CompoundNode
key={child.id}
node={child}
edges={edges}
parentX={absX}
parentY={absY}
selectedNodeId={selectedNodeId}
hoveredNodeId={hoveredNodeId}
nodeConfigs={nodeConfigs}
onNodeClick={onNodeClick}
onNodeEnter={onNodeEnter}
onNodeLeave={onNodeLeave}
/>
);
}
return (
<DiagramNode
key={child.id}
node={{
...child,
x: (child.x ?? 0) - absX,
y: (child.y ?? 0) - absY,
}}
isHovered={hoveredNodeId === child.id}
isSelected={selectedNodeId === child.id}
config={child.id ? nodeConfigs?.get(child.id) : undefined}
onClick={() => child.id && onNodeClick(child.id)}
onMouseEnter={() => child.id && onNodeEnter(child.id)}
onMouseLeave={onNodeLeave}
/>
);
})}
</g>
);
}
function collectIds(nodes: DiagramNodeType[], set: Set<string>) {
for (const n of nodes) {
if (n.id) set.add(n.id);
if (n.children) collectIds(n.children, set);
}
}
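The edge-distribution rule in `CompoundNode` — an edge is rendered inside a compound only when both endpoints are descendants — can be sketched as a pure function (node and edge shapes are assumptions based on the component above):

```typescript
// Sketch of CompoundNode's internal-edge filtering.
interface DNode { id?: string; children?: DNode[] }
interface DEdge { sourceId: string; targetId: string }

function collectIds(nodes: DNode[], set: Set<string>): void {
  for (const n of nodes) {
    if (n.id) set.add(n.id);
    if (n.children) collectIds(n.children, set);
  }
}

function internalEdges(compound: DNode, edges: DEdge[]): DEdge[] {
  const ids = new Set<string>();
  collectIds(compound.children ?? [], ids);
  // keep only edges whose source AND target live inside this compound
  return edges.filter(e => ids.has(e.sourceId) && ids.has(e.targetId));
}
```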

View File

@@ -0,0 +1,48 @@
import type { NodeConfig } from './types';
const BADGE_HEIGHT = 14;
const BADGE_RADIUS = 7;
const BADGE_FONT_SIZE = 8;
const BADGE_GAP = 4;
interface ConfigBadgeProps {
nodeWidth: number;
config: NodeConfig;
}
export function ConfigBadge({ nodeWidth, config }: ConfigBadgeProps) {
const badges: { label: string; color: string }[] = [];
if (config.tapExpression) badges.push({ label: 'TAP', color: '#7C3AED' });
if (config.traceEnabled) badges.push({ label: 'TRACE', color: '#1A7F8E' });
if (badges.length === 0) return null;
let xOffset = nodeWidth;
return (
<g className="config-badges">
{badges.map((badge, i) => {
const textWidth = badge.label.length * 5.5 + 8;
xOffset -= textWidth + (i > 0 ? BADGE_GAP : 0);
return (
<g key={badge.label} transform={`translate(${xOffset}, ${-BADGE_HEIGHT / 2 - 2})`}>
<rect
width={textWidth}
height={BADGE_HEIGHT}
rx={BADGE_RADIUS}
fill={badge.color}
/>
<text
x={textWidth / 2}
y={BADGE_HEIGHT / 2 + 3}
fill="white"
fontSize={BADGE_FONT_SIZE}
fontWeight={600}
textAnchor="middle"
>
{badge.label}
</text>
</g>
);
})}
</g>
);
}
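`ConfigBadge` lays badges out right-to-left from the node's right edge, estimating each badge's width from a fixed per-character heuristic (5.5px per character plus 8px padding — an approximation, not a text measurement). A sketch of just that layout math:

```typescript
// Sketch of ConfigBadge's right-to-left badge placement.
// The 5.5px/char width is the component's heuristic, not a measured value.
function layoutBadges(
  labels: string[],
  nodeWidth: number,
  gap = 4,
): { label: string; x: number }[] {
  let xOffset = nodeWidth;
  return labels.map((label, i) => {
    const w = label.length * 5.5 + 8;
    // first badge sits flush with the right edge; later badges add the gap
    xOffset -= w + (i > 0 ? gap : 0);
    return { label, x: xOffset };
  });
}
```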

View File

@@ -0,0 +1,49 @@
import type { DiagramEdge as DiagramEdgeType } from '../../api/queries/diagrams';
interface DiagramEdgeProps {
edge: DiagramEdgeType;
offsetY?: number;
}
export function DiagramEdge({ edge, offsetY = 0 }: DiagramEdgeProps) {
const pts = edge.points;
if (!pts || pts.length < 2) return null;
// Build SVG path: move to first point, then cubic bezier or line to rest
let d = `M ${pts[0][0]} ${pts[0][1] + offsetY}`;
if (pts.length === 2) {
d += ` L ${pts[1][0]} ${pts[1][1] + offsetY}`;
} else if (pts.length === 4) {
// 4 points: start, control1, control2, end → cubic bezier
d += ` C ${pts[1][0]} ${pts[1][1] + offsetY}, ${pts[2][0]} ${pts[2][1] + offsetY}, ${pts[3][0]} ${pts[3][1] + offsetY}`;
} else {
// Multiple points: connect with line segments through intermediate points
for (let i = 1; i < pts.length; i++) {
d += ` L ${pts[i][0]} ${pts[i][1] + offsetY}`;
}
}
return (
<g className="diagram-edge">
<path
d={d}
fill="none"
stroke="#9CA3AF"
strokeWidth={1.5}
markerEnd="url(#arrowhead)"
/>
{edge.label && pts.length >= 2 && (
<text
x={(pts[0][0] + pts[pts.length - 1][0]) / 2}
y={(pts[0][1] + pts[pts.length - 1][1]) / 2 + offsetY - 6}
fill="#9C9184"
fontSize={9}
textAnchor="middle"
>
{edge.label}
</text>
)}
</g>
);
}
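The path construction in `DiagramEdge` above is easy to verify in isolation: two points become a line, exactly four points are treated as cubic bezier control points, and anything else becomes a polyline. A standalone sketch of that logic (the `[x, y]` tuple format is assumed from the component):

```typescript
// Standalone sketch of DiagramEdge's SVG path building.
type Point = [number, number];

function buildEdgePath(pts: Point[], offsetY = 0): string | null {
  if (!pts || pts.length < 2) return null;
  let d = `M ${pts[0][0]} ${pts[0][1] + offsetY}`;
  if (pts.length === 2) {
    d += ` L ${pts[1][0]} ${pts[1][1] + offsetY}`;
  } else if (pts.length === 4) {
    // start, control1, control2, end -> single cubic bezier
    d += ` C ${pts[1][0]} ${pts[1][1] + offsetY}, ${pts[2][0]} ${pts[2][1] + offsetY}, ${pts[3][0]} ${pts[3][1] + offsetY}`;
  } else {
    // fall back to straight segments through all intermediate points
    for (let i = 1; i < pts.length; i++) {
      d += ` L ${pts[i][0]} ${pts[i][1] + offsetY}`;
    }
  }
  return d;
}
```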

View File

@@ -0,0 +1,93 @@
import type { DiagramNode as DiagramNodeType } from '../../api/queries/diagrams';
import type { NodeConfig } from './types';
import { colorForType, iconForType } from './node-colors';
import { ConfigBadge } from './ConfigBadge';
const TOP_BAR_HEIGHT = 6;
const CORNER_RADIUS = 4;
interface DiagramNodeProps {
node: DiagramNodeType;
isHovered: boolean;
isSelected: boolean;
config?: NodeConfig;
onClick: () => void;
onMouseEnter: () => void;
onMouseLeave: () => void;
}
export function DiagramNode({
node, isHovered, isSelected, config, onClick, onMouseEnter, onMouseLeave,
}: DiagramNodeProps) {
const x = node.x ?? 0;
const y = node.y ?? 0;
const w = node.width ?? 120;
const h = node.height ?? 40;
const color = colorForType(node.type);
const icon = iconForType(node.type);
// Extract label parts: type name and detail
const typeName = node.type?.replace(/^EIP_/, '').replace(/_/g, ' ') ?? '';
const detail = node.label || '';
return (
<g
data-node-id={node.id}
transform={`translate(${x}, ${y})`}
onClick={(e) => { e.stopPropagation(); onClick(); }}
onMouseEnter={onMouseEnter}
onMouseLeave={onMouseLeave}
style={{ cursor: 'pointer' }}
>
{/* Selection ring */}
{isSelected && (
<rect
x={-2}
y={-2}
width={w + 4}
height={h + 4}
rx={CORNER_RADIUS + 2}
fill="none"
stroke="#C6820E"
strokeWidth={2.5}
/>
)}
{/* Card background */}
<rect
x={0}
y={0}
width={w}
height={h}
rx={CORNER_RADIUS}
fill={isHovered ? '#F5F0EA' : 'white'}
stroke={isHovered || isSelected ? color : '#E4DFD8'}
strokeWidth={isHovered || isSelected ? 1.5 : 1}
/>
{/* Colored top bar */}
<rect x={0} y={0} width={w} height={TOP_BAR_HEIGHT} rx={CORNER_RADIUS} fill={color} />
<rect x={CORNER_RADIUS} y={0} width={w - CORNER_RADIUS * 2} height={TOP_BAR_HEIGHT} fill={color} />
{/* Icon */}
<text x={14} y={h / 2 + 6} fill={color} fontSize={14}>
{icon}
</text>
{/* Type name */}
<text x={32} y={h / 2 + 1} fill="#1A1612" fontSize={11} fontWeight={600}>
{typeName}
</text>
{/* Detail label (truncated) */}
{detail && detail !== typeName && (
<text x={32} y={h / 2 + 14} fill="#5C5347" fontSize={10}>
{detail.length > 22 ? detail.slice(0, 20) + '...' : detail}
</text>
)}
{/* Config badges */}
{config && <ConfigBadge nodeWidth={w} config={config} />}
</g>
);
}

View File

@@ -0,0 +1,118 @@
import { useMemo } from 'react';
import type { DiagramSection } from './types';
import type { NodeConfig } from './types';
import type { DiagramNode as DiagramNodeType } from '../../api/queries/diagrams';
import { DiagramEdge } from './DiagramEdge';
import { DiagramNode } from './DiagramNode';
import { CompoundNode } from './CompoundNode';
import { isCompoundType } from './node-colors';
const CONTENT_PADDING_Y = 20;
const CONTENT_PADDING_LEFT = 12;
interface ErrorSectionProps {
section: DiagramSection;
totalWidth: number;
selectedNodeId?: string;
hoveredNodeId: string | null;
nodeConfigs?: Map<string, NodeConfig>;
onNodeClick: (nodeId: string) => void;
onNodeEnter: (nodeId: string) => void;
onNodeLeave: () => void;
}
export function ErrorSection({
section, totalWidth, selectedNodeId, hoveredNodeId, nodeConfigs,
onNodeClick, onNodeEnter, onNodeLeave,
}: ErrorSectionProps) {
const boxHeight = useMemo(() => {
let maxY = 0;
for (const n of section.nodes) {
const bottom = (n.y ?? 0) + (n.height ?? 40);
if (bottom > maxY) maxY = bottom;
if (n.children) {
for (const c of n.children) {
const cb = (c.y ?? 0) + (c.height ?? 40);
if (cb > maxY) maxY = cb;
}
}
}
// Content height + top padding + bottom padding (to vertically center)
return maxY + CONTENT_PADDING_Y * 2;
}, [section.nodes]);
return (
<g transform={`translate(0, ${section.offsetY})`}>
{/* Section label */}
<text x={8} y={-6} fill="#C0392B" fontSize={11} fontWeight={600}>
{section.label}
</text>
{/* Divider line */}
<line
x1={0}
y1={0}
x2={totalWidth}
y2={0}
stroke="#C0392B"
strokeWidth={1}
strokeDasharray="6 3"
opacity={0.5}
/>
{/* Subtle red tint background — sized to actual content */}
<rect
x={0}
y={4}
width={totalWidth}
height={boxHeight}
fill="#C0392B"
opacity={0.03}
rx={4}
/>
{/* Content group with margin from top-left */}
<g transform={`translate(${CONTENT_PADDING_LEFT}, ${CONTENT_PADDING_Y})`}>
{/* Edges */}
<g className="edges">
{section.edges.map((edge, i) => (
<DiagramEdge key={`${edge.sourceId}-${edge.targetId}-${i}`} edge={edge} />
))}
</g>
{/* Nodes */}
<g className="nodes">
{section.nodes.map(node => {
if (isCompoundType(node.type) && node.children && node.children.length > 0) {
return (
<CompoundNode
key={node.id}
node={node}
edges={section.edges}
selectedNodeId={selectedNodeId}
hoveredNodeId={hoveredNodeId}
nodeConfigs={nodeConfigs}
onNodeClick={onNodeClick}
onNodeEnter={onNodeEnter}
onNodeLeave={onNodeLeave}
/>
);
}
return (
<DiagramNode
key={node.id}
node={node}
isHovered={hoveredNodeId === node.id}
isSelected={selectedNodeId === node.id}
config={node.id ? nodeConfigs?.get(node.id) : undefined}
onClick={() => node.id && onNodeClick(node.id)}
onMouseEnter={() => node.id && onNodeEnter(node.id)}
onMouseLeave={onNodeLeave}
/>
);
})}
</g>
</g>
</g>
);
}
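The `boxHeight` memo in `ErrorSection` scans node (and first-level child) bottoms and adds symmetric padding so the red tint vertically centers its content. The same computation as a testable function (node shape assumed from the component; depth is limited to one level of children, matching the memo above):

```typescript
// Sketch of ErrorSection's tint-box height computation.
interface Box { y?: number; height?: number; children?: Box[] }
const CONTENT_PADDING_Y = 20;

function sectionBoxHeight(nodes: Box[]): number {
  let maxY = 0;
  for (const n of nodes) {
    const bottom = (n.y ?? 0) + (n.height ?? 40);
    if (bottom > maxY) maxY = bottom;
    for (const c of n.children ?? []) {
      const cb = (c.y ?? 0) + (c.height ?? 40);
      if (cb > maxY) maxY = cb;
    }
  }
  // symmetric top and bottom padding keeps content vertically centered
  return maxY + CONTENT_PADDING_Y * 2;
}
```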

View File

@@ -0,0 +1,86 @@
import { useCallback, useRef, useState } from 'react';
import type { NodeAction } from './types';
import styles from './ProcessDiagram.module.css';
const HIDE_DELAY = 150;
interface NodeToolbarProps {
nodeId: string;
/** Screen-space position (already transformed by zoom/pan) */
screenX: number;
screenY: number;
onAction: (nodeId: string, action: NodeAction) => void;
onMouseEnter: () => void;
onMouseLeave: () => void;
}
const ACTIONS: { icon: string; action: NodeAction; title: string }[] = [
{ icon: '\uD83D\uDD0D', action: 'inspect', title: 'Inspect' },
{ icon: 'T', action: 'toggle-trace', title: 'Toggle tracing' },
{ icon: '\u270E', action: 'configure-tap', title: 'Configure tap' },
{ icon: '\u22EF', action: 'copy-id', title: 'Copy ID' },
];
export function NodeToolbar({
nodeId, screenX, screenY, onAction, onMouseEnter, onMouseLeave,
}: NodeToolbarProps) {
return (
<div
className={styles.nodeToolbar}
style={{ left: screenX, top: screenY }}
onMouseEnter={onMouseEnter}
onMouseLeave={onMouseLeave}
>
{ACTIONS.map(a => (
<button
key={a.action}
className={styles.nodeToolbarBtn}
title={a.title}
onClick={(e) => {
e.stopPropagation();
onAction(nodeId, a.action);
}}
>
{a.icon}
</button>
))}
</div>
);
}
/** Hook to manage toolbar visibility with hide delay */
export function useToolbarHover() {
const [hoveredNodeId, setHoveredNodeId] = useState<string | null>(null);
const hideTimer = useRef<ReturnType<typeof setTimeout> | null>(null);
const onNodeEnter = useCallback((nodeId: string) => {
if (hideTimer.current) {
clearTimeout(hideTimer.current);
hideTimer.current = null;
}
setHoveredNodeId(nodeId);
}, []);
const onNodeLeave = useCallback(() => {
hideTimer.current = setTimeout(() => {
setHoveredNodeId(null);
hideTimer.current = null;
}, HIDE_DELAY);
}, []);
const onToolbarEnter = useCallback(() => {
if (hideTimer.current) {
clearTimeout(hideTimer.current);
hideTimer.current = null;
}
}, []);
const onToolbarLeave = useCallback(() => {
hideTimer.current = setTimeout(() => {
setHoveredNodeId(null);
hideTimer.current = null;
}, HIDE_DELAY);
}, []);
return { hoveredNodeId, onNodeEnter, onNodeLeave, onToolbarEnter, onToolbarLeave };
}

View File

@@ -0,0 +1,115 @@
.container {
position: relative;
width: 100%;
height: 100%;
min-height: 300px;
overflow: hidden;
background: var(--bg-surface, #FFFFFF);
border: 1px solid var(--border, #E4DFD8);
border-radius: var(--radius-md, 8px);
}
.svg {
width: 100%;
height: 100%;
display: block;
outline: none;
}
.loading {
display: flex;
align-items: center;
justify-content: center;
width: 100%;
height: 100%;
min-height: 300px;
color: var(--text-muted, #9C9184);
font-size: 14px;
}
.error {
display: flex;
align-items: center;
justify-content: center;
width: 100%;
height: 100%;
min-height: 300px;
color: var(--error, #C0392B);
font-size: 14px;
}
.zoomControls {
position: absolute;
bottom: 12px;
right: 12px;
display: flex;
align-items: center;
gap: 4px;
background: var(--bg-surface, #FFFFFF);
border: 1px solid var(--border, #E4DFD8);
border-radius: var(--radius-sm, 5px);
padding: 4px;
box-shadow: var(--shadow-md, 0 2px 8px rgba(44, 37, 32, 0.08));
}
.zoomBtn {
display: flex;
align-items: center;
justify-content: center;
width: 28px;
height: 28px;
border: none;
background: transparent;
color: var(--text-primary, #1A1612);
font-size: 16px;
cursor: pointer;
border-radius: var(--radius-sm, 5px);
}
.zoomBtn:hover {
background: var(--bg-hover, #F5F0EA);
}
.zoomLevel {
font-size: 11px;
color: var(--text-muted, #9C9184);
min-width: 36px;
text-align: center;
font-variant-numeric: tabular-nums;
}
.nodeToolbar {
position: absolute;
display: flex;
align-items: center;
gap: 2px;
padding: 3px 4px;
background: var(--bg-surface, #FFFFFF);
border: 1px solid var(--border, #E4DFD8);
border-radius: var(--radius-sm, 5px);
box-shadow: var(--shadow-lg, 0 4px 16px rgba(44, 37, 32, 0.10));
transform: translate(-50%, -100%);
margin-top: -6px;
z-index: 10;
pointer-events: auto;
}
.nodeToolbarBtn {
display: flex;
align-items: center;
justify-content: center;
width: 26px;
height: 26px;
border: none;
background: transparent;
color: var(--text-secondary, #5C5347);
font-size: 12px;
cursor: pointer;
border-radius: var(--radius-sm, 5px);
padding: 0;
}
.nodeToolbarBtn:hover {
background: var(--bg-hover, #F5F0EA);
color: var(--text-primary, #1A1612);
}

View File

@@ -0,0 +1,266 @@
import { useCallback, useEffect } from 'react';
import type { ProcessDiagramProps } from './types';
import { useDiagramData } from './useDiagramData';
import { useZoomPan } from './useZoomPan';
import { useToolbarHover, NodeToolbar } from './NodeToolbar';
import { DiagramNode } from './DiagramNode';
import { DiagramEdge } from './DiagramEdge';
import { CompoundNode } from './CompoundNode';
import { ErrorSection } from './ErrorSection';
import { ZoomControls } from './ZoomControls';
import { isCompoundType } from './node-colors';
import styles from './ProcessDiagram.module.css';
const PADDING = 40;
export function ProcessDiagram({
application,
routeId,
direction = 'LR',
selectedNodeId,
onNodeSelect,
onNodeAction,
nodeConfigs,
className,
}: ProcessDiagramProps) {
const { sections, totalWidth, totalHeight, isLoading, error } = useDiagramData(
application, routeId, direction,
);
const zoom = useZoomPan();
const toolbar = useToolbarHover();
const contentWidth = totalWidth + PADDING * 2;
const contentHeight = totalHeight + PADDING * 2;
// Reset to 100% at top-left on first data load
useEffect(() => {
if (totalWidth > 0 && totalHeight > 0) {
zoom.resetView();
}
}, [totalWidth, totalHeight]); // eslint-disable-line react-hooks/exhaustive-deps
const handleNodeClick = useCallback(
(nodeId: string) => { onNodeSelect?.(nodeId); },
[onNodeSelect],
);
const handleNodeAction = useCallback(
(nodeId: string, action: import('./types').NodeAction) => {
if (action === 'inspect') onNodeSelect?.(nodeId);
onNodeAction?.(nodeId, action);
},
[onNodeSelect, onNodeAction],
);
const handleKeyDown = useCallback(
(e: React.KeyboardEvent) => {
if (e.key === 'Escape') {
onNodeSelect?.('');
return;
}
zoom.onKeyDown(e, contentWidth, contentHeight);
},
[onNodeSelect, zoom, contentWidth, contentHeight],
);
if (isLoading) {
return (
<div className={`${styles.container} ${className ?? ''}`}>
<div className={styles.loading}>Loading diagram...</div>
</div>
);
}
if (error) {
return (
<div className={`${styles.container} ${className ?? ''}`}>
<div className={styles.error}>Failed to load diagram</div>
</div>
);
}
if (sections.length === 0) {
return (
<div className={`${styles.container} ${className ?? ''}`}>
<div className={styles.loading}>No diagram data available</div>
</div>
);
}
const mainSection = sections[0];
const errorSections = sections.slice(1);
return (
<div
ref={zoom.containerRef}
className={`${styles.container} ${className ?? ''}`}
>
<svg
className={styles.svg}
onWheel={zoom.onWheel}
onPointerDown={zoom.onPointerDown}
onPointerMove={zoom.onPointerMove}
onPointerUp={zoom.onPointerUp}
onKeyDown={handleKeyDown}
tabIndex={0}
onClick={() => onNodeSelect?.('')}
>
<defs>
<marker
id="arrowhead"
markerWidth="8"
markerHeight="6"
refX="7"
refY="3"
orient="auto"
>
<polygon points="0 0, 8 3, 0 6" fill="#9CA3AF" />
</marker>
</defs>
<g style={{ transform: zoom.transform, transformOrigin: '0 0' }}>
{/* Main section top-level edges (not inside compounds) */}
<g className="edges">
{mainSection.edges.filter(e => topLevelEdge(e, mainSection.nodes)).map((edge, i) => (
<DiagramEdge key={`${edge.sourceId}-${edge.targetId}-${i}`} edge={edge} />
))}
</g>
{/* Main section nodes */}
<g className="nodes">
{mainSection.nodes.map(node => {
if (isCompoundType(node.type) && node.children && node.children.length > 0) {
return (
<CompoundNode
key={node.id}
node={node}
edges={mainSection.edges}
selectedNodeId={selectedNodeId}
hoveredNodeId={toolbar.hoveredNodeId}
nodeConfigs={nodeConfigs}
onNodeClick={handleNodeClick}
onNodeEnter={toolbar.onNodeEnter}
onNodeLeave={toolbar.onNodeLeave}
/>
);
}
return (
<DiagramNode
key={node.id}
node={node}
isHovered={toolbar.hoveredNodeId === node.id}
isSelected={selectedNodeId === node.id}
config={node.id ? nodeConfigs?.get(node.id) : undefined}
onClick={() => node.id && handleNodeClick(node.id)}
onMouseEnter={() => node.id && toolbar.onNodeEnter(node.id)}
onMouseLeave={toolbar.onNodeLeave}
/>
);
})}
</g>
{/* Toolbar rendered as HTML overlay below */}
{/* Error handler sections */}
{errorSections.map((section, i) => (
<ErrorSection
key={`error-${i}`}
section={section}
totalWidth={totalWidth}
selectedNodeId={selectedNodeId}
hoveredNodeId={toolbar.hoveredNodeId}
nodeConfigs={nodeConfigs}
onNodeClick={handleNodeClick}
onNodeEnter={toolbar.onNodeEnter}
onNodeLeave={toolbar.onNodeLeave}
/>
))}
</g>
</svg>
{/* Node toolbar — HTML overlay, fixed size regardless of zoom */}
{toolbar.hoveredNodeId && onNodeAction && (() => {
const hNode = findNodeById(sections, toolbar.hoveredNodeId!);
if (!hNode) return null;
// Convert SVG coordinates to screen-space using zoom transform
const nodeCenter = (hNode.x ?? 0) + (hNode.width ?? 160) / 2;
const nodeTop = hNode.y ?? 0;
const screenX = nodeCenter * zoom.state.scale + zoom.state.translateX;
const screenY = nodeTop * zoom.state.scale + zoom.state.translateY;
return (
<NodeToolbar
nodeId={toolbar.hoveredNodeId!}
screenX={screenX}
screenY={screenY}
onAction={handleNodeAction}
onMouseEnter={toolbar.onToolbarEnter}
onMouseLeave={toolbar.onToolbarLeave}
/>
);
})()}
<ZoomControls
onZoomIn={zoom.zoomIn}
onZoomOut={zoom.zoomOut}
onFitToView={() => zoom.fitToView(contentWidth, contentHeight)}
scale={zoom.state.scale}
/>
</div>
);
}
function findNodeById(
sections: import('./types').DiagramSection[],
nodeId: string,
): import('../../api/queries/diagrams').DiagramNode | undefined {
for (const section of sections) {
for (const node of section.nodes) {
if (node.id === nodeId) return node;
if (node.children) {
const found = findInChildren(node.children, nodeId);
if (found) return found;
}
}
}
return undefined;
}
function findInChildren(
nodes: import('../../api/queries/diagrams').DiagramNode[],
nodeId: string,
): import('../../api/queries/diagrams').DiagramNode | undefined {
for (const n of nodes) {
if (n.id === nodeId) return n;
if (n.children) {
const found = findInChildren(n.children, nodeId);
if (found) return found;
}
}
return undefined;
}
/** Returns true if the edge connects two top-level nodes (not inside any compound). */
function topLevelEdge(
edge: import('../../api/queries/diagrams').DiagramEdge,
nodes: import('../../api/queries/diagrams').DiagramNode[],
): boolean {
// Collect all IDs that are children of compound nodes (at any depth)
const compoundChildIds = new Set<string>();
for (const n of nodes) {
if (n.children && n.children.length > 0) {
collectDescendantIds(n.children, compoundChildIds);
}
}
return !compoundChildIds.has(edge.sourceId) && !compoundChildIds.has(edge.targetId);
}
function collectDescendantIds(
nodes: import('../../api/queries/diagrams').DiagramNode[],
set: Set<string>,
) {
for (const n of nodes) {
if (n.id) set.add(n.id);
if (n.children) collectDescendantIds(n.children, set);
}
}


@@ -0,0 +1,19 @@
import styles from './ProcessDiagram.module.css';
interface ZoomControlsProps {
onZoomIn: () => void;
onZoomOut: () => void;
onFitToView: () => void;
scale: number;
}
export function ZoomControls({ onZoomIn, onZoomOut, onFitToView, scale }: ZoomControlsProps) {
return (
<div className={styles.zoomControls}>
<button className={styles.zoomBtn} onClick={onZoomIn} title="Zoom in (+)">+</button>
<span className={styles.zoomLevel}>{Math.round(scale * 100)}%</span>
<button className={styles.zoomBtn} onClick={onZoomOut} title="Zoom out (-)">−</button>
<button className={styles.zoomBtn} onClick={onFitToView} title="Fit to view (0)">⤢</button>
</div>
);
}


@@ -0,0 +1,2 @@
export { ProcessDiagram } from './ProcessDiagram';
export type { ProcessDiagramProps, NodeAction, NodeConfig } from './types';


@@ -0,0 +1,95 @@
/** Maps backend NodeType strings to CSS color values using design system tokens. */
const ENDPOINT_COLOR = '#1A7F8E'; // --running
const PROCESSOR_COLOR = '#C6820E'; // --amber
const TARGET_COLOR = '#3D7C47'; // --success
const EIP_COLOR = '#7C3AED'; // --purple
const ERROR_COLOR = '#C0392B'; // --error
const CROSS_ROUTE_COLOR = '#06B6D4';
const DEFAULT_COLOR = '#9C9184'; // --text-muted
const TYPE_MAP: Record<string, string> = {
ENDPOINT: ENDPOINT_COLOR,
PROCESSOR: PROCESSOR_COLOR,
BEAN: PROCESSOR_COLOR,
LOG: PROCESSOR_COLOR,
SET_HEADER: PROCESSOR_COLOR,
SET_BODY: PROCESSOR_COLOR,
TRANSFORM: PROCESSOR_COLOR,
MARSHAL: PROCESSOR_COLOR,
UNMARSHAL: PROCESSOR_COLOR,
TO: TARGET_COLOR,
TO_DYNAMIC: TARGET_COLOR,
DIRECT: TARGET_COLOR,
SEDA: TARGET_COLOR,
EIP_CHOICE: EIP_COLOR,
EIP_WHEN: EIP_COLOR,
EIP_OTHERWISE: EIP_COLOR,
EIP_SPLIT: EIP_COLOR,
EIP_MULTICAST: EIP_COLOR,
EIP_LOOP: EIP_COLOR,
EIP_AGGREGATE: EIP_COLOR,
EIP_FILTER: EIP_COLOR,
EIP_RECIPIENT_LIST: EIP_COLOR,
EIP_ROUTING_SLIP: EIP_COLOR,
EIP_DYNAMIC_ROUTER: EIP_COLOR,
EIP_LOAD_BALANCE: EIP_COLOR,
EIP_THROTTLE: EIP_COLOR,
EIP_DELAY: EIP_COLOR,
EIP_IDEMPOTENT_CONSUMER: EIP_COLOR,
EIP_CIRCUIT_BREAKER: EIP_COLOR,
EIP_PIPELINE: EIP_COLOR,
ERROR_HANDLER: ERROR_COLOR,
ON_EXCEPTION: ERROR_COLOR,
TRY_CATCH: ERROR_COLOR,
DO_TRY: ERROR_COLOR,
DO_CATCH: ERROR_COLOR,
DO_FINALLY: ERROR_COLOR,
EIP_WIRE_TAP: CROSS_ROUTE_COLOR,
EIP_ENRICH: CROSS_ROUTE_COLOR,
EIP_POLL_ENRICH: CROSS_ROUTE_COLOR,
};
const COMPOUND_TYPES = new Set([
'EIP_CHOICE', 'EIP_WHEN', 'EIP_OTHERWISE',
'EIP_SPLIT', 'TRY_CATCH',
'DO_TRY', 'DO_CATCH', 'DO_FINALLY',
'EIP_LOOP', 'EIP_MULTICAST', 'EIP_AGGREGATE',
'ON_EXCEPTION', 'ERROR_HANDLER',
]);
const ERROR_COMPOUND_TYPES = new Set([
'ON_EXCEPTION', 'ERROR_HANDLER',
]);
export function colorForType(type: string | undefined): string {
if (!type) return DEFAULT_COLOR;
return TYPE_MAP[type] ?? DEFAULT_COLOR;
}
export function isCompoundType(type: string | undefined): boolean {
return !!type && COMPOUND_TYPES.has(type);
}
export function isErrorCompoundType(type: string | undefined): boolean {
return !!type && ERROR_COMPOUND_TYPES.has(type);
}
/** Icon character for a node type */
export function iconForType(type: string | undefined): string {
if (!type) return '\u2699'; // gear
const t = type.toUpperCase();
if (t === 'ENDPOINT') return '\u25B6'; // play
if (t === 'TO' || t === 'TO_DYNAMIC' || t === 'DIRECT' || t === 'SEDA') return '\u25A0'; // square
if (t.startsWith('EIP_CHOICE') || t === 'EIP_WHEN' || t === 'EIP_OTHERWISE') return '\u25C6'; // diamond
if (t === 'ON_EXCEPTION' || t === 'ERROR_HANDLER' || t.startsWith('TRY') || t.startsWith('DO_')) return '\u26A0'; // warning
if (t === 'EIP_SPLIT' || t === 'EIP_MULTICAST') return '\u2442'; // fork
if (t === 'EIP_LOOP') return '\u21BA'; // loop arrow
if (t === 'EIP_WIRE_TAP' || t === 'EIP_ENRICH' || t === 'EIP_POLL_ENRICH') return '\u2197'; // arrow
return '\u2699'; // gear
}
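Both `colorForType` and `isCompoundType` follow the same guard-then-fallback shape: an `undefined` type short-circuits, and unknown types fall back to a default rather than throwing. A trimmed, self-contained sketch (only a subset of the type table copied from the source):

```typescript
// Subset of the color table from node-colors.ts.
const EIP_COLOR = '#7C3AED';     // --purple
const DEFAULT_COLOR = '#9C9184'; // --text-muted
const TYPE_MAP: Record<string, string> = { EIP_CHOICE: EIP_COLOR };
const COMPOUND_TYPES = new Set(['EIP_CHOICE', 'EIP_WHEN', 'EIP_OTHERWISE']);

// Unknown or missing types map to the muted default color.
function colorForType(type: string | undefined): string {
  if (!type) return DEFAULT_COLOR;
  return TYPE_MAP[type] ?? DEFAULT_COLOR;
}

// Missing type is never compound; otherwise consult the set.
function isCompoundType(type: string | undefined): boolean {
  return !!type && COMPOUND_TYPES.has(type);
}

console.log(colorForType('EIP_CHOICE'));   // '#7C3AED'
console.log(colorForType('NOT_MAPPED'));   // '#9C9184' (fallback)
console.log(isCompoundType(undefined));    // false
```

The fallback keeps the renderer total over new backend `NodeType` values: an unmapped type still draws, just in the default color and as a leaf.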
