takt: github-issue-207-previous-response-source-path (#210)
This commit is contained in:
parent e6ccebfe18
commit 9c4408909d

README.md (28 changed lines)
@@ -230,7 +230,7 @@ takt list --non-interactive --format json
 1. Run `takt add` and confirm a pending record is created in `.takt/tasks.yaml`.
 2. Open the generated `.takt/tasks/{slug}/order.md` and add detailed specifications/references as needed.
 3. Run `takt run` (or `takt watch`) to execute pending tasks from `tasks.yaml`.
-4. Verify outputs in `.takt/reports/{slug}/` using the same slug as `task_dir`.
+4. Verify outputs in `.takt/runs/{slug}/reports/` using the same slug as `task_dir`.

 ### Pipeline Mode (for CI/Automation)

@@ -541,12 +541,12 @@ The model string is passed to the Codex SDK. If unspecified, defaults to `codex`
 ├── config.yaml       # Project config (current piece, etc.)
 ├── tasks/            # Task input directories (.takt/tasks/{slug}/order.md, etc.)
 ├── tasks.yaml        # Pending tasks metadata (task_dir, piece, worktree, etc.)
-├── reports/          # Execution reports (auto-generated)
-│   └── {timestamp}-{slug}/
-└── logs/             # NDJSON format session logs
-    ├── latest.json   # Pointer to current/latest session
-    ├── previous.json # Pointer to previous session
-    └── {sessionId}.jsonl # NDJSON session log per piece execution
+└── runs/             # Run-scoped artifacts
+    └── {slug}/
+        ├── reports/  # Execution reports (auto-generated)
+        ├── context/  # knowledge/policy/previous_response snapshots
+        ├── logs/     # NDJSON session logs for this run
+        └── meta.json # Run metadata
 ```

 Builtin resources are embedded in the npm package (`builtins/`). User files in `~/.takt/` take priority.
@@ -646,8 +646,9 @@ TAKT stores task metadata in `.takt/tasks.yaml`, and each task's long specificat
 schema.sql
 wireframe.png
 tasks.yaml
-reports/
+runs/
+  20260201-015714-foptng/
+    reports/
 ```

 **tasks.yaml record**:
@@ -680,15 +681,14 @@ Clones are ephemeral. After task completion, they auto-commit + push, then delet

 ### Session Logs

-TAKT writes session logs in NDJSON (`.jsonl`) format to `.takt/logs/`. Each record is atomically appended, so partial logs are preserved even if the process crashes, and you can track in real-time with `tail -f`.
+TAKT writes session logs in NDJSON (`.jsonl`) format to `.takt/runs/{slug}/logs/`. Each record is atomically appended, so partial logs are preserved even if the process crashes, and you can track in real time with `tail -f`.

-- `.takt/logs/latest.json` - Pointer to current (or latest) session
-- `.takt/logs/previous.json` - Pointer to previous session
-- `.takt/logs/{sessionId}.jsonl` - NDJSON session log per piece execution
+- `.takt/runs/{slug}/logs/{sessionId}.jsonl` - NDJSON session log per piece execution
+- `.takt/runs/{slug}/meta.json` - Run metadata (`task`, `piece`, `start/end`, `status`, etc.)

 Record types: `piece_start`, `step_start`, `step_complete`, `piece_complete`, `piece_abort`

-Agents can read `previous.json` to inherit context from the previous execution. Session continuation is automatic — just run `takt "task"` to continue from the previous session.
+The latest previous response is stored at `.takt/runs/{slug}/context/previous_responses/latest.md` and inherited automatically. Session continuation is automatic — just run `takt "task"` to continue from the previous session.
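The run-directory layout described above can be exercised by hand; the sketch below mocks up a finished run and reads the inherited snapshot the way an agent would (the slug and file content are hypothetical, not taken from a real run):

```shell
# Sketch, assuming the documented layout; slug and content are made up.
slug="demo-run"
run_dir=".takt/runs/${slug}"

# Mimic what a completed run leaves behind.
mkdir -p "${run_dir}/context/previous_responses" "${run_dir}/logs"
printf 'implementation summary' > "${run_dir}/context/previous_responses/latest.md"

# An agent inherits context by reading latest.md.
cat "${run_dir}/context/previous_responses/latest.md"
```

For a live run, `tail -f .takt/runs/{slug}/logs/*.jsonl` follows the NDJSON records as they are appended.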
### Adding Custom Pieces
@@ -757,7 +757,7 @@ Variables available in `instruction_template`:
 | `{movement_iteration}` | Per-movement iteration count (times this movement has been executed) |
 | `{previous_response}` | Output from previous movement (auto-injected if not in template) |
 | `{user_inputs}` | Additional user inputs during piece (auto-injected if not in template) |
-| `{report_dir}` | Report directory path (e.g., `.takt/reports/20250126-143052-task-summary`) |
+| `{report_dir}` | Report directory path (e.g., `.takt/runs/20250126-143052-task-summary/reports`) |
 | `{report:filename}` | Expands to `{report_dir}/filename` (e.g., `{report:00-plan.md}`) |
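Given the expansion rules in the table above, `{report:filename}` resolution can be sketched in shell (the directory value is illustrative; at run time the engine supplies `{report_dir}`):

```shell
# Illustrative value; the engine supplies the real {report_dir} at run time.
report_dir=".takt/runs/20250126-143052-task-summary/reports"

# {report:00-plan.md} expands to {report_dir}/00-plan.md:
expanded="${report_dir}/00-plan.md"
echo "${expanded}"
```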
### Piece Design
@@ -84,7 +84,7 @@ InstructionBuilder expands `{variable}` placeholders inside instruction_template
 | `{iteration}` | Iteration count for the whole piece |
 | `{max_iterations}` | Maximum iteration count |
 | `{movement_iteration}` | Per-movement iteration count |
-| `{report_dir}` | Report directory name |
+| `{report_dir}` | Report directory name (`.takt/runs/{slug}/reports`) |
 | `{report:filename}` | Expands to the content of the specified report (when the file exists) |
 | `{cycle_count}` | Number of cycles detected by the loop monitor (`loop_monitors` only) |

@@ -222,7 +222,7 @@ InstructionBuilder expands `{variable}` placeholders inside instruction_template

 # Not allowed
 **Reports to reference:**
-- .takt/reports/20250101-task/ai-review.md ← hardcoded path
+- .takt/runs/20250101-task/reports/ai-review.md ← hardcoded path
 ```

 ---

@@ -157,7 +157,7 @@

 1. **Detailed policy rules**: details such as code examples, judgment criteria, and exception lists are the policy's responsibility (a one-line behavioral guideline may appear in the behavioral stance)
 2. **Piece-specific concepts**: movement names, report file names, inter-step routing
-3. **Tool-specific environment info**: directory paths such as `.takt/reports/`, template variables (`{report_dir}`, etc.)
+3. **Tool-specific environment info**: directory paths such as `.takt/runs/`, template variables (`{report_dir}`, etc.)
 4. **Execution procedures**: sequences like "first read X, then run Y" are instruction_template's responsibility

 ### Exception: Duplication as Domain Knowledge

@@ -100,7 +100,7 @@

 1. **Agent-specific knowledge**: e.g., detection techniques used only by the Architecture Reviewer
 2. **Piece-specific concepts**: movement names, report file names
-3. **Tool-specific paths**: concrete directory paths such as `.takt/reports/`
+3. **Tool-specific paths**: concrete directory paths such as `.takt/runs/`
 4. **Execution procedures**: which files to read, what to run, etc.

 ---

@@ -1,6 +1,6 @@
 # Temporary files
 logs/
-reports/
+runs/
 completed/
 tasks/
 worktrees/

@@ -38,7 +38,7 @@ Fields:

 - `takt add` creates `.takt/tasks/{slug}/order.md` automatically.
 - `takt run` and `takt watch` read `.takt/tasks.yaml` and resolve `task_dir`.
-- Report output is written to `.takt/reports/{slug}/`.
+- Report output is written to `.takt/runs/{slug}/reports/`.
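The path resolution above can be sanity-checked in shell (the slug is hypothetical; substitute a real `task_dir` value from `.takt/tasks.yaml`):

```shell
# Hypothetical slug taken from a tasks.yaml task_dir entry.
slug="add-login-page"

# After `takt run`, reports for this task land here:
report_dir=".takt/runs/${slug}/reports"
echo "${report_dir}"
```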
## Commands
@@ -116,7 +116,15 @@ Call the TeamCreate tool:
 - `permission_mode = the permission mode parsed from the command ("bypassPermissions" / "acceptEdits" / "default")`
 - `movement_history = []` (transition history, for the Loop Monitor)

-**Report directory**: if any movement has a `report` field, create `.takt/reports/{YYYYMMDD-HHmmss}-{slug}/` and keep the path in the `report_dir` variable.
+**Run directory**: if any movement has a `report` field, create `.takt/runs/{YYYYMMDD-HHmmss}-{slug}/` and place the following under it:
+- `reports/` (report output)
+- `context/knowledge/` (Knowledge snapshots)
+- `context/policy/` (Policy snapshots)
+- `context/previous_responses/` (Previous Response history + `latest.md`)
+- `logs/` (execution logs)
+- `meta.json` (run metadata)
+
+Keep the report output path as the `report_dir` variable (`.takt/runs/{slug}/reports`).

 Then proceed to **Step 5**.

@@ -148,7 +148,7 @@ Load the instruction template file from the movement's `instruction:` key
 | `{iteration}` | Piece-wide iteration count (1-based) |
 | `{max_iterations}` | The piece's max_iterations value |
 | `{movement_iteration}` | Number of times this movement has been executed (1-based) |
-| `{report_dir}` | Report directory path |
+| `{report_dir}` | Report directory path (`.takt/runs/{slug}/reports`) |
 | `{report:filename}` | Content of the specified report file (fetched with Read) |

 ### Handling {report:filename}
@@ -212,7 +212,10 @@ report:
 Extract the report content from the teammate's output and save it to the report directory with the Write tool.
 **The Team Lead (you) performs this step.** Do it after receiving the teammate's output.

-**Report directory**: create it as `.takt/reports/{timestamp}-{slug}/`.
+**Run directory**: create it as `.takt/runs/{timestamp}-{slug}/`.
+- Save reports to `.takt/runs/{timestamp}-{slug}/reports/`.
+- Save `Knowledge` / `Policy` / `Previous Response` under `.takt/runs/{timestamp}-{slug}/context/`.
+- The latest previous response is `.takt/runs/{timestamp}-{slug}/context/previous_responses/latest.md`.
 - `{timestamp}`: `YYYYMMDD-HHmmss` format
 - `{slug}`: the first 30 characters of the task content, slugified

@@ -358,17 +361,24 @@ loop_monitors:
   d. Evaluate the judge's output against the judge's `rules`
   e. Transition to the matched rule's `next` (this overrides normal rule evaluation)

-## Report Management
+## Run Artifact Management

-### Creating the Report Directory
+### Creating the Run Directory

-Create the report directory when piece execution starts:
+Create the run directory when piece execution starts:

 ```
-.takt/reports/{YYYYMMDD-HHmmss}-{slug}/
+.takt/runs/{YYYYMMDD-HHmmss}-{slug}/
+  reports/
+  context/
+    knowledge/
+    policy/
+    previous_responses/
+  logs/
+  meta.json
 ```

-Make this path available to all movements as the `{report_dir}` variable.
+Make the `reports/` path within it available to all movements as the `{report_dir}` variable.

 ### Saving Reports

@@ -392,7 +402,7 @@ loop_monitors:
 ↓
 Create the team with TeamCreate
 ↓
-Create the report directory
+Create the run directory
 ↓
 Get initial_movement
 ↓

@@ -230,7 +230,7 @@ takt list --non-interactive --format json
 1. Run `takt add` and confirm a pending record is created in `.takt/tasks.yaml`.
 2. Open the generated `.takt/tasks/{slug}/order.md` and add specifications or reference material as needed.
 3. Run `takt run` (or `takt watch`) to execute the pending tasks in `tasks.yaml`.
-4. Check `.takt/reports/{slug}/` with the same slug as `task_dir`.
+4. Check `.takt/runs/{slug}/reports/` with the same slug as `task_dir`.

 ### Pipeline Mode (for CI/Automation)

@@ -541,12 +541,12 @@ Claude Code accepts aliases (`opus`, `sonnet`, `haiku`, `opusplan`, `def
 ├── config.yaml       # Project config (current piece, etc.)
 ├── tasks/            # Task input directories (.takt/tasks/{slug}/order.md, etc.)
 ├── tasks.yaml        # Pending task metadata (task_dir, piece, worktree, etc.)
-├── reports/          # Execution reports (auto-generated)
-│   └── {timestamp}-{slug}/
-└── logs/             # NDJSON-format session logs
-    ├── latest.json   # Pointer to the current/latest session
-    ├── previous.json # Pointer to the previous session
-    └── {sessionId}.jsonl # NDJSON session log per piece execution
+└── runs/             # Run-scoped artifacts
+    └── {slug}/
+        ├── reports/  # Execution reports (auto-generated)
+        ├── context/  # knowledge/policy/previous_response snapshots
+        ├── logs/     # NDJSON session logs for this run
+        └── meta.json # Run metadata
 ```

 Builtin resources are embedded in the npm package (`builtins/`). User files in `~/.takt/` take priority.
@@ -646,8 +646,9 @@ TAKT stores task metadata in `.takt/tasks.yaml` and long-form
 schema.sql
 wireframe.png
 tasks.yaml
-reports/
+runs/
+  20260201-015714-foptng/
+    reports/
 ```

 **tasks.yaml record example**:
@@ -680,15 +681,14 @@ Specifying `worktree` in a YAML task file runs each task in a `git c

 ### Session Logs

-TAKT writes session logs in NDJSON (`.jsonl`) format to `.takt/logs/`. Each record is appended atomically, so partial logs survive even if the process crashes mid-run, and you can follow them in real time with `tail -f`.
+TAKT writes session logs in NDJSON (`.jsonl`) format to `.takt/runs/{slug}/logs/`. Each record is appended atomically, so partial logs survive even if the process crashes mid-run, and you can follow them in real time with `tail -f`.

-- `.takt/logs/latest.json` - Pointer to the current (or latest) session
-- `.takt/logs/previous.json` - Pointer to the previous session
-- `.takt/logs/{sessionId}.jsonl` - NDJSON session log per piece execution
+- `.takt/runs/{slug}/logs/{sessionId}.jsonl` - NDJSON session log per piece execution
+- `.takt/runs/{slug}/meta.json` - Run metadata (`task`, `piece`, `start/end`, `status`, etc.)

 Record types: `piece_start`, `step_start`, `step_complete`, `piece_complete`, `piece_abort`

-Agents can read `previous.json` to inherit the previous execution's context. Session continuation is automatic — just run `takt "task"` to continue from the previous session.
+The latest previous response is saved to `.takt/runs/{slug}/context/previous_responses/latest.md` and inherited automatically at execution time.

 ### Adding Custom Pieces

@@ -757,7 +757,7 @@ personas:
 | `{movement_iteration}` | Per-movement iteration count (times this movement has been executed) |
 | `{previous_response}` | Output of the previous movement (auto-injected if not in the template) |
 | `{user_inputs}` | Additional user input during the piece (auto-injected if not in the template) |
-| `{report_dir}` | Report directory path (e.g., `.takt/reports/20250126-143052-task-summary`) |
+| `{report_dir}` | Report directory path (e.g., `.takt/runs/20250126-143052-task-summary/reports`) |
 | `{report:filename}` | Expands to `{report_dir}/filename` (e.g., `{report:00-plan.md}`) |

 ### Piece Design

@@ -431,7 +431,7 @@ TAKT's data flow consists of the following seven main layers
 2. **Log initialization**:
    - `createSessionLog()`: create the session log object
    - `initNdjsonLog()`: initialize the NDJSON-format log file
-   - `updateLatestPointer()`: update the `latest.json` pointer
+   - `meta.json` update: save the execution status (running/completed/aborted) and timestamps

 3. **PieceEngine initialization**:
 ```typescript
@@ -619,6 +619,7 @@ const match = await detectMatchedRule(step, response.content, tagContent, {...})
 - Step Iteration (per-step)
 - Step name
 - Report Directory/File info
+- Run Source Paths (`.takt/runs/{slug}/context/...`)

 3. **User Request** (the task body):
    - Auto-injected only when the `{task}` placeholder is absent from the template
@@ -626,6 +627,8 @@ const match = await detectMatchedRule(step, response.content, tagContent, {...})
 4. **Previous Response** (the previous step's output):
    - Auto-injected only when `step.passPreviousResponse === true` and
    - the `{previous_response}` placeholder is absent from the template
    - Length control (2000 chars) and `...TRUNCATED...` are applied
+   - The Source Path is always injected

 5. **Additional User Inputs** (extra input while blocked):
    - Auto-injected only when the `{user_inputs}` placeholder is absent from the template

@@ -59,7 +59,7 @@ steps:
 | `{step_iteration}` | Per-step iteration count (how many times THIS step has run) |
 | `{previous_response}` | Previous step's output (auto-injected if not in template) |
 | `{user_inputs}` | Additional user inputs during piece (auto-injected if not in template) |
-| `{report_dir}` | Report directory path (e.g., `.takt/reports/20250126-143052-task-summary`) |
+| `{report_dir}` | Report directory path (e.g., `.takt/runs/20250126-143052-task-summary/reports`) |
 | `{report:filename}` | Resolves to `{report_dir}/filename` (e.g., `{report:00-plan.md}`) |

 > **Note**: `{task}`, `{previous_response}`, and `{user_inputs}` are auto-injected into instructions. You only need explicit placeholders if you want to control their position in the template.

@@ -63,7 +63,7 @@ describe('debug logging', () => {
     }
   });

-  it('should write debug log to project .takt/logs/ directory', () => {
+  it('should write debug log to project .takt/runs/*/logs/ directory', () => {
     const projectDir = join(tmpdir(), 'takt-test-debug-project-' + Date.now());
     mkdirSync(projectDir, { recursive: true });

@@ -71,7 +71,9 @@ describe('debug logging', () => {
       initDebugLogger({ enabled: true }, projectDir);
       const logFile = getDebugLogFile();
       expect(logFile).not.toBeNull();
-      expect(logFile!).toContain(join(projectDir, '.takt', 'logs'));
+      expect(logFile!).toContain(join(projectDir, '.takt', 'runs'));
+      expect(logFile!).toContain(`${join(projectDir, '.takt', 'runs')}/`);
+      expect(logFile!).toContain('/logs/');
       expect(logFile!).toMatch(/debug-.*\.log$/);
       expect(existsSync(logFile!)).toBe(true);
     } finally {
@@ -86,7 +88,8 @@ describe('debug logging', () => {
     try {
       initDebugLogger({ enabled: true }, projectDir);
       const promptsLogFile = resolvePromptsLogFilePath();
-      expect(promptsLogFile).toContain(join(projectDir, '.takt', 'logs'));
+      expect(promptsLogFile).toContain(join(projectDir, '.takt', 'runs'));
+      expect(promptsLogFile).toContain('/logs/');
       expect(promptsLogFile).toMatch(/debug-.*-prompts\.jsonl$/);
       expect(existsSync(promptsLogFile)).toBe(true);
     } finally {

@@ -5,7 +5,7 @@
 */

 import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
-import { writeFileSync, mkdirSync } from 'node:fs';
+import { writeFileSync, mkdirSync, readFileSync, readdirSync } from 'node:fs';
 import { join } from 'node:path';

 // Mock external dependencies before importing
@@ -94,6 +94,7 @@ function buildArpeggioPieceConfig(arpeggioConfig: ArpeggioMovementConfig, tmpDir
 function createEngineOptions(tmpDir: string): PieceEngineOptions {
   return {
     projectCwd: tmpDir,
+    reportDirName: 'test-report-dir',
     detectRuleIndex: () => 0,
     callAiJudge: async () => 0,
   };
@@ -142,6 +143,12 @@ describe('ArpeggioRunner integration', () => {
     const output = state.movementOutputs.get('process');
     expect(output).toBeDefined();
     expect(output!.content).toBe('Processed Alice\nProcessed Bob\nProcessed Charlie');
+
+    const previousDir = join(tmpDir, '.takt', 'runs', 'test-report-dir', 'context', 'previous_responses');
+    const previousFiles = readdirSync(previousDir);
+    expect(state.previousResponseSourcePath).toMatch(/^\.takt\/runs\/test-report-dir\/context\/previous_responses\/process\.1\.\d{8}T\d{6}Z\.md$/);
+    expect(previousFiles).toContain('latest.md');
+    expect(readFileSync(join(previousDir, 'latest.md'), 'utf-8')).toBe('Processed Alice\nProcessed Bob\nProcessed Charlie');
   });

   it('should handle batch_size > 1', async () => {

@@ -8,7 +8,8 @@
 */

 import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest';
-import { existsSync, rmSync } from 'node:fs';
+import { existsSync, rmSync, readdirSync, readFileSync } from 'node:fs';
 import { join } from 'node:path';

 // --- Mock setup (must be before imports that use these modules) ---

@@ -128,6 +129,46 @@ describe('PieceEngine Integration: Parallel Movement Aggregation', () => {
     expect(state.movementOutputs.get('security-review')!.content).toBe('Sec content');
   });

+  it('should persist aggregated previous_response snapshot for parallel parent movement', async () => {
+    const config = buildDefaultPieceConfig();
+    const engine = new PieceEngine(config, tmpDir, 'test task', { projectCwd: tmpDir });
+
+    mockRunAgentSequence([
+      makeResponse({ persona: 'plan', content: 'Plan' }),
+      makeResponse({ persona: 'implement', content: 'Impl' }),
+      makeResponse({ persona: 'ai_review', content: 'OK' }),
+      makeResponse({ persona: 'arch-review', content: 'Arch content' }),
+      makeResponse({ persona: 'security-review', content: 'Sec content' }),
+      makeResponse({ persona: 'supervise', content: 'Pass' }),
+    ]);
+
+    mockDetectMatchedRuleSequence([
+      { index: 0, method: 'phase1_tag' },
+      { index: 0, method: 'phase1_tag' },
+      { index: 0, method: 'phase1_tag' },
+      { index: 0, method: 'phase1_tag' },
+      { index: 0, method: 'phase1_tag' },
+      { index: 0, method: 'aggregate' },
+      { index: 0, method: 'phase1_tag' },
+    ]);
+
+    const state = await engine.run();
+    const reviewersOutput = state.movementOutputs.get('reviewers')!.content;
+    const previousDir = join(tmpDir, '.takt', 'runs', 'test-report-dir', 'context', 'previous_responses');
+    const previousFiles = readdirSync(previousDir);
+
+    expect(state.previousResponseSourcePath).toMatch(/^\.takt\/runs\/test-report-dir\/context\/previous_responses\/supervise\.1\.\d{8}T\d{6}Z\.md$/);
+    expect(previousFiles).toContain('latest.md');
+    expect(previousFiles.some((name) => /^reviewers\.1\.\d{8}T\d{6}Z\.md$/.test(name))).toBe(true);
+    expect(readFileSync(join(previousDir, 'latest.md'), 'utf-8')).toBe('Pass');
+    expect(
+      previousFiles.some((name) => {
+        if (!/^reviewers\.1\.\d{8}T\d{6}Z\.md$/.test(name)) return false;
+        return readFileSync(join(previousDir, name), 'utf-8') === reviewersOutput;
+      })
+    ).toBe(true);
+  });

   it('should execute sub-movements concurrently (both runAgent calls happen)', async () => {
     const config = buildDefaultPieceConfig();
     const engine = new PieceEngine(config, tmpDir, 'test task', { projectCwd: tmpDir });

@@ -15,7 +15,7 @@ import type { PieceMovement, OutputContractItem, OutputContractLabelPath, Output
 * Extracted emitMovementReports logic for unit testing.
 * Mirrors engine.ts emitMovementReports + emitIfReportExists.
 *
-* reportDir already includes the `.takt/reports/` prefix (set by engine constructor).
+* reportDir already includes the `.takt/runs/{slug}/reports` path (set by engine constructor).
 */
function emitMovementReports(
  emitter: EventEmitter,

@@ -59,8 +59,8 @@ function createMovement(overrides: Partial<PieceMovement> = {}): PieceMovement {
 describe('emitMovementReports', () => {
   let tmpDir: string;
   let reportBaseDir: string;
-  // reportDir now includes .takt/reports/ prefix (matches engine constructor behavior)
-  const reportDirName = '.takt/reports/test-report-dir';
+  // reportDir now includes .takt/runs/{slug}/reports path (matches engine constructor behavior)
+  const reportDirName = '.takt/runs/test-report-dir/reports';

   beforeEach(() => {
     tmpDir = join(tmpdir(), `takt-report-test-${Date.now()}`);

@@ -154,13 +154,17 @@ export function mockDetectMatchedRuleSequence(matches: (RuleMatch | undefined)[]
 // --- Test environment setup ---

 /**
- * Create a temporary directory with the required .takt/reports structure.
+ * Create a temporary directory with the required .takt/runs structure.
  * Returns the tmpDir path. Caller is responsible for cleanup.
  */
 export function createTestTmpDir(): string {
   const tmpDir = join(tmpdir(), `takt-engine-test-${randomUUID()}`);
   mkdirSync(tmpDir, { recursive: true });
-  mkdirSync(join(tmpDir, '.takt', 'reports', 'test-report-dir'), { recursive: true });
+  mkdirSync(join(tmpDir, '.takt', 'runs', 'test-report-dir', 'reports'), { recursive: true });
+  mkdirSync(join(tmpDir, '.takt', 'runs', 'test-report-dir', 'context', 'knowledge'), { recursive: true });
+  mkdirSync(join(tmpDir, '.takt', 'runs', 'test-report-dir', 'context', 'policy'), { recursive: true });
+  mkdirSync(join(tmpDir, '.takt', 'runs', 'test-report-dir', 'context', 'previous_responses'), { recursive: true });
+  mkdirSync(join(tmpDir, '.takt', 'runs', 'test-report-dir', 'logs'), { recursive: true });
   return tmpDir;
 }

@@ -178,8 +182,21 @@ export function applyDefaultMocks(): void {
 * Clean up PieceEngine instances to prevent EventEmitter memory leaks.
 * Call this in afterEach to ensure all event listeners are removed.
 */
-export function cleanupPieceEngine(engine: any): void {
-  if (engine && typeof engine.removeAllListeners === 'function') {
+type ListenerCleanupTarget = {
+  removeAllListeners: () => void;
+};
+
+function isListenerCleanupTarget(value: unknown): value is ListenerCleanupTarget {
+  return (
+    typeof value === 'object' &&
+    value !== null &&
+    'removeAllListeners' in value &&
+    typeof value.removeAllListeners === 'function'
+  );
+}
+
+export function cleanupPieceEngine(engine: unknown): void {
+  if (isListenerCleanupTarget(engine)) {
     engine.removeAllListeners();
   }
 }

@@ -6,7 +6,7 @@
 */

 import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest';
-import { existsSync, rmSync, mkdirSync } from 'node:fs';
+import { existsSync, rmSync, mkdirSync, readdirSync, readFileSync } from 'node:fs';
 import { join } from 'node:path';
 import { tmpdir } from 'node:os';
 import { randomUUID } from 'node:crypto';
@@ -51,11 +51,11 @@ function createWorktreeDirs(): { projectCwd: string; cloneCwd: string } {
   const projectCwd = join(base, 'project');
   const cloneCwd = join(base, 'clone');

-  // Project side: real .takt/reports directory (for non-worktree tests)
-  mkdirSync(join(projectCwd, '.takt', 'reports', 'test-report-dir'), { recursive: true });
+  // Project side: real .takt/runs directory (for non-worktree tests)
+  mkdirSync(join(projectCwd, '.takt', 'runs', 'test-report-dir', 'reports'), { recursive: true });

-  // Clone side: .takt/reports directory (reports now written directly to clone)
-  mkdirSync(join(cloneCwd, '.takt', 'reports', 'test-report-dir'), { recursive: true });
+  // Clone side: .takt/runs directory (reports now written directly to clone)
+  mkdirSync(join(cloneCwd, '.takt', 'runs', 'test-report-dir', 'reports'), { recursive: true });

   return { projectCwd, cloneCwd };
 }
@@ -121,8 +121,8 @@ describe('PieceEngine: worktree reportDir resolution', () => {

     // reportDir should be resolved from cloneCwd (cwd), not projectCwd
     // This prevents agents from discovering the main repository path via instruction
-    const expectedPath = join(cloneCwd, '.takt/reports/test-report-dir');
-    const unexpectedPath = join(projectCwd, '.takt/reports/test-report-dir');
+    const expectedPath = join(cloneCwd, '.takt/runs/test-report-dir/reports');
+    const unexpectedPath = join(projectCwd, '.takt/runs/test-report-dir/reports');

     expect(phaseCtx.reportDir).toBe(expectedPath);
     expect(phaseCtx.reportDir).not.toBe(unexpectedPath);
@@ -166,10 +166,10 @@ describe('PieceEngine: worktree reportDir resolution', () => {
     expect(runAgentMock).toHaveBeenCalled();
     const instruction = runAgentMock.mock.calls[0][1] as string;

-    const expectedPath = join(cloneCwd, '.takt/reports/test-report-dir');
+    const expectedPath = join(cloneCwd, '.takt/runs/test-report-dir/reports');
     expect(instruction).toContain(expectedPath);
     // In worktree mode, projectCwd path should NOT appear in instruction
-    expect(instruction).not.toContain(join(projectCwd, '.takt/reports/test-report-dir'));
+    expect(instruction).not.toContain(join(projectCwd, '.takt/runs/test-report-dir/reports'));
   });

   it('should use same path in non-worktree mode (cwd === projectCwd)', async () => {
@@ -195,7 +195,7 @@ describe('PieceEngine: worktree reportDir resolution', () => {
     expect(reportPhaseMock).toHaveBeenCalled();
     const phaseCtx = reportPhaseMock.mock.calls[0][2] as { reportDir: string };

-    const expectedPath = join(normalDir, '.takt/reports/test-report-dir');
+    const expectedPath = join(normalDir, '.takt/runs/test-report-dir/reports');
     expect(phaseCtx.reportDir).toBe(expectedPath);
   });

@@ -219,7 +219,7 @@ describe('PieceEngine: worktree reportDir resolution', () => {
     const reportPhaseMock = vi.mocked(runReportPhase);
     expect(reportPhaseMock).toHaveBeenCalled();
     const phaseCtx = reportPhaseMock.mock.calls[0][2] as { reportDir: string };
-    expect(phaseCtx.reportDir).toBe(join(normalDir, '.takt/reports/20260201-015714-foptng'));
+    expect(phaseCtx.reportDir).toBe(join(normalDir, '.takt/runs/20260201-015714-foptng/reports'));
   });

   it('should reject invalid explicit reportDirName', () => {
@@ -241,4 +241,54 @@ describe('PieceEngine: worktree reportDir resolution', () => {
       reportDirName: '',
     })).toThrow('Invalid reportDirName: ');
   });

+  it('should persist context snapshots and update latest previous response', async () => {
+    const normalDir = projectCwd;
+    const config: PieceConfig = {
+      name: 'snapshot-test',
+      description: 'Test',
+      maxIterations: 10,
+      initialMovement: 'implement',
+      movements: [
+        makeMovement('implement', {
+          policyContents: ['Policy content'],
+          knowledgeContents: ['Knowledge content'],
+          rules: [makeRule('go-review', 'review')],
+        }),
+        makeMovement('review', {
+          rules: [makeRule('approved', 'COMPLETE')],
+        }),
+      ],
+    };
+    const engine = new PieceEngine(config, normalDir, 'test task', {
+      projectCwd: normalDir,
+      reportDirName: 'test-report-dir',
+    });
+
+    mockRunAgentSequence([
+      makeResponse({ persona: 'implement', content: 'implement output' }),
+      makeResponse({ persona: 'review', content: 'review output' }),
+    ]);
+    mockDetectMatchedRuleSequence([
+      { index: 0, method: 'tag' as const },
+      { index: 0, method: 'tag' as const },
+    ]);
+
+    await engine.run();
+
+    const base = join(normalDir, '.takt', 'runs', 'test-report-dir', 'context');
+    const knowledgeDir = join(base, 'knowledge');
+    const policyDir = join(base, 'policy');
+    const previousResponsesDir = join(base, 'previous_responses');
+
+    const knowledgeFiles = readdirSync(knowledgeDir);
+    const policyFiles = readdirSync(policyDir);
+    const previousResponseFiles = readdirSync(previousResponsesDir);
+
+    expect(knowledgeFiles.some((name) => name.endsWith('.md'))).toBe(true);
+    expect(policyFiles.some((name) => name.endsWith('.md'))).toBe(true);
+    expect(previousResponseFiles).toContain('latest.md');
+    expect(previousResponseFiles.filter((name) => name.endsWith('.md')).length).toBe(3);
+    expect(readFileSync(join(previousResponsesDir, 'latest.md'), 'utf-8')).toBe('review output');
+  });
 });

@@ -112,6 +112,23 @@ describe('replaceTemplatePlaceholders', () => {
     expect(result).toBe('Previous: previous output text');
   });

+  it('should prefer preprocessed previous response text when provided', () => {
+    const step = makeMovement({ passPreviousResponse: true });
+    const ctx = makeContext({
+      previousOutput: {
+        persona: 'coder',
+        status: 'done',
+        content: 'raw previous output',
+        timestamp: new Date(),
+      },
+      previousResponseText: 'processed previous output',
+    });
+    const template = 'Previous: {previous_response}';
+
+    const result = replaceTemplatePlaceholders(template, step, ctx);
+    expect(result).toBe('Previous: processed previous output');
+  });

   it('should replace {previous_response} with empty string when no previous output', () => {
     const step = makeMovement({ passPreviousResponse: true });
     const ctx = makeContext();

@@ -128,13 +128,13 @@ describe('instruction-builder', () => {
     );
     const context = createMinimalContext({
       cwd: '/project',
-      reportDir: '/project/.takt/reports/20260128-test-report',
+      reportDir: '/project/.takt/runs/20260128-test-report/reports',
     });

     const result = buildInstruction(step, context);

     expect(result).toContain(
-      '- Report Directory: /project/.takt/reports/20260128-test-report/'
+      '- Report Directory: /project/.takt/runs/20260128-test-report/reports/'
     );
   });

@@ -145,14 +145,14 @@ describe('instruction-builder', () => {
     const context = createMinimalContext({
       cwd: '/clone/my-task',
       projectCwd: '/project',
-      reportDir: '/project/.takt/reports/20260128-worktree-report',
+      reportDir: '/project/.takt/runs/20260128-worktree-report/reports',
     });

     const result = buildInstruction(step, context);

     // reportDir is now absolute, pointing to projectCwd
     expect(result).toContain(
-      '- Report: /project/.takt/reports/20260128-worktree-report/00-plan.md'
+      '- Report: /project/.takt/runs/20260128-worktree-report/reports/00-plan.md'
     );
     expect(result).toContain('Working Directory: /clone/my-task');
   });
@@ -164,13 +164,13 @@ describe('instruction-builder', () => {
     const context = createMinimalContext({
       projectCwd: '/project',
       cwd: '/worktree',
-      reportDir: '/project/.takt/reports/20260128-multi',
+      reportDir: '/project/.takt/runs/20260128-multi/reports',
     });

     const result = buildInstruction(step, context);

-    expect(result).toContain('/project/.takt/reports/20260128-multi/01-scope.md');
-    expect(result).toContain('/project/.takt/reports/20260128-multi/02-decisions.md');
+    expect(result).toContain('/project/.takt/runs/20260128-multi/reports/01-scope.md');
+    expect(result).toContain('/project/.takt/runs/20260128-multi/reports/02-decisions.md');
   });

   it('should replace standalone {report_dir} with absolute path', () => {
@ -178,12 +178,108 @@ describe('instruction-builder', () => {
|
||||
'Report dir name: {report_dir}'
|
||||
);
|
||||
const context = createMinimalContext({
|
||||
reportDir: '/project/.takt/reports/20260128-standalone',
|
||||
reportDir: '/project/.takt/runs/20260128-standalone/reports',
|
||||
});
|
||||
|
||||
const result = buildInstruction(step, context);
|
||||
|
||||
expect(result).toContain('Report dir name: /project/.takt/reports/20260128-standalone');
|
||||
expect(result).toContain('Report dir name: /project/.takt/runs/20260128-standalone/reports');
|
||||
});
|
||||
});
|
||||
|
||||
describe('context length control and source path injection', () => {
it('should truncate previous response and inject source path with conflict notice', () => {
const step = createMinimalStep('Continue work');
step.passPreviousResponse = true;
const longResponse = 'x'.repeat(2100);
const context = createMinimalContext({
previousOutput: {
persona: 'coder',
status: 'done',
content: longResponse,
timestamp: new Date(),
},
previousResponseSourcePath: '.takt/runs/test/context/previous_responses/latest.md',
});

const result = buildInstruction(step, context);

expect(result).toContain('...TRUNCATED...');
expect(result).toContain('Source: .takt/runs/test/context/previous_responses/latest.md');
expect(result).toContain('If prompt content conflicts with source files, source files take precedence.');
});

it('should always inject source paths when content is not truncated', () => {
const step = createMinimalStep('Do work');
step.passPreviousResponse = true;
const context = createMinimalContext({
previousOutput: {
persona: 'reviewer',
status: 'done',
content: 'short previous response',
timestamp: new Date(),
},
previousResponseSourcePath: '.takt/runs/test/context/previous_responses/latest.md',
knowledgeContents: ['short knowledge'],
knowledgeSourcePath: '.takt/runs/test/context/knowledge/implement.1.20260210T010203Z.md',
policyContents: ['short policy'],
policySourcePath: '.takt/runs/test/context/policy/implement.1.20260210T010203Z.md',
});

const result = buildInstruction(step, context);

expect(result).toContain('Knowledge Source: .takt/runs/test/context/knowledge/implement.1.20260210T010203Z.md');
expect(result).toContain('Policy Source: .takt/runs/test/context/policy/implement.1.20260210T010203Z.md');
expect(result).toContain('Source: .takt/runs/test/context/previous_responses/latest.md');
expect(result).not.toContain('...TRUNCATED...');
expect(result).not.toContain('Knowledge is truncated.');
expect(result).not.toContain('Policy is authoritative. If truncated');
expect(result).not.toContain('Previous Response is truncated.');
});

it('should not truncate when content length is exactly 2000 chars', () => {
const step = createMinimalStep('Do work');
step.passPreviousResponse = true;
const exactBoundary = 'x'.repeat(2000);
const context = createMinimalContext({
previousOutput: {
persona: 'reviewer',
status: 'done',
content: exactBoundary,
timestamp: new Date(),
},
previousResponseSourcePath: '.takt/runs/test/context/previous_responses/latest.md',
knowledgeContents: [exactBoundary],
knowledgeSourcePath: '.takt/runs/test/context/knowledge/implement.1.20260210T010203Z.md',
policyContents: [exactBoundary],
policySourcePath: '.takt/runs/test/context/policy/implement.1.20260210T010203Z.md',
});

const result = buildInstruction(step, context);

expect(result).toContain('Knowledge Source: .takt/runs/test/context/knowledge/implement.1.20260210T010203Z.md');
expect(result).toContain('Policy Source: .takt/runs/test/context/policy/implement.1.20260210T010203Z.md');
expect(result).toContain('Source: .takt/runs/test/context/previous_responses/latest.md');
expect(result).not.toContain('...TRUNCATED...');
});

it('should inject required truncated warning and source path for knowledge/policy', () => {
const step = createMinimalStep('Do work');
const longKnowledge = 'k'.repeat(2200);
const longPolicy = 'p'.repeat(2200);
const context = createMinimalContext({
knowledgeContents: [longKnowledge],
knowledgeSourcePath: '.takt/runs/test/context/knowledge/implement.1.20260210T010203Z.md',
policyContents: [longPolicy],
policySourcePath: '.takt/runs/test/context/policy/implement.1.20260210T010203Z.md',
});

const result = buildInstruction(step, context);

expect(result).toContain('Knowledge is truncated. You MUST consult the source files before making decisions.');
expect(result).toContain('Policy is authoritative. If truncated, you MUST read the full policy file and follow it strictly.');
expect(result).toContain('Knowledge Source: .takt/runs/test/context/knowledge/implement.1.20260210T010203Z.md');
expect(result).toContain('Policy Source: .takt/runs/test/context/policy/implement.1.20260210T010203Z.md');
});
});

@@ -380,7 +476,7 @@ describe('instruction-builder', () => {
step.name = 'plan';
step.outputContracts = [{ name: '00-plan.md' }];
const context = createMinimalContext({
reportDir: '/project/.takt/reports/20260129-test',
reportDir: '/project/.takt/runs/20260129-test/reports',
language: 'en',
});

@@ -399,7 +495,7 @@ describe('instruction-builder', () => {
{ label: 'Decisions', path: '02-decisions.md' },
];
const context = createMinimalContext({
reportDir: '/project/.takt/reports/20260129-test',
reportDir: '/project/.takt/runs/20260129-test/reports',
language: 'en',
});

@@ -414,7 +510,7 @@ describe('instruction-builder', () => {
const step = createMinimalStep('Do work');
step.outputContracts = [{ name: '00-plan.md' }];
const context = createMinimalContext({
reportDir: '/project/.takt/reports/20260129-test',
reportDir: '/project/.takt/runs/20260129-test/reports',
language: 'en',
});

@@ -559,7 +655,7 @@ describe('instruction-builder', () => {
const step = createMinimalStep('Do work');
step.outputContracts = [{ name: '00-plan.md' }];
const context = createMinimalContext({
reportDir: '/project/.takt/reports/20260129-test',
reportDir: '/project/.takt/runs/20260129-test/reports',
language: 'en',
});

@@ -579,7 +675,7 @@ describe('instruction-builder', () => {
const step = createMinimalStep('Do work');
step.outputContracts = [{ name: '00-plan.md', format: '**Format:**\n# Plan' }];
const context = createMinimalContext({
reportDir: '/project/.takt/reports/20260129-test',
reportDir: '/project/.takt/runs/20260129-test/reports',
language: 'en',
});

@@ -595,7 +691,7 @@ describe('instruction-builder', () => {
order: 'Custom order instruction',
}];
const context = createMinimalContext({
reportDir: '/project/.takt/reports/20260129-test',
reportDir: '/project/.takt/runs/20260129-test/reports',
language: 'en',
});

@@ -607,13 +703,13 @@ describe('instruction-builder', () => {
it('should still replace {report:filename} in instruction_template', () => {
const step = createMinimalStep('Write to {report:00-plan.md}');
const context = createMinimalContext({
reportDir: '/project/.takt/reports/20260129-test',
reportDir: '/project/.takt/runs/20260129-test/reports',
language: 'en',
});

const result = buildInstruction(step, context);

expect(result).toContain('Write to /project/.takt/reports/20260129-test/00-plan.md');
expect(result).toContain('Write to /project/.takt/runs/20260129-test/reports/00-plan.md');
expect(result).not.toContain('{report:00-plan.md}');
});
});
@@ -622,7 +718,7 @@ describe('instruction-builder', () => {
function createReportContext(overrides: Partial<ReportInstructionContext> = {}): ReportInstructionContext {
return {
cwd: '/project',
reportDir: '/project/.takt/reports/20260129-test',
reportDir: '/project/.takt/runs/20260129-test/reports',
movementIteration: 1,
language: 'en',
...overrides,
@@ -663,12 +759,12 @@ describe('instruction-builder', () => {
it('should include report directory and file for string report', () => {
const step = createMinimalStep('Do work');
step.outputContracts = [{ name: '00-plan.md' }];
const ctx = createReportContext({ reportDir: '/project/.takt/reports/20260130-test' });
const ctx = createReportContext({ reportDir: '/project/.takt/runs/20260130-test/reports' });

const result = buildReportInstruction(step, ctx);

expect(result).toContain('- Report Directory: /project/.takt/reports/20260130-test/');
expect(result).toContain('- Report File: /project/.takt/reports/20260130-test/00-plan.md');
expect(result).toContain('- Report Directory: /project/.takt/runs/20260130-test/reports/');
expect(result).toContain('- Report File: /project/.takt/runs/20260130-test/reports/00-plan.md');
});

it('should include report files for OutputContractEntry[] report', () => {
@@ -681,10 +777,10 @@ describe('instruction-builder', () => {

const result = buildReportInstruction(step, ctx);

expect(result).toContain('- Report Directory: /project/.takt/reports/20260129-test/');
expect(result).toContain('- Report Directory: /project/.takt/runs/20260129-test/reports/');
expect(result).toContain('- Report Files:');
expect(result).toContain(' - Scope: /project/.takt/reports/20260129-test/01-scope.md');
expect(result).toContain(' - Decisions: /project/.takt/reports/20260129-test/02-decisions.md');
expect(result).toContain(' - Scope: /project/.takt/runs/20260129-test/reports/01-scope.md');
expect(result).toContain(' - Decisions: /project/.takt/runs/20260129-test/reports/02-decisions.md');
});

it('should include report file for OutputContractItem report', () => {
@@ -694,7 +790,7 @@ describe('instruction-builder', () => {

const result = buildReportInstruction(step, ctx);

expect(result).toContain('- Report File: /project/.takt/reports/20260129-test/00-plan.md');
expect(result).toContain('- Report File: /project/.takt/runs/20260129-test/reports/00-plan.md');
});

it('should include auto-generated report output instruction', () => {
@@ -719,7 +815,7 @@ describe('instruction-builder', () => {

const result = buildReportInstruction(step, ctx);

expect(result).toContain('Output to /project/.takt/reports/20260129-test/00-plan.md file.');
expect(result).toContain('Output to /project/.takt/runs/20260129-test/reports/00-plan.md file.');
expect(result).not.toContain('**Report output:**');
});

@@ -895,6 +991,24 @@ describe('instruction-builder', () => {
expect(result).toContain('## Feedback\nReview feedback here');
});

it('should apply truncation and source path when {previous_response} placeholder is used', () => {
const step = createMinimalStep('## Feedback\n{previous_response}\n\nFix the issues.');
step.passPreviousResponse = true;
const context = createMinimalContext({
previousOutput: { content: 'x'.repeat(2100), tag: '[TEST:1]' },
previousResponseSourcePath: '.takt/runs/test/context/previous_responses/latest.md',
language: 'en',
});

const result = buildInstruction(step, context);

expect(result).not.toContain('## Previous Response\n');
expect(result).toContain('## Feedback');
expect(result).toContain('...TRUNCATED...');
expect(result).toContain('Source: .takt/runs/test/context/previous_responses/latest.md');
expect(result).toContain('If prompt content conflicts with source files, source files take precedence.');
});

it('should skip auto-injected Additional User Inputs when template contains {user_inputs}', () => {
const step = createMinimalStep('Inputs: {user_inputs}');
const context = createMinimalContext({

@@ -203,11 +203,11 @@ describe('Instruction Builder IT: report_dir expansion', () => {
const step = makeMovement({
instructionTemplate: 'Read the plan from {report_dir}/00-plan.md',
});
const ctx = makeContext({ reportDir: '/tmp/test-project/.takt/reports/20250126-task' });
const ctx = makeContext({ reportDir: '/tmp/test-project/.takt/runs/20250126-task/reports' });

const result = buildInstruction(step, ctx);

expect(result).toContain('Read the plan from /tmp/test-project/.takt/reports/20250126-task/00-plan.md');
expect(result).toContain('Read the plan from /tmp/test-project/.takt/runs/20250126-task/reports/00-plan.md');
});

it('should replace {report:filename} with full path', () => {
@@ -289,13 +289,13 @@ describe('Instruction Builder IT: buildReportInstruction', () => {

const result = buildReportInstruction(step, {
cwd: '/tmp/test',
reportDir: '/tmp/test/.takt/reports/test-dir',
reportDir: '/tmp/test/.takt/runs/test-dir/reports',
movementIteration: 1,
language: 'en',
});

expect(result).toContain('00-plan.md');
expect(result).toContain('/tmp/test/.takt/reports/test-dir');
expect(result).toContain('/tmp/test/.takt/runs/test-dir/reports');
expect(result).toContain('report');
});

@@ -117,6 +117,8 @@ vi.mock('../infra/config/index.js', () => ({
updateWorktreeSession: vi.fn(),
loadGlobalConfig: mockLoadGlobalConfig,
saveSessionState: vi.fn(),
ensureDir: vi.fn(),
writeFileAtomic: vi.fn(),
}));

vi.mock('../shared/context.js', () => ({
@@ -148,23 +150,30 @@ vi.mock('../infra/fs/index.js', () => ({
status: _status,
endTime: new Date().toISOString(),
})),
updateLatestPointer: vi.fn(),
initNdjsonLog: vi.fn().mockReturnValue('/tmp/test-log.jsonl'),
appendNdjsonLine: vi.fn(),
}));

vi.mock('../shared/utils/index.js', () => ({
createLogger: vi.fn().mockReturnValue({
debug: vi.fn(),
info: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
}),
notifySuccess: mockNotifySuccess,
notifyError: mockNotifyError,
playWarningSound: mockPlayWarningSound,
preventSleep: vi.fn(),
}));
vi.mock('../shared/utils/index.js', async (importOriginal) => {
const original = await importOriginal<typeof import('../shared/utils/index.js')>();
return {
...original,
createLogger: vi.fn().mockReturnValue({
debug: vi.fn(),
info: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
}),
notifySuccess: mockNotifySuccess,
notifyError: mockNotifyError,
playWarningSound: mockPlayWarningSound,
preventSleep: vi.fn(),
isDebugEnabled: vi.fn().mockReturnValue(false),
writePromptLog: vi.fn(),
generateReportDir: vi.fn().mockReturnValue('test-report-dir'),
isValidReportDirName: vi.fn().mockImplementation((value: string) => /^[a-z0-9]+(?:-[a-z0-9]+)*$/.test(value)),
};
});

vi.mock('../shared/prompt/index.js', () => ({
selectOption: mockSelectOption,

@@ -96,7 +96,6 @@ vi.mock('../shared/utils/index.js', async (importOriginal) => ({
iterations: 0,
}),
finalizeSessionLog: vi.fn().mockImplementation((log, status) => ({ ...log, status })),
updateLatestPointer: vi.fn(),
initNdjsonLog: vi.fn().mockReturnValue('/tmp/test.ndjson'),
appendNdjsonLine: vi.fn(),
generateReportDir: vi.fn().mockReturnValue('test-report-dir'),

@@ -78,7 +78,6 @@ vi.mock('../shared/utils/index.js', async (importOriginal) => ({
iterations: 0,
}),
finalizeSessionLog: vi.fn().mockImplementation((log, status) => ({ ...log, status })),
updateLatestPointer: vi.fn(),
initNdjsonLog: vi.fn().mockReturnValue('/tmp/test.ndjson'),
appendNdjsonLine: vi.fn(),
generateReportDir: vi.fn().mockReturnValue('test-report-dir'),
@@ -139,8 +138,8 @@ import { executePipeline } from '../features/pipeline/index.js';
function createTestPieceDir(): { dir: string; piecePath: string } {
const dir = mkdtempSync(join(tmpdir(), 'takt-it-pipeline-'));

// Create .takt/reports structure
mkdirSync(join(dir, '.takt', 'reports', 'test-report-dir'), { recursive: true });
// Create .takt/runs structure
mkdirSync(join(dir, '.takt', 'runs', 'test-report-dir', 'reports'), { recursive: true });

// Create persona prompt files
const personasDir = join(dir, 'personas');

@@ -83,6 +83,8 @@ vi.mock('../infra/config/index.js', () => ({
updateWorktreeSession: vi.fn(),
loadGlobalConfig: vi.fn().mockReturnValue({ provider: 'claude' }),
saveSessionState: vi.fn(),
ensureDir: vi.fn(),
writeFileAtomic: vi.fn(),
}));

vi.mock('../shared/context.js', () => ({
@@ -114,25 +116,30 @@ vi.mock('../infra/fs/index.js', () => ({
status: _status,
endTime: new Date().toISOString(),
})),
updateLatestPointer: vi.fn(),
initNdjsonLog: vi.fn().mockReturnValue('/tmp/test-log.jsonl'),
appendNdjsonLine: vi.fn(),
}));

vi.mock('../shared/utils/index.js', () => ({
createLogger: vi.fn().mockReturnValue({
debug: vi.fn(),
info: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
}),
notifySuccess: vi.fn(),
notifyError: vi.fn(),
playWarningSound: vi.fn(),
preventSleep: vi.fn(),
isDebugEnabled: vi.fn().mockReturnValue(false),
writePromptLog: vi.fn(),
}));
vi.mock('../shared/utils/index.js', async (importOriginal) => {
const original = await importOriginal<typeof import('../shared/utils/index.js')>();
return {
...original,
createLogger: vi.fn().mockReturnValue({
debug: vi.fn(),
info: vi.fn(),
warn: vi.fn(),
error: vi.fn(),
}),
notifySuccess: vi.fn(),
notifyError: vi.fn(),
playWarningSound: vi.fn(),
preventSleep: vi.fn(),
isDebugEnabled: vi.fn().mockReturnValue(false),
writePromptLog: vi.fn(),
generateReportDir: vi.fn().mockReturnValue('test-report-dir'),
isValidReportDirName: vi.fn().mockImplementation((value: string) => /^[a-z0-9]+(?:-[a-z0-9]+)*$/.test(value)),
};
});

vi.mock('../shared/prompt/index.js', () => ({
selectOption: vi.fn(),

@@ -2,7 +2,7 @@
* Integration test for stageAndCommit
*
* Tests that gitignored files are NOT included in commits.
* Regression test for c89ac4c where `git add -f .takt/reports/` caused
* Regression test for c89ac4c where `git add -f .takt/runs/` caused
* gitignored report files to be committed.
*/

@@ -36,15 +36,15 @@ describe('stageAndCommit', () => {
}
});

it('should not commit gitignored .takt/reports/ files', () => {
it('should not commit gitignored .takt/runs/ files', () => {
// Setup: .takt/ is gitignored
writeFileSync(join(testDir, '.gitignore'), '.takt/\n');
execFileSync('git', ['add', '.gitignore'], { cwd: testDir });
execFileSync('git', ['commit', '-m', 'Add gitignore'], { cwd: testDir });

// Create .takt/reports/ with a report file
mkdirSync(join(testDir, '.takt', 'reports', 'test-report'), { recursive: true });
writeFileSync(join(testDir, '.takt', 'reports', 'test-report', '00-plan.md'), '# Plan');
// Create .takt/runs/ with a report file
mkdirSync(join(testDir, '.takt', 'runs', 'test-report', 'reports'), { recursive: true });
writeFileSync(join(testDir, '.takt', 'runs', 'test-report', 'reports', '00-plan.md'), '# Plan');

// Also create a tracked file change to ensure commit happens
writeFileSync(join(testDir, 'src.ts'), 'export const x = 1;');
@@ -52,7 +52,7 @@ describe('stageAndCommit', () => {
const hash = stageAndCommit(testDir, 'test commit');
expect(hash).toBeDefined();

// Verify .takt/reports/ is NOT in the commit
// Verify .takt/runs/ is NOT in the commit
const committedFiles = execFileSync('git', ['diff-tree', '--no-commit-id', '-r', '--name-only', 'HEAD'], {
cwd: testDir,
encoding: 'utf-8',
@@ -60,7 +60,7 @@ describe('stageAndCommit', () => {
}).trim();

expect(committedFiles).toContain('src.ts');
expect(committedFiles).not.toContain('.takt/reports/');
expect(committedFiles).not.toContain('.takt/runs/');
});

it('should commit normally when no gitignored files exist', () => {

@@ -18,6 +18,9 @@ const { mockIsDebugEnabled, mockWritePromptLog, MockPieceEngine } = vi.hoisted((

constructor(config: PieceConfig, _cwd: string, task: string, _options: unknown) {
super();
if (task === 'constructor-throw-task') {
throw new Error('mock constructor failure');
}
this.config = config;
this.task = task;
}
@@ -27,6 +30,7 @@ const { mockIsDebugEnabled, mockWritePromptLog, MockPieceEngine } = vi.hoisted((
async run(): Promise<{ status: string; iteration: number }> {
const step = this.config.movements[0]!;
const timestamp = new Date('2026-02-07T00:00:00.000Z');
const shouldAbort = this.task === 'abort-task';

const shouldRepeatMovement = this.task === 'repeat-movement-task';
this.emit('movement:start', step, 1, 'movement instruction');
@@ -57,8 +61,11 @@ const { mockIsDebugEnabled, mockWritePromptLog, MockPieceEngine } = vi.hoisted((
'movement instruction repeat'
);
}
if (shouldAbort) {
this.emit('piece:abort', { status: 'aborted', iteration: 1 }, 'user_interrupted');
return { status: 'aborted', iteration: shouldRepeatMovement ? 2 : 1 };
}
this.emit('piece:complete', { status: 'completed', iteration: 1 });

return { status: 'completed', iteration: shouldRepeatMovement ? 2 : 1 };
}
}
@@ -83,6 +90,8 @@ vi.mock('../infra/config/index.js', () => ({
updateWorktreeSession: vi.fn(),
loadGlobalConfig: vi.fn().mockReturnValue({ provider: 'claude' }),
saveSessionState: vi.fn(),
ensureDir: vi.fn(),
writeFileAtomic: vi.fn(),
}));

vi.mock('../shared/context.js', () => ({
@@ -114,7 +123,6 @@ vi.mock('../infra/fs/index.js', () => ({
status,
endTime: new Date().toISOString(),
})),
updateLatestPointer: vi.fn(),
initNdjsonLog: vi.fn().mockReturnValue('/tmp/test-log.jsonl'),
appendNdjsonLine: vi.fn(),
}));
@@ -131,6 +139,8 @@ vi.mock('../shared/utils/index.js', () => ({
preventSleep: vi.fn(),
isDebugEnabled: mockIsDebugEnabled,
writePromptLog: mockWritePromptLog,
generateReportDir: vi.fn().mockReturnValue('test-report-dir'),
isValidReportDirName: vi.fn().mockImplementation((value: string) => /^[a-z0-9]+(?:-[a-z0-9]+)*$/.test(value)),
}));

vi.mock('../shared/prompt/index.js', () => ({
@@ -147,6 +157,7 @@ vi.mock('../shared/exitCodes.js', () => ({
}));

import { executePiece } from '../features/tasks/execute/pieceExecution.js';
import { ensureDir, writeFileAtomic } from '../infra/config/index.js';

describe('executePiece debug prompts logging', () => {
beforeEach(() => {
@@ -232,4 +243,69 @@ describe('executePiece debug prompts logging', () => {
})
).rejects.toThrow('taskPrefix and taskColorIndex must be provided together');
});

it('should fail fast for invalid reportDirName before run directory writes', async () => {
await expect(
executePiece(makeConfig(), 'task', '/tmp/project', {
projectCwd: '/tmp/project',
reportDirName: '..',
})
).rejects.toThrow('Invalid reportDirName: ..');

expect(vi.mocked(ensureDir)).not.toHaveBeenCalled();
expect(vi.mocked(writeFileAtomic)).not.toHaveBeenCalled();
});

it('should update meta status from running to completed', async () => {
await executePiece(makeConfig(), 'task', '/tmp/project', {
projectCwd: '/tmp/project',
reportDirName: 'test-report-dir',
});

const calls = vi.mocked(writeFileAtomic).mock.calls;
expect(calls).toHaveLength(2);

const firstMeta = JSON.parse(String(calls[0]![1])) as { status: string; endTime?: string };
const secondMeta = JSON.parse(String(calls[1]![1])) as { status: string; endTime?: string };
expect(firstMeta.status).toBe('running');
expect(firstMeta.endTime).toBeUndefined();
expect(secondMeta.status).toBe('completed');
expect(secondMeta.endTime).toMatch(/^\d{4}-\d{2}-\d{2}T/);
});

it('should update meta status from running to aborted', async () => {
await executePiece(makeConfig(), 'abort-task', '/tmp/project', {
projectCwd: '/tmp/project',
reportDirName: 'test-report-dir',
});

const calls = vi.mocked(writeFileAtomic).mock.calls;
expect(calls).toHaveLength(2);

const firstMeta = JSON.parse(String(calls[0]![1])) as { status: string; endTime?: string };
const secondMeta = JSON.parse(String(calls[1]![1])) as { status: string; endTime?: string };
expect(firstMeta.status).toBe('running');
expect(firstMeta.endTime).toBeUndefined();
expect(secondMeta.status).toBe('aborted');
expect(secondMeta.endTime).toMatch(/^\d{4}-\d{2}-\d{2}T/);
});

it('should finalize meta as aborted when PieceEngine constructor throws', async () => {
await expect(
executePiece(makeConfig(), 'constructor-throw-task', '/tmp/project', {
projectCwd: '/tmp/project',
reportDirName: 'test-report-dir',
})
).rejects.toThrow('mock constructor failure');

const calls = vi.mocked(writeFileAtomic).mock.calls;
expect(calls).toHaveLength(2);

const firstMeta = JSON.parse(String(calls[0]![1])) as { status: string; endTime?: string };
const secondMeta = JSON.parse(String(calls[1]![1])) as { status: string; endTime?: string };
expect(firstMeta.status).toBe('running');
expect(firstMeta.endTime).toBeUndefined();
expect(secondMeta.status).toBe('aborted');
expect(secondMeta.endTime).toMatch(/^\d{4}-\d{2}-\d{2}T/);
});
});

19
src/__tests__/run-paths.test.ts
Normal file
@@ -0,0 +1,19 @@
import { describe, it, expect } from 'vitest';
import { buildRunPaths } from '../core/piece/run/run-paths.js';

describe('buildRunPaths', () => {
it('should build run-scoped relative and absolute paths', () => {
const paths = buildRunPaths('/tmp/project', '20260210-demo-task');

expect(paths.runRootRel).toBe('.takt/runs/20260210-demo-task');
expect(paths.reportsRel).toBe('.takt/runs/20260210-demo-task/reports');
expect(paths.contextKnowledgeRel).toBe('.takt/runs/20260210-demo-task/context/knowledge');
expect(paths.contextPolicyRel).toBe('.takt/runs/20260210-demo-task/context/policy');
expect(paths.contextPreviousResponsesRel).toBe('.takt/runs/20260210-demo-task/context/previous_responses');
expect(paths.logsRel).toBe('.takt/runs/20260210-demo-task/logs');
expect(paths.metaRel).toBe('.takt/runs/20260210-demo-task/meta.json');

expect(paths.reportsAbs).toBe('/tmp/project/.takt/runs/20260210-demo-task/reports');
expect(paths.metaAbs).toBe('/tmp/project/.takt/runs/20260210-demo-task/meta.json');
});
});
@@ -7,14 +7,11 @@ import { existsSync, readFileSync, mkdirSync, rmSync, writeFileSync } from 'node
import { join } from 'node:path';
import { tmpdir } from 'node:os';
import {
createSessionLog,
updateLatestPointer,
initNdjsonLog,
appendNdjsonLine,
loadNdjsonLog,
loadSessionLog,
extractFailureInfo,
type LatestLogPointer,
type SessionLog,
type NdjsonRecord,
type NdjsonStepComplete,
@@ -26,121 +23,18 @@ import {
type NdjsonInteractiveEnd,
} from '../infra/fs/session.js';

/** Create a temp project directory with .takt/logs structure */
/** Create a temp project directory for each test */
function createTempProject(): string {
const dir = join(tmpdir(), `takt-test-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`);
mkdirSync(dir, { recursive: true });
return dir;
}

describe('updateLatestPointer', () => {
let projectDir: string;

beforeEach(() => {
projectDir = createTempProject();
});

afterEach(() => {
rmSync(projectDir, { recursive: true, force: true });
});

it('should create latest.json with pointer data', () => {
const log = createSessionLog('my task', projectDir, 'default');
const sessionId = 'abc-123';

updateLatestPointer(log, sessionId, projectDir);

const latestPath = join(projectDir, '.takt', 'logs', 'latest.json');
expect(existsSync(latestPath)).toBe(true);

const pointer = JSON.parse(readFileSync(latestPath, 'utf-8')) as LatestLogPointer;
expect(pointer.sessionId).toBe('abc-123');
expect(pointer.logFile).toBe('abc-123.jsonl');
expect(pointer.task).toBe('my task');
expect(pointer.pieceName).toBe('default');
expect(pointer.status).toBe('running');
expect(pointer.iterations).toBe(0);
expect(pointer.startTime).toBeDefined();
expect(pointer.updatedAt).toBeDefined();
});

it('should not create previous.json when copyToPrevious is false', () => {
const log = createSessionLog('task', projectDir, 'wf');
updateLatestPointer(log, 'sid-1', projectDir);

const previousPath = join(projectDir, '.takt', 'logs', 'previous.json');
expect(existsSync(previousPath)).toBe(false);
});

it('should not create previous.json when copyToPrevious is true but latest.json does not exist', () => {
const log = createSessionLog('task', projectDir, 'wf');
updateLatestPointer(log, 'sid-1', projectDir, { copyToPrevious: true });

const previousPath = join(projectDir, '.takt', 'logs', 'previous.json');
// latest.json didn't exist before this call, so previous.json should not be created
expect(existsSync(previousPath)).toBe(false);
});

it('should copy latest.json to previous.json when copyToPrevious is true and latest exists', () => {
const log1 = createSessionLog('first task', projectDir, 'wf1');
updateLatestPointer(log1, 'sid-first', projectDir);

// Simulate a second piece starting
const log2 = createSessionLog('second task', projectDir, 'wf2');
updateLatestPointer(log2, 'sid-second', projectDir, { copyToPrevious: true });

const logsDir = join(projectDir, '.takt', 'logs');
const latest = JSON.parse(readFileSync(join(logsDir, 'latest.json'), 'utf-8')) as LatestLogPointer;
const previous = JSON.parse(readFileSync(join(logsDir, 'previous.json'), 'utf-8')) as LatestLogPointer;

// latest should point to second session
expect(latest.sessionId).toBe('sid-second');
expect(latest.task).toBe('second task');

// previous should point to first session
expect(previous.sessionId).toBe('sid-first');
expect(previous.task).toBe('first task');
});

it('should not update previous.json on step-complete calls (no copyToPrevious)', () => {
// Piece 1 creates latest
const log1 = createSessionLog('first', projectDir, 'wf');
|
||||
updateLatestPointer(log1, 'sid-1', projectDir);
|
||||
|
||||
// Piece 2 starts → copies latest to previous
|
||||
const log2 = createSessionLog('second', projectDir, 'wf');
|
||||
updateLatestPointer(log2, 'sid-2', projectDir, { copyToPrevious: true });
|
||||
|
||||
// Step completes → updates only latest (no copyToPrevious)
|
||||
log2.iterations = 1;
|
||||
updateLatestPointer(log2, 'sid-2', projectDir);
|
||||
|
||||
const logsDir = join(projectDir, '.takt', 'logs');
|
||||
const previous = JSON.parse(readFileSync(join(logsDir, 'previous.json'), 'utf-8')) as LatestLogPointer;
|
||||
|
||||
// previous should still point to first session
|
||||
expect(previous.sessionId).toBe('sid-1');
|
||||
});
|
||||
|
||||
it('should update iterations and status in latest.json on subsequent calls', () => {
|
||||
const log = createSessionLog('task', projectDir, 'wf');
|
||||
updateLatestPointer(log, 'sid-1', projectDir, { copyToPrevious: true });
|
||||
|
||||
// Simulate step completion
|
||||
log.iterations = 2;
|
||||
updateLatestPointer(log, 'sid-1', projectDir);
|
||||
|
||||
// Simulate piece completion
|
||||
log.status = 'completed';
|
||||
log.iterations = 3;
|
||||
updateLatestPointer(log, 'sid-1', projectDir);
|
||||
|
||||
const latestPath = join(projectDir, '.takt', 'logs', 'latest.json');
|
||||
const pointer = JSON.parse(readFileSync(latestPath, 'utf-8')) as LatestLogPointer;
|
||||
expect(pointer.status).toBe('completed');
|
||||
expect(pointer.iterations).toBe(3);
|
||||
});
|
||||
});
|
||||
function initTestNdjsonLog(sessionId: string, task: string, pieceName: string, projectDir: string): string {
|
||||
const logsDir = join(projectDir, '.takt', 'runs', 'test-run', 'logs');
|
||||
mkdirSync(logsDir, { recursive: true });
|
||||
return initNdjsonLog(sessionId, task, pieceName, { logsDir });
|
||||
}
|
||||
|
||||
describe('NDJSON log', () => {
|
||||
let projectDir: string;
|
||||
@ -155,7 +49,7 @@ describe('NDJSON log', () => {
|
||||
|
||||
describe('initNdjsonLog', () => {
|
||||
it('should create a .jsonl file with piece_start record', () => {
|
||||
const filepath = initNdjsonLog('sess-001', 'my task', 'default', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-001', 'my task', 'default', projectDir);
|
||||
|
||||
expect(filepath).toContain('sess-001.jsonl');
|
||||
expect(existsSync(filepath)).toBe(true);
|
||||
@ -176,7 +70,7 @@ describe('NDJSON log', () => {
|
||||
|
||||
describe('appendNdjsonLine', () => {
|
||||
it('should append records as individual lines', () => {
|
||||
const filepath = initNdjsonLog('sess-002', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-002', 'task', 'wf', projectDir);
|
||||
|
||||
const stepStart: NdjsonRecord = {
|
||||
type: 'step_start',
|
||||
@ -224,7 +118,7 @@ describe('NDJSON log', () => {
|
||||
|
||||
describe('loadNdjsonLog', () => {
|
||||
it('should reconstruct SessionLog from NDJSON file', () => {
|
||||
const filepath = initNdjsonLog('sess-003', 'build app', 'default', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-003', 'build app', 'default', projectDir);
|
||||
|
||||
// Add step_start + step_complete
|
||||
appendNdjsonLine(filepath, {
|
||||
@ -270,7 +164,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should handle aborted piece', () => {
|
||||
const filepath = initNdjsonLog('sess-004', 'failing task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-004', 'failing task', 'wf', projectDir);
|
||||
|
||||
appendNdjsonLine(filepath, {
|
||||
type: 'step_start',
|
||||
@ -321,7 +215,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should skip step_start records when reconstructing SessionLog', () => {
|
||||
const filepath = initNdjsonLog('sess-005', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-005', 'task', 'wf', projectDir);
|
||||
|
||||
// Add various records
|
||||
appendNdjsonLine(filepath, {
|
||||
@ -358,7 +252,7 @@ describe('NDJSON log', () => {
|
||||
|
||||
describe('loadSessionLog with .jsonl extension', () => {
|
||||
it('should delegate to loadNdjsonLog for .jsonl files', () => {
|
||||
const filepath = initNdjsonLog('sess-006', 'jsonl task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-006', 'jsonl task', 'wf', projectDir);
|
||||
|
||||
appendNdjsonLine(filepath, {
|
||||
type: 'step_complete',
|
||||
@ -406,7 +300,7 @@ describe('NDJSON log', () => {
|
||||
|
||||
describe('appendNdjsonLine real-time characteristics', () => {
|
||||
it('should append without overwriting previous content', () => {
|
||||
const filepath = initNdjsonLog('sess-007', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-007', 'task', 'wf', projectDir);
|
||||
|
||||
// Read after init
|
||||
const after1 = readFileSync(filepath, 'utf-8').trim().split('\n');
|
||||
@ -428,7 +322,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should produce valid JSON on each line', () => {
|
||||
const filepath = initNdjsonLog('sess-008', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-008', 'task', 'wf', projectDir);
|
||||
|
||||
for (let i = 0; i < 5; i++) {
|
||||
appendNdjsonLine(filepath, {
|
||||
@ -453,7 +347,7 @@ describe('NDJSON log', () => {
|
||||
|
||||
describe('phase NDJSON records', () => {
|
||||
it('should serialize and append phase_start records', () => {
|
||||
const filepath = initNdjsonLog('sess-phase-001', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-phase-001', 'task', 'wf', projectDir);
|
||||
|
||||
const record: NdjsonPhaseStart = {
|
||||
type: 'phase_start',
|
||||
@ -480,7 +374,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should serialize and append phase_complete records', () => {
|
||||
const filepath = initNdjsonLog('sess-phase-002', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-phase-002', 'task', 'wf', projectDir);
|
||||
|
||||
const record: NdjsonPhaseComplete = {
|
||||
type: 'phase_complete',
|
||||
@ -509,7 +403,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should serialize phase_complete with error', () => {
|
||||
const filepath = initNdjsonLog('sess-phase-003', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-phase-003', 'task', 'wf', projectDir);
|
||||
|
||||
const record: NdjsonPhaseComplete = {
|
||||
type: 'phase_complete',
|
||||
@ -534,7 +428,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should be skipped by loadNdjsonLog (default case)', () => {
|
||||
const filepath = initNdjsonLog('sess-phase-004', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-phase-004', 'task', 'wf', projectDir);
|
||||
|
||||
// Add phase records
|
||||
appendNdjsonLine(filepath, {
|
||||
@ -577,7 +471,7 @@ describe('NDJSON log', () => {
|
||||
|
||||
describe('interactive NDJSON records', () => {
|
||||
it('should serialize and append interactive_start records', () => {
|
||||
const filepath = initNdjsonLog('sess-interactive-001', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-interactive-001', 'task', 'wf', projectDir);
|
||||
|
||||
const record: NdjsonInteractiveStart = {
|
||||
type: 'interactive_start',
|
||||
@ -597,7 +491,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should serialize and append interactive_end records', () => {
|
||||
const filepath = initNdjsonLog('sess-interactive-002', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-interactive-002', 'task', 'wf', projectDir);
|
||||
|
||||
const record: NdjsonInteractiveEnd = {
|
||||
type: 'interactive_end',
|
||||
@ -620,7 +514,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should be skipped by loadNdjsonLog (default case)', () => {
|
||||
const filepath = initNdjsonLog('sess-interactive-003', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-interactive-003', 'task', 'wf', projectDir);
|
||||
|
||||
appendNdjsonLine(filepath, {
|
||||
type: 'interactive_start',
|
||||
@ -647,7 +541,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should extract failure info from aborted piece log', () => {
|
||||
const filepath = initNdjsonLog('20260205-120000-abc123', 'failing task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('20260205-120000-abc123', 'failing task', 'wf', projectDir);
|
||||
|
||||
// Add step_start for plan
|
||||
appendNdjsonLine(filepath, {
|
||||
@ -696,7 +590,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should handle log with only completed movements (no abort)', () => {
|
||||
const filepath = initNdjsonLog('sess-success-001', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-success-001', 'task', 'wf', projectDir);
|
||||
|
||||
appendNdjsonLine(filepath, {
|
||||
type: 'step_start',
|
||||
@ -731,7 +625,7 @@ describe('NDJSON log', () => {
|
||||
});
|
||||
|
||||
it('should handle log with no step_complete records', () => {
|
||||
const filepath = initNdjsonLog('sess-fail-early-001', 'task', 'wf', projectDir);
|
||||
const filepath = initTestNdjsonLog('sess-fail-early-001', 'task', 'wf', projectDir);
|
||||
|
||||
appendNdjsonLine(filepath, {
|
||||
type: 'step_start',
|
||||
|
||||
@ -233,6 +233,8 @@ export interface PieceState {
  movementOutputs: Map<string, AgentResponse>;
  /** Most recent movement output (used for Previous Response injection) */
  lastOutput?: AgentResponse;
  /** Source path of the latest previous response snapshot */
  previousResponseSourcePath?: string;
  userInputs: string[];
  personaSessions: Map<string, string>;
  /** Per-movement iteration counters (how many times each movement has been executed) */

@ -20,12 +20,14 @@ import { detectMatchedRule } from '../evaluation/index.js';
import { incrementMovementIteration } from './state-manager.js';
import { createLogger } from '../../../shared/utils/index.js';
import type { OptionsBuilder } from './OptionsBuilder.js';
import type { MovementExecutor } from './MovementExecutor.js';
import type { PhaseName } from '../types.js';

const log = createLogger('arpeggio-runner');

export interface ArpeggioRunnerDeps {
  readonly optionsBuilder: OptionsBuilder;
  readonly movementExecutor: MovementExecutor;
  readonly getCwd: () => string;
  readonly getInteractive: () => boolean;
  readonly detectRuleIndex: (content: string, movementName: string) => number;
@ -224,6 +226,12 @@ export class ArpeggioRunner {

    state.movementOutputs.set(step.name, aggregatedResponse);
    state.lastOutput = aggregatedResponse;
    this.deps.movementExecutor.persistPreviousResponseSnapshot(
      state,
      step.name,
      movementIteration,
      aggregatedResponse.content,
    );

    const instruction = `[Arpeggio] ${step.name}: ${batches.length} batches, source=${arpeggioConfig.source}`;

@ -6,7 +6,7 @@
 * Phase 3: Status judgment (no tools, optional)
 */

import { existsSync } from 'node:fs';
import { existsSync, writeFileSync } from 'node:fs';
import { join } from 'node:path';
import type {
  PieceMovement,
@ -23,6 +23,7 @@ import { buildSessionKey } from '../session-key.js';
import { incrementMovementIteration, getPreviousOutput } from './state-manager.js';
import { createLogger } from '../../../shared/utils/index.js';
import type { OptionsBuilder } from './OptionsBuilder.js';
import type { RunPaths } from '../run/run-paths.js';

const log = createLogger('movement-executor');

@ -31,6 +32,7 @@ export interface MovementExecutorDeps {
  readonly getCwd: () => string;
  readonly getProjectCwd: () => string;
  readonly getReportDir: () => string;
  readonly getRunPaths: () => RunPaths;
  readonly getLanguage: () => Language | undefined;
  readonly getInteractive: () => boolean;
  readonly getPieceMovements: () => ReadonlyArray<{ name: string; description?: string }>;
@ -52,6 +54,77 @@ export class MovementExecutor {
    private readonly deps: MovementExecutorDeps,
  ) {}

  private static buildTimestamp(): string {
    return new Date().toISOString().replace(/[-:]/g, '').replace(/\.\d{3}Z$/, 'Z');
  }

  private writeSnapshot(
    content: string,
    directoryRel: string,
    filename: string,
  ): string {
    const absPath = join(this.deps.getCwd(), directoryRel, filename);
    writeFileSync(absPath, content, 'utf-8');
    return `${directoryRel}/$(unknown)`;
  }

  private writeFacetSnapshot(
    facet: 'knowledge' | 'policy',
    movementName: string,
    movementIteration: number,
    contents: string[] | undefined,
  ): { content: string[]; sourcePath: string } | undefined {
    if (!contents || contents.length === 0) return undefined;
    const merged = contents.join('\n\n---\n\n');
    const timestamp = MovementExecutor.buildTimestamp();
    const runPaths = this.deps.getRunPaths();
    const directoryRel = facet === 'knowledge'
      ? runPaths.contextKnowledgeRel
      : runPaths.contextPolicyRel;
    const sourcePath = this.writeSnapshot(
      merged,
      directoryRel,
      `${movementName}.${movementIteration}.${timestamp}.md`,
    );
    return { content: [merged], sourcePath };
  }

  private ensurePreviousResponseSnapshot(
    state: PieceState,
    movementName: string,
    movementIteration: number,
  ): void {
    if (!state.lastOutput || state.previousResponseSourcePath) return;
    const timestamp = MovementExecutor.buildTimestamp();
    const runPaths = this.deps.getRunPaths();
    const fileName = `${movementName}.${movementIteration}.${timestamp}.md`;
    const sourcePath = this.writeSnapshot(
      state.lastOutput.content,
      runPaths.contextPreviousResponsesRel,
      fileName,
    );
    this.writeSnapshot(
      state.lastOutput.content,
      runPaths.contextPreviousResponsesRel,
      'latest.md',
    );
    state.previousResponseSourcePath = sourcePath;
  }

  persistPreviousResponseSnapshot(
    state: PieceState,
    movementName: string,
    movementIteration: number,
    content: string,
  ): void {
    const timestamp = MovementExecutor.buildTimestamp();
    const runPaths = this.deps.getRunPaths();
    const fileName = `${movementName}.${movementIteration}.${timestamp}.md`;
    const sourcePath = this.writeSnapshot(content, runPaths.contextPreviousResponsesRel, fileName);
    this.writeSnapshot(content, runPaths.contextPreviousResponsesRel, 'latest.md');
    state.previousResponseSourcePath = sourcePath;
  }

  /** Build Phase 1 instruction from template */
  buildInstruction(
    step: PieceMovement,
@ -60,6 +133,19 @@ export class MovementExecutor {
    task: string,
    maxIterations: number,
  ): string {
    this.ensurePreviousResponseSnapshot(state, step.name, movementIteration);
    const policySnapshot = this.writeFacetSnapshot(
      'policy',
      step.name,
      movementIteration,
      step.policyContents,
    );
    const knowledgeSnapshot = this.writeFacetSnapshot(
      'knowledge',
      step.name,
      movementIteration,
      step.knowledgeContents,
    );
    const pieceMovements = this.deps.getPieceMovements();
    return new InstructionBuilder(step, {
      task,
@ -78,8 +164,11 @@ export class MovementExecutor {
      pieceName: this.deps.getPieceName(),
      pieceDescription: this.deps.getPieceDescription(),
      retryNote: this.deps.getRetryNote(),
      policyContents: step.policyContents,
      knowledgeContents: step.knowledgeContents,
      policyContents: policySnapshot?.content ?? step.policyContents,
      policySourcePath: policySnapshot?.sourcePath,
      knowledgeContents: knowledgeSnapshot?.content ?? step.knowledgeContents,
      knowledgeSourcePath: knowledgeSnapshot?.sourcePath,
      previousResponseSourcePath: state.previousResponseSourcePath,
    }).build();
  }

@ -144,6 +233,7 @@ export class MovementExecutor {

    state.movementOutputs.set(step.name, response);
    state.lastOutput = response;
    this.persistPreviousResponseSnapshot(state, step.name, movementIteration, response.content);
    this.emitMovementReports(step);
    return { response, instruction };
  }

@ -192,6 +192,12 @@ export class ParallelRunner {

    state.movementOutputs.set(step.name, aggregatedResponse);
    state.lastOutput = aggregatedResponse;
    this.deps.movementExecutor.persistPreviousResponseSnapshot(
      state,
      step.name,
      movementIteration,
      aggregatedResponse.content,
    );
    this.deps.movementExecutor.emitMovementReports(step);
    return { response: aggregatedResponse, instruction: aggregatedInstruction };
  }

@ -8,7 +8,6 @@

import { EventEmitter } from 'node:events';
import { mkdirSync, existsSync } from 'node:fs';
import { join } from 'node:path';
import type {
  PieceConfig,
  PieceState,
@ -32,6 +31,7 @@ import { OptionsBuilder } from './OptionsBuilder.js';
import { MovementExecutor } from './MovementExecutor.js';
import { ParallelRunner } from './ParallelRunner.js';
import { ArpeggioRunner } from './ArpeggioRunner.js';
import { buildRunPaths, type RunPaths } from '../run/run-paths.js';

const log = createLogger('engine');

@ -56,6 +56,7 @@ export class PieceEngine extends EventEmitter {
  private loopDetector: LoopDetector;
  private cycleDetector: CycleDetector;
  private reportDir: string;
  private runPaths: RunPaths;
  private abortRequested = false;

  private readonly optionsBuilder: OptionsBuilder;
@ -83,8 +84,9 @@ export class PieceEngine extends EventEmitter {
      throw new Error(`Invalid reportDirName: ${options.reportDirName}`);
    }
    const reportDirName = options.reportDirName ?? generateReportDir(task);
    this.reportDir = `.takt/reports/${reportDirName}`;
    this.ensureReportDirExists();
    this.runPaths = buildRunPaths(this.cwd, reportDirName);
    this.reportDir = this.runPaths.reportsRel;
    this.ensureRunDirsExist();
    this.validateConfig();
    this.state = createInitialState(config, options);
    this.detectRuleIndex = options.detectRuleIndex ?? (() => {
@ -112,6 +114,7 @@ export class PieceEngine extends EventEmitter {
      getCwd: () => this.cwd,
      getProjectCwd: () => this.projectCwd,
      getReportDir: () => this.reportDir,
      getRunPaths: () => this.runPaths,
      getLanguage: () => this.options.language,
      getInteractive: () => this.options.interactive === true,
      getPieceMovements: () => this.config.movements.map(s => ({ name: s.name, description: s.description })),
@ -147,6 +150,7 @@ export class PieceEngine extends EventEmitter {

    this.arpeggioRunner = new ArpeggioRunner({
      optionsBuilder: this.optionsBuilder,
      movementExecutor: this.movementExecutor,
      getCwd: () => this.cwd,
      getInteractive: () => this.options.interactive === true,
      detectRuleIndex: this.detectRuleIndex,
@ -175,11 +179,21 @@ export class PieceEngine extends EventEmitter {
    }
  }

  /** Ensure report directory exists (in cwd, which is clone dir in worktree mode) */
  private ensureReportDirExists(): void {
    const reportDirPath = join(this.cwd, this.reportDir);
    if (!existsSync(reportDirPath)) {
      mkdirSync(reportDirPath, { recursive: true });
  /** Ensure run directories exist (in cwd, which is clone dir in worktree mode) */
  private ensureRunDirsExist(): void {
    const requiredDirs = [
      this.runPaths.runRootAbs,
      this.runPaths.reportsAbs,
      this.runPaths.contextAbs,
      this.runPaths.contextKnowledgeAbs,
      this.runPaths.contextPolicyAbs,
      this.runPaths.contextPreviousResponsesAbs,
      this.runPaths.logsAbs,
    ];
    for (const dir of requiredDirs) {
      if (!existsSync(dir)) {
        mkdirSync(dir, { recursive: true });
      }
    }
  }

@ -40,6 +40,7 @@ export class StateManager {
    iteration: 0,
    movementOutputs: new Map(),
    lastOutput: undefined,
    previousResponseSourcePath: undefined,
    userInputs,
    personaSessions,
    movementIterations: new Map(),

@ -11,6 +11,72 @@ import { buildEditRule } from './instruction-context.js';
import { escapeTemplateChars, replaceTemplatePlaceholders } from './escape.js';
import { loadTemplate } from '../../../shared/prompts/index.js';

const CONTEXT_MAX_CHARS = 2000;

interface PreparedContextBlock {
  readonly content: string;
  readonly truncated: boolean;
}

function trimContextContent(content: string): PreparedContextBlock {
  if (content.length <= CONTEXT_MAX_CHARS) {
    return { content, truncated: false };
  }
  return {
    content: `${content.slice(0, CONTEXT_MAX_CHARS)}\n...TRUNCATED...`,
    truncated: true,
  };
}

function renderConflictNotice(): string {
  return 'If prompt content conflicts with source files, source files take precedence.';
}

function prepareKnowledgeContent(content: string, sourcePath?: string): string {
  const prepared = trimContextContent(content);
  const lines: string[] = [prepared.content];
  if (prepared.truncated && sourcePath) {
    lines.push(
      '',
      `Knowledge is truncated. You MUST consult the source files before making decisions. Source: ${sourcePath}`,
    );
  }
  if (sourcePath) {
    lines.push('', `Knowledge Source: ${sourcePath}`);
  }
  lines.push('', renderConflictNotice());
  return lines.join('\n');
}

function preparePolicyContent(content: string, sourcePath?: string): string {
  const prepared = trimContextContent(content);
  const lines: string[] = [prepared.content];
  if (prepared.truncated && sourcePath) {
    lines.push(
      '',
      `Policy is authoritative. If truncated, you MUST read the full policy file and follow it strictly. Source: ${sourcePath}`,
    );
  }
  if (sourcePath) {
    lines.push('', `Policy Source: ${sourcePath}`);
  }
  lines.push('', renderConflictNotice());
  return lines.join('\n');
}

function preparePreviousResponseContent(content: string, sourcePath?: string): string {
  const prepared = trimContextContent(content);
  const lines: string[] = [prepared.content];
  if (prepared.truncated && sourcePath) {
    lines.push('', `Previous Response is truncated. Source: ${sourcePath}`);
  }
  if (sourcePath) {
    lines.push('', `Source: ${sourcePath}`);
  }
  lines.push('', renderConflictNotice());
  return lines.join('\n');
}

/**
 * Check if an output contract entry is the item form (OutputContractItem).
 */
@ -72,8 +138,14 @@ export class InstructionBuilder {
      this.context.previousOutput &&
      !hasPreviousResponsePlaceholder
    );
    const previousResponse = hasPreviousResponse && this.context.previousOutput
      ? escapeTemplateChars(this.context.previousOutput.content)
    const previousResponsePrepared = this.step.passPreviousResponse && this.context.previousOutput
      ? preparePreviousResponseContent(
          this.context.previousOutput.content,
          this.context.previousResponseSourcePath,
        )
      : '';
    const previousResponse = hasPreviousResponse
      ? escapeTemplateChars(previousResponsePrepared)
      : '';

    // User Inputs
@ -86,7 +158,10 @@
    const instructions = replaceTemplatePlaceholders(
      this.step.instructionTemplate,
      this.step,
      this.context,
      {
        ...this.context,
        previousResponseText: previousResponsePrepared || undefined,
      },
    );

    // Piece name and description
@ -101,12 +176,18 @@
    // Policy injection (top + bottom reminder per "Lost in the Middle" research)
    const policyContents = this.context.policyContents ?? this.step.policyContents;
    const hasPolicy = !!(policyContents && policyContents.length > 0);
    const policyContent = hasPolicy ? policyContents!.join('\n\n---\n\n') : '';
    const policyJoined = hasPolicy ? policyContents!.join('\n\n---\n\n') : '';
    const policyContent = hasPolicy
      ? preparePolicyContent(policyJoined, this.context.policySourcePath)
      : '';

    // Knowledge injection (domain-specific knowledge, no reminder needed)
    const knowledgeContents = this.context.knowledgeContents ?? this.step.knowledgeContents;
    const hasKnowledge = !!(knowledgeContents && knowledgeContents.length > 0);
    const knowledgeContent = hasKnowledge ? knowledgeContents!.join('\n\n---\n\n') : '';
    const knowledgeJoined = hasKnowledge ? knowledgeContents!.join('\n\n---\n\n') : '';
    const knowledgeContent = hasKnowledge
      ? prepareKnowledgeContent(knowledgeJoined, this.context.knowledgeSourcePath)
      : '';

    // Quality gates injection (AI directives for movement completion)
    const hasQualityGates = !!(this.step.qualityGates && this.step.qualityGates.length > 0);

@ -37,7 +37,12 @@ export function replaceTemplatePlaceholders(

  // Replace {previous_response}
  if (step.passPreviousResponse) {
    if (context.previousOutput) {
    if (context.previousResponseText !== undefined) {
      result = result.replace(
        /\{previous_response\}/g,
        escapeTemplateChars(context.previousResponseText),
      );
    } else if (context.previousOutput) {
      result = result.replace(
        /\{previous_response\}/g,
        escapeTemplateChars(context.previousOutput.content),

@ -26,6 +26,10 @@ export interface InstructionContext {
  userInputs: string[];
  /** Previous movement output if available */
  previousOutput?: AgentResponse;
  /** Source path for previous response snapshot */
  previousResponseSourcePath?: string;
  /** Preprocessed previous response text for template placeholder replacement */
  previousResponseText?: string;
  /** Report directory path */
  reportDir?: string;
  /** Language for metadata rendering. Defaults to 'en'. */
@ -44,8 +48,12 @@
  retryNote?: string;
  /** Resolved policy content strings for injection into instruction */
  policyContents?: string[];
  /** Source path for policy snapshot */
  policySourcePath?: string;
  /** Resolved knowledge content strings for injection into instruction */
  knowledgeContents?: string[];
  /** Source path for knowledge snapshot */
  knowledgeSourcePath?: string;
}

/**

52
src/core/piece/run/run-paths.ts
Normal file
@ -0,0 +1,52 @@
import { join } from 'node:path';

export interface RunPaths {
  readonly slug: string;
  readonly runRootRel: string;
  readonly reportsRel: string;
  readonly contextRel: string;
  readonly contextKnowledgeRel: string;
  readonly contextPolicyRel: string;
  readonly contextPreviousResponsesRel: string;
  readonly logsRel: string;
  readonly metaRel: string;
  readonly runRootAbs: string;
  readonly reportsAbs: string;
  readonly contextAbs: string;
  readonly contextKnowledgeAbs: string;
  readonly contextPolicyAbs: string;
  readonly contextPreviousResponsesAbs: string;
  readonly logsAbs: string;
  readonly metaAbs: string;
}

export function buildRunPaths(cwd: string, slug: string): RunPaths {
  const runRootRel = `.takt/runs/${slug}`;
  const reportsRel = `${runRootRel}/reports`;
  const contextRel = `${runRootRel}/context`;
  const contextKnowledgeRel = `${contextRel}/knowledge`;
  const contextPolicyRel = `${contextRel}/policy`;
  const contextPreviousResponsesRel = `${contextRel}/previous_responses`;
  const logsRel = `${runRootRel}/logs`;
  const metaRel = `${runRootRel}/meta.json`;

  return {
    slug,
    runRootRel,
    reportsRel,
    contextRel,
    contextKnowledgeRel,
    contextPolicyRel,
    contextPreviousResponsesRel,
    logsRel,
    metaRel,
    runRootAbs: join(cwd, runRootRel),
    reportsAbs: join(cwd, reportsRel),
    contextAbs: join(cwd, contextRel),
    contextKnowledgeAbs: join(cwd, contextKnowledgeRel),
    contextPolicyAbs: join(cwd, contextPolicyRel),
    contextPreviousResponsesAbs: join(cwd, contextPreviousResponsesRel),
    logsAbs: join(cwd, logsRel),
    metaAbs: join(cwd, metaRel),
  };
}
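
The layout produced by `buildRunPaths` is pure path construction, so it can be sanity-checked in isolation. A minimal sketch (the helper is re-inlined below with only a subset of the fields; the `'/repo'` cwd and the slug value are illustrative, not taken from the commit):

```typescript
import { join } from 'node:path';

// Standalone copy of the buildRunPaths idea (subset of fields), illustrating
// the .takt/runs/{slug} layout that replaces the old .takt/reports layout.
function buildRunPaths(cwd: string, slug: string) {
  const runRootRel = `.takt/runs/${slug}`;
  return {
    slug,
    runRootRel,
    reportsRel: `${runRootRel}/reports`,
    contextRel: `${runRootRel}/context`,
    logsRel: `${runRootRel}/logs`,
    metaRel: `${runRootRel}/meta.json`,
    runRootAbs: join(cwd, runRootRel),
  };
}

// Hypothetical values, for illustration only.
const paths = buildRunPaths('/repo', '20260201-015714-foptng');
console.log(paths.reportsRel); // .takt/runs/20260201-015714-foptng/reports
console.log(paths.metaRel);    // .takt/runs/20260201-015714-foptng/meta.json
```

Since every field is derived from `runRootRel`, a rename of the run root propagates to reports, context, logs, and meta in one place.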
@@ -55,7 +55,7 @@ export async function previewPrompts(cwd: string, pieceIdentifier?: string): Pro
       userInputs: [],
       pieceMovements: config.movements,
       currentMovementIndex: i,
-      reportDir: movement.outputContracts && movement.outputContracts.length > 0 ? '.takt/reports/preview' : undefined,
+      reportDir: movement.outputContracts && movement.outputContracts.length > 0 ? '.takt/runs/preview/reports' : undefined,
       language,
     };

@@ -67,7 +67,7 @@ export async function previewPrompts(cwd: string, pieceIdentifier?: string): Pro
     if (movement.outputContracts && movement.outputContracts.length > 0) {
       const reportBuilder = new ReportInstructionBuilder(movement, {
         cwd,
-        reportDir: '.takt/reports/preview',
+        reportDir: '.takt/runs/preview/reports',
         movementIteration: 1,
         language,
       });
@@ -35,7 +35,6 @@ import {
   generateSessionId,
   createSessionLog,
   finalizeSessionLog,
-  updateLatestPointer,
   initNdjsonLog,
   appendNdjsonLine,
   type NdjsonStepStart,

@@ -55,11 +54,15 @@ import {
   playWarningSound,
   isDebugEnabled,
   writePromptLog,
+  generateReportDir,
+  isValidReportDirName,
 } from '../../../shared/utils/index.js';
 import type { PromptLogRecord } from '../../../shared/utils/index.js';
 import { selectOption, promptInput } from '../../../shared/prompt/index.js';
 import { getLabel } from '../../../shared/i18n/index.js';
 import { installSigIntHandler } from './sigintHandler.js';
+import { buildRunPaths } from '../../../core/piece/run/run-paths.js';
+import { writeFileAtomic, ensureDir } from '../../../infra/config/index.js';

 const log = createLogger('piece');

@@ -78,6 +81,20 @@ interface OutputFns {
   logLine: (text: string) => void;
 }

+interface RunMeta {
+  task: string;
+  piece: string;
+  runSlug: string;
+  runRoot: string;
+  reportDirectory: string;
+  contextDirectory: string;
+  logsDirectory: string;
+  status: 'running' | 'completed' | 'aborted';
+  startTime: string;
+  endTime?: string;
+  iterations?: number;
+}
+
 function assertTaskPrefixPair(
   taskPrefix: string | undefined,
   taskColorIndex: number | undefined
@@ -206,11 +223,42 @@ export async function executePiece(
   out.header(`${headerPrefix} ${pieceConfig.name}`);

   const pieceSessionId = generateSessionId();
+  const runSlug = options.reportDirName ?? generateReportDir(task);
+  if (!isValidReportDirName(runSlug)) {
+    throw new Error(`Invalid reportDirName: ${runSlug}`);
+  }
+  const runPaths = buildRunPaths(cwd, runSlug);
+
+  const runMeta: RunMeta = {
+    task,
+    piece: pieceConfig.name,
+    runSlug: runPaths.slug,
+    runRoot: runPaths.runRootRel,
+    reportDirectory: runPaths.reportsRel,
+    contextDirectory: runPaths.contextRel,
+    logsDirectory: runPaths.logsRel,
+    status: 'running',
+    startTime: new Date().toISOString(),
+  };
+  ensureDir(runPaths.runRootAbs);
+  writeFileAtomic(runPaths.metaAbs, JSON.stringify(runMeta, null, 2));
+  let isMetaFinalized = false;
+  const finalizeRunMeta = (status: 'completed' | 'aborted', iterations?: number): void => {
+    writeFileAtomic(runPaths.metaAbs, JSON.stringify({
+      ...runMeta,
+      status,
+      endTime: new Date().toISOString(),
+      ...(iterations != null ? { iterations } : {}),
+    } satisfies RunMeta, null, 2));
+    isMetaFinalized = true;
+  };

   let sessionLog = createSessionLog(task, projectCwd, pieceConfig.name);

-  // Initialize NDJSON log file + pointer at piece start
-  const ndjsonLogPath = initNdjsonLog(pieceSessionId, task, pieceConfig.name, projectCwd);
-  updateLatestPointer(sessionLog, pieceSessionId, projectCwd, { copyToPrevious: true });
+  // Initialize NDJSON log file at run-scoped logs directory
+  const ndjsonLogPath = initNdjsonLog(pieceSessionId, task, pieceConfig.name, {
+    logsDir: runPaths.logsAbs,
+  });

   // Write interactive mode records if interactive mode was used before this piece
   if (options.interactiveMetadata) {
@@ -330,36 +378,41 @@ export async function executePiece(
         }
       : undefined;

-  const engine = new PieceEngine(pieceConfig, cwd, task, {
-    abortSignal: options.abortSignal,
-    onStream: streamHandler,
-    onUserInput,
-    initialSessions: savedSessions,
-    onSessionUpdate: sessionUpdateHandler,
-    onIterationLimit: iterationLimitHandler,
-    projectCwd,
-    language: options.language,
-    provider: options.provider,
-    model: options.model,
-    personaProviders: options.personaProviders,
-    interactive: interactiveUserInput,
-    detectRuleIndex,
-    callAiJudge,
-    startMovement: options.startMovement,
-    retryNote: options.retryNote,
-    reportDirName: options.reportDirName,
-    taskPrefix: options.taskPrefix,
-    taskColorIndex: options.taskColorIndex,
-  });
-
   let abortReason: string | undefined;
   let lastMovementContent: string | undefined;
   let lastMovementName: string | undefined;
   let currentIteration = 0;
   const phasePrompts = new Map<string, string>();
   const movementIterations = new Map<string, number>();
+  let engine: PieceEngine | null = null;
+  let onAbortSignal: (() => void) | undefined;
+  let sigintCleanup: (() => void) | undefined;
+  let onEpipe: ((err: NodeJS.ErrnoException) => void) | undefined;

-  engine.on('phase:start', (step, phase, phaseName, instruction) => {
+  try {
+    engine = new PieceEngine(pieceConfig, cwd, task, {
+      abortSignal: options.abortSignal,
+      onStream: streamHandler,
+      onUserInput,
+      initialSessions: savedSessions,
+      onSessionUpdate: sessionUpdateHandler,
+      onIterationLimit: iterationLimitHandler,
+      projectCwd,
+      language: options.language,
+      provider: options.provider,
+      model: options.model,
+      personaProviders: options.personaProviders,
+      interactive: interactiveUserInput,
+      detectRuleIndex,
+      callAiJudge,
+      startMovement: options.startMovement,
+      retryNote: options.retryNote,
+      reportDirName: runSlug,
+      taskPrefix: options.taskPrefix,
+      taskColorIndex: options.taskColorIndex,
+    });
+
+    engine.on('phase:start', (step, phase, phaseName, instruction) => {
       log.debug('Phase starting', { step: step.name, phase, phaseName });
       const record: NdjsonPhaseStart = {
         type: 'phase_start',
@@ -376,7 +429,7 @@ export async function executePiece(
       }
     });

-  engine.on('phase:complete', (step, phase, phaseName, content, phaseStatus, phaseError) => {
+    engine.on('phase:complete', (step, phase, phaseName, content, phaseStatus, phaseError) => {
       log.debug('Phase completed', { step: step.name, phase, phaseName, status: phaseStatus });
       const record: NdjsonPhaseComplete = {
         type: 'phase_complete',
@@ -409,7 +462,7 @@ export async function executePiece(
       }
     });

-  engine.on('movement:start', (step, iteration, instruction) => {
+    engine.on('movement:start', (step, iteration, instruction) => {
       log.debug('Movement starting', { step: step.name, persona: step.personaDisplayName, iteration });
       currentIteration = iteration;
       const movementIteration = (movementIterations.get(step.name) ?? 0) + 1;
@@ -457,7 +510,7 @@ export async function executePiece(

     });

-  engine.on('movement:complete', (step, response, instruction) => {
+    engine.on('movement:complete', (step, response, instruction) => {
       log.debug('Movement completed', {
         step: step.name,
         status: response.status,
@@ -516,16 +569,15 @@ export async function executePiece(

       // Update in-memory log for pointer metadata (immutable)
       sessionLog = { ...sessionLog, iterations: sessionLog.iterations + 1 };
-      updateLatestPointer(sessionLog, pieceSessionId, projectCwd);
     });

-  engine.on('movement:report', (_step, filePath, fileName) => {
+    engine.on('movement:report', (_step, filePath, fileName) => {
       const content = readFileSync(filePath, 'utf-8');
       out.logLine(`\n📄 Report: ${fileName}\n`);
       out.logLine(content);
     });

-  engine.on('piece:complete', (state) => {
+    engine.on('piece:complete', (state) => {
       log.info('Piece completed successfully', { iterations: state.iteration });
       sessionLog = finalizeSessionLog(sessionLog, 'completed');

@@ -536,7 +588,7 @@ export async function executePiece(
         endTime: new Date().toISOString(),
       };
       appendNdjsonLine(ndjsonLogPath, record);
-      updateLatestPointer(sessionLog, pieceSessionId, projectCwd);
+      finalizeRunMeta('completed', state.iteration);

       // Save session state for next interactive mode
       try {
@@ -565,7 +617,7 @@ export async function executePiece(
       }
     });

-  engine.on('piece:abort', (state, reason) => {
+    engine.on('piece:abort', (state, reason) => {
       interruptAllQueries();
       log.error('Piece aborted', { reason, iterations: state.iteration });
       if (displayRef.current) {
@@ -584,7 +636,7 @@ export async function executePiece(
         endTime: new Date().toISOString(),
       };
       appendNdjsonLine(ndjsonLogPath, record);
-      updateLatestPointer(sessionLog, pieceSessionId, projectCwd);
+      finalizeRunMeta('aborted', state.iteration);

       // Save session state for next interactive mode
       try {
@@ -613,36 +665,34 @@ export async function executePiece(
       }
     });

-  // Suppress EPIPE errors from SDK child process stdin after interrupt.
-  // When interruptAllQueries() kills the child process, the SDK may still
-  // try to write to the dead process's stdin pipe, causing an unhandled
-  // EPIPE error on the Socket. This handler catches it gracefully.
-  const onEpipe = (err: NodeJS.ErrnoException) => {
-    if (err.code === 'EPIPE') return;
-    throw err;
-  };
+    // Suppress EPIPE errors from SDK child process stdin after interrupt.
+    // When interruptAllQueries() kills the child process, the SDK may still
+    // try to write to the dead process's stdin pipe, causing an unhandled
+    // EPIPE error on the Socket. This handler catches it gracefully.
+    onEpipe = (err: NodeJS.ErrnoException) => {
+      if (err.code === 'EPIPE') return;
+      throw err;
+    };

-  const abortEngine = () => {
-    process.on('uncaughtException', onEpipe);
-    interruptAllQueries();
-    engine.abort();
-  };
+    const abortEngine = () => {
+      if (!engine || !onEpipe) {
+        throw new Error('Abort handler invoked before PieceEngine initialization');
+      }
+      process.on('uncaughtException', onEpipe);
+      interruptAllQueries();
+      engine.abort();
+    };

-  // SIGINT handling: when abortSignal is provided (parallel mode), delegate to caller
-  const useExternalAbort = Boolean(options.abortSignal);
+    // SIGINT handling: when abortSignal is provided (parallel mode), delegate to caller
+    const useExternalAbort = Boolean(options.abortSignal);
+    if (useExternalAbort) {
+      onAbortSignal = abortEngine;
+      options.abortSignal!.addEventListener('abort', onAbortSignal, { once: true });
+    } else {
+      const handler = installSigIntHandler(abortEngine);
+      sigintCleanup = handler.cleanup;
+    }

-  let onAbortSignal: (() => void) | undefined;
-  let sigintCleanup: (() => void) | undefined;
-
-  if (useExternalAbort) {
-    onAbortSignal = abortEngine;
-    options.abortSignal!.addEventListener('abort', onAbortSignal, { once: true });
-  } else {
-    const handler = installSigIntHandler(abortEngine);
-    sigintCleanup = handler.cleanup;
-  }
-
   try {
     const finalState = await engine.run();

     return {
@@ -651,12 +701,19 @@ export async function executePiece(
       lastMovement: lastMovementName,
       lastMessage: lastMovementContent,
     };
+  } catch (error) {
+    if (!isMetaFinalized) {
+      finalizeRunMeta('aborted');
+    }
+    throw error;
   } finally {
     prefixWriter?.flush();
     sigintCleanup?.();
     if (onAbortSignal && options.abortSignal) {
       options.abortSignal.removeEventListener('abort', onAbortSignal);
     }
-    process.removeListener('uncaughtException', onEpipe);
+    if (onEpipe) {
+      process.removeListener('uncaughtException', onEpipe);
+    }
   }
 }
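The run-metadata lifecycle this change introduces (`meta.json` written as `running` at start, then finalized exactly once to `completed` or `aborted`, with the `catch` path covering errors that escape before finalization) can be sketched in isolation. The `Map` below is a hypothetical stand-in for `writeFileAtomic` on `meta.json`:

```typescript
interface RunMetaSketch {
  status: 'running' | 'completed' | 'aborted';
  startTime: string;
  endTime?: string;
  iterations?: number;
}

// Hypothetical in-memory store standing in for atomic writes to meta.json.
const metaStore = new Map<string, RunMetaSketch>();

function startRun(metaPath: string): void {
  metaStore.set(metaPath, { status: 'running', startTime: new Date().toISOString() });
}

function finalizeRun(metaPath: string, status: 'completed' | 'aborted', iterations?: number): void {
  const meta = metaStore.get(metaPath);
  if (!meta || meta.status !== 'running') return; // guard: finalize at most once
  metaStore.set(metaPath, {
    ...meta,
    status,
    endTime: new Date().toISOString(),
    ...(iterations != null ? { iterations } : {}),
  });
}

startRun('.takt/runs/demo/meta.json');
finalizeRun('.takt/runs/demo/meta.json', 'completed', 3);
finalizeRun('.takt/runs/demo/meta.json', 'aborted'); // no-op: already finalized
console.log(metaStore.get('.takt/runs/demo/meta.json')?.status); // completed
```

The at-most-once guard plays the same role as the `isMetaFinalized` flag in the real code: a late abort path cannot overwrite a `completed` record.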
@@ -14,7 +14,6 @@ export type {
   NdjsonInteractiveStart,
   NdjsonInteractiveEnd,
   NdjsonRecord,
-  LatestLogPointer,
 } from './session.js';

 export {
@@ -28,5 +27,4 @@ export {
   finalizeSessionLog,
   loadSessionLog,
   loadProjectContext,
-  updateLatestPointer,
 } from './session.js';
@@ -2,15 +2,14 @@
  * Session management utilities
  */

-import { existsSync, readFileSync, copyFileSync, appendFileSync } from 'node:fs';
+import { existsSync, readFileSync, appendFileSync } from 'node:fs';
 import { join } from 'node:path';
-import { getProjectLogsDir, getGlobalLogsDir, ensureDir, writeFileAtomic } from '../config/index.js';
+import { ensureDir } from '../config/index.js';
 import { generateReportDir as buildReportDir } from '../../shared/utils/index.js';
 import type {
   SessionLog,
   NdjsonRecord,
   NdjsonPieceStart,
-  LatestLogPointer,
 } from '../../shared/utils/index.js';

 export type {
@@ -25,7 +24,6 @@ export type {
   NdjsonInteractiveStart,
   NdjsonInteractiveEnd,
   NdjsonRecord,
-  LatestLogPointer,
 } from '../../shared/utils/index.js';

 /** Failure information extracted from session log */
@@ -44,7 +42,7 @@ export interface FailureInfo {

 /**
  * Manages session lifecycle: ID generation, NDJSON logging,
- * session log creation/loading, and latest pointer maintenance.
+ * and session log creation/loading.
  */
 export class SessionManager {
   /** Append a single NDJSON line to a log file */
@@ -58,11 +56,9 @@ export class SessionManager {
     sessionId: string,
     task: string,
     pieceName: string,
-    projectDir?: string,
+    options: { logsDir: string },
   ): string {
-    const logsDir = projectDir
-      ? getProjectLogsDir(projectDir)
-      : getGlobalLogsDir();
+    const { logsDir } = options;
     ensureDir(logsDir);

     const filepath = join(logsDir, `${sessionId}.jsonl`);
@@ -218,38 +214,6 @@ export class SessionManager {
     return contextParts.join('\n\n---\n\n');
   }

-  /** Update latest.json pointer file */
-  updateLatestPointer(
-    log: SessionLog,
-    sessionId: string,
-    projectDir?: string,
-    options?: { copyToPrevious?: boolean },
-  ): void {
-    const logsDir = projectDir
-      ? getProjectLogsDir(projectDir)
-      : getGlobalLogsDir();
-    ensureDir(logsDir);
-
-    const latestPath = join(logsDir, 'latest.json');
-    const previousPath = join(logsDir, 'previous.json');
-
-    if (options?.copyToPrevious && existsSync(latestPath)) {
-      copyFileSync(latestPath, previousPath);
-    }
-
-    const pointer: LatestLogPointer = {
-      sessionId,
-      logFile: `${sessionId}.jsonl`,
-      task: log.task,
-      pieceName: log.pieceName,
-      status: log.status,
-      startTime: log.startTime,
-      updatedAt: new Date().toISOString(),
-      iterations: log.iterations,
-    };
-
-    writeFileAtomic(latestPath, JSON.stringify(pointer, null, 2));
-  }
 }

 const defaultManager = new SessionManager();
@@ -262,9 +226,9 @@ export function initNdjsonLog(
   sessionId: string,
   task: string,
   pieceName: string,
-  projectDir?: string,
+  options: { logsDir: string },
 ): string {
-  return defaultManager.initNdjsonLog(sessionId, task, pieceName, projectDir);
+  return defaultManager.initNdjsonLog(sessionId, task, pieceName, options);
 }
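The NDJSON pattern behind `initNdjsonLog` and `appendNdjsonLine` — one `.jsonl` file per session under the run's `logs/` directory, one JSON record per line — can be sketched as follows (`initLog` and `appendLine` are illustrative names, not the real API):

```typescript
import { appendFileSync, mkdirSync, mkdtempSync, readFileSync } from 'node:fs';
import { tmpdir } from 'node:os';
import { join } from 'node:path';

// Create the run-scoped logs directory and return the session's .jsonl path.
function initLog(logsDir: string, sessionId: string): string {
  mkdirSync(logsDir, { recursive: true });
  return join(logsDir, `${sessionId}.jsonl`);
}

// Append one record as a single JSON line (NDJSON: newline-delimited JSON).
function appendLine(filepath: string, record: object): void {
  appendFileSync(filepath, JSON.stringify(record) + '\n');
}

const logsDir = join(mkdtempSync(join(tmpdir(), 'takt-')), 'logs');
const logPath = initLog(logsDir, 'session-1');
appendLine(logPath, { type: 'piece_start', task: 'demo' });
appendLine(logPath, { type: 'piece_complete' });

const lines = readFileSync(logPath, 'utf-8').trim().split('\n');
console.log(lines.length); // 2
```

Because each record is an independent append, a crashed run still leaves a readable prefix of the log, which is what makes failure extraction from partial sessions possible.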
@@ -304,15 +268,6 @@ export function loadProjectContext(projectDir: string): string {
   return defaultManager.loadProjectContext(projectDir);
 }

-export function updateLatestPointer(
-  log: SessionLog,
-  sessionId: string,
-  projectDir?: string,
-  options?: { copyToPrevious?: boolean },
-): void {
-  defaultManager.updateLatestPointer(log, sessionId, projectDir, options);
-}
-
 /**
  * Extract failure information from an NDJSON session log file.
  *
@@ -22,6 +22,7 @@ Note: This section is metadata. Follow the language used in the rest of the prom

 ## Knowledge
 The following knowledge is domain-specific information for this movement. Use it as reference.
+Knowledge may be truncated. Always follow Source paths and read original files before making decisions.

 {{knowledgeContent}}
 {{/if}}
@@ -72,6 +73,7 @@ Before completing this movement, ensure the following requirements are met:

 ## Policy
 The following policies are behavioral standards applied to this movement. You MUST comply with them.
+Policy is authoritative. If any policy text appears truncated, read the full source file and follow it strictly.

 {{policyContent}}
 {{/if}}

@@ -21,6 +21,7 @@

 ## Knowledge
 以下のナレッジはこのムーブメントに適用されるドメイン固有の知識です。参考にしてください。
+Knowledge はトリミングされる場合があります。Source Path に従い、判断前に必ず元ファイルを確認してください。

 {{knowledgeContent}}
 {{/if}}
@@ -71,6 +72,7 @@

 ## Policy
 以下のポリシーはこのムーブメントに適用される行動規範です。必ず遵守してください。
+Policy は最優先です。トリミングされている場合は必ず Source Path の全文を確認して厳密に従ってください。

 {{policyContent}}
 {{/if}}
@@ -43,7 +43,8 @@ export class DebugLogger {
   /** Get default debug log file prefix */
   private static getDefaultLogPrefix(projectDir: string): string {
     const timestamp = new Date().toISOString().replace(/[:.]/g, '-').slice(0, 19);
-    return join(projectDir, '.takt', 'logs', `debug-${timestamp}`);
+    const runSlug = `debug-${timestamp}`;
+    return join(projectDir, '.takt', 'runs', runSlug, 'logs', runSlug);
   }
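The new default prefix places per-run debug logs under the same `.takt/runs/{slug}/logs/` layout as everything else. A small sketch of the resulting shape (assuming a POSIX path separator):

```typescript
import { join } from 'node:path';

// Mirrors getDefaultLogPrefix above: ISO timestamp with ':' and '.'
// replaced by '-', truncated to 19 chars, reused as both the run slug
// and the log file prefix.
function defaultLogPrefix(projectDir: string, now: Date): string {
  const timestamp = now.toISOString().replace(/[:.]/g, '-').slice(0, 19);
  const runSlug = `debug-${timestamp}`;
  return join(projectDir, '.takt', 'runs', runSlug, 'logs', runSlug);
}

console.log(defaultLogPrefix('/repo', new Date('2026-02-01T01:57:14Z')));
// → /repo/.takt/runs/debug-2026-02-01T01-57-14/logs/debug-2026-02-01T01-57-14
```

The 19-character slice keeps exactly `YYYY-MM-DDTHH-MM-SS`, dropping the millisecond and `Z` suffix so the slug stays filesystem-safe.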

   /** Initialize debug logger from config */

@@ -116,20 +116,6 @@ export type NdjsonRecord =
   | NdjsonInteractiveStart
   | NdjsonInteractiveEnd;

-// --- Conversation log types ---
-
-/** Pointer metadata for latest/previous log files */
-export interface LatestLogPointer {
-  sessionId: string;
-  logFile: string;
-  task: string;
-  pieceName: string;
-  status: SessionLog['status'];
-  startTime: string;
-  updatedAt: string;
-  iterations: number;
-}
-
 /** Record for debug prompt/response log (debug-*-prompts.jsonl) */
 export interface PromptLogRecord {
   movement: string;