rust_tui_coder 1.0.0

AI-powered terminal coding assistant with interactive TUI, supporting multiple LLMs and comprehensive development tools
# Getting Started Guide

Welcome to Rust TUI Coder! This guide will help you get up and running quickly.

## Quick Start

### Prerequisites

- **Rust** 1.70 or higher
- An API key for one of:
  - OpenAI (GPT-3.5, GPT-4)
  - Anthropic (Claude)
  - Local LLM with OpenAI-compatible API

### Installation

#### Option 1: Install from crates.io (Recommended)

```bash
cargo install rust_tui_coder
```

#### Option 2: Build from Source

```bash
# Clone the repository
git clone https://github.com/yourusername/rust_tui_coder.git
cd rust_tui_coder

# Build the project
cargo build --release

# The binary will be at target/release/rct
```

## Configuration

### Step 1: Create Configuration File

Create a `config.toml` file in your project directory:

```toml
[llm]
api_key = "your-api-key-here"
api_base_url = "https://api.openai.com/v1"
model_name = "gpt-4"
```

You can also use the example configuration:

```bash
cp config_example.toml config.toml
# Edit config.toml with your API key
```

### Step 2: Configure Your Provider

#### For OpenAI

```toml
[llm]
provider = "openai"  # Optional, auto-detected
api_key = "sk-..."
api_base_url = "https://api.openai.com/v1"
model_name = "gpt-4"  # or "gpt-3.5-turbo"
```

#### For Anthropic Claude

```toml
[llm]
provider = "anthropic"
api_key = "sk-ant-..."
api_base_url = "https://api.anthropic.com"
model_name = "claude-3-opus-20240229"
```

#### For Local Models (Ollama, LM Studio, etc.)

```toml
[llm]
provider = "local"
api_key = "not-needed"
api_base_url = "http://localhost:11434/v1"  # Ollama default
model_name = "codellama"
```

### Environment Variables (Alternative)

Instead of `config.toml`, you can use environment variables:

```bash
export LLM_API_KEY="your-api-key"
export LLM_API_BASE_URL="https://api.openai.com/v1"
export LLM_MODEL_NAME="gpt-4"
```

## First Run

### Launch the Application

```bash
rct
```

Or if built from source:

```bash
./target/release/rct
```

### Initial Screen

You'll see a terminal interface with:
- **Conversation area** (top) - Shows your chat with the AI
- **Tool logs area** (middle) - Shows tool execution details
- **Status bar** - Shows available commands
- **Input area** (bottom) - Where you type your messages

### Your First Interaction

1. Type a message in the input area:
   ```
   Create a hello world program in Python
   ```

2. Press **Enter** to send

3. Watch as the AI:
   - Generates code
   - Uses tools (like `write_file`)
   - Executes the code
   - Shows you the results

## Basic Usage

### Sending Messages

1. Type your message at the bottom
2. Press **Enter** to send
3. Watch the response in the conversation area

### Keyboard Shortcuts

| Key | Action |
|-----|--------|
| `Enter` | Send message |
| `Up` / `Down` | Scroll conversation |
| `PgUp` / `PgDn` | Page up/down |
| `Home` | Scroll to top |
| `End` | Scroll to bottom |
| `Ctrl+C` | Quit application |

### Special Commands

| Command | Description |
|---------|-------------|
| `/quit` | Exit the application |
| `/stats` | Show session statistics |

## Common Tasks

### Example 1: Create a File

```
Create a file named hello.py with a hello world program
```

The AI will:
1. Write the code
2. Save it to `hello.py`
3. Confirm the file was created
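For a prompt like this, the generated `hello.py` is typically a trivial script along these lines (illustrative only; the actual output varies by model):

```python
# hello.py — what the assistant's write_file call might produce
# for this prompt (illustrative; actual generated code varies)
def main():
    print("Hello, world!")

if __name__ == "__main__":
    main()
```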

### Example 2: Read and Modify a File

```
Read example.txt and add a timestamp at the beginning
```

The AI will:
1. Read the file
2. Add a timestamp
3. Update the file
4. Show you the changes
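Steps 1–3 amount to a read-modify-write. In Python terms, the edit the assistant performs looks roughly like this sketch (the file name `example.txt` and the exact timestamp format are assumptions; the assistant decides the details):

```python
from datetime import datetime
from pathlib import Path

def prepend_timestamp(path: str) -> str:
    """Read a file, prepend an ISO-8601 timestamp line, and write it back."""
    p = Path(path)
    original = p.read_text()
    stamped = f"{datetime.now().isoformat()}\n{original}"
    p.write_text(stamped)
    return stamped
```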

### Example 3: Execute Code

```
Write a Python script to calculate fibonacci numbers and run it
```

The AI will:
1. Write the script
2. Save it to a file
3. Execute it
4. Show you the output
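A script the assistant might produce for this request could look like the following (one plausible version, not the exact output):

```python
# fib.py — iterative Fibonacci, as the assistant might generate it
def fibonacci(n: int) -> list[int]:
    """Return the first n Fibonacci numbers."""
    seq = []
    a, b = 0, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

if __name__ == "__main__":
    print(fibonacci(10))
```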

### Example 4: Git Operations

```
Show me the current git status
```

The AI will use the `git_status` tool to show repository status.

### Example 5: Create a Development Plan

```
Create a plan to build a REST API with user authentication
```

The AI will:
1. Create a structured plan
2. Save it to `plan.md`

You can then ask it to implement the steps one by one.

## Understanding Tool Execution

When the AI needs to perform actions, it uses **tools**:

### File Tools
- `read_file` - Read file contents
- `write_file` - Create/overwrite file
- `append_to_file` - Add to end of file
- `search_and_replace` - Find and replace text
- `delete_file` - Remove file

### Directory Tools
- `create_directory` - Create folders
- `list_directory` - List folder contents
- `list_directory_recursive` - Show folder tree

### Execution Tools
- `execute_python` - Run Python code
- `execute_bash` - Run shell commands
- `execute_node` - Run JavaScript
- `execute_ruby` - Run Ruby code

### Planning Tools
- `create_plan` - Make implementation plan
- `update_plan_step` - Mark steps complete
- `clear_plan` - Remove current plan

### Version Control
- `git_status` - Check git status

Tool execution is shown in the **Tool Logs** area.
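Conceptually, each tool is a named function the model invokes with JSON arguments, and the host looks up and runs the matching function. The dispatch loop below is a simplified Python sketch of that idea only; the actual implementation is in Rust and differs in its details:

```python
import json
from pathlib import Path

# Simplified tool registry: tool name -> function taking a dict of arguments.
# (Illustrative sketch only — not the application's real tool set or API.)
TOOLS = {
    "read_file": lambda args: Path(args["path"]).read_text(),
    "write_file": lambda args: Path(args["path"]).write_text(args["content"]),
}

def dispatch(tool_call: str):
    """Run one tool call encoded as JSON: {"name": ..., "args": {...}}."""
    call = json.loads(tool_call)
    return TOOLS[call["name"]](call["args"])
```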

## Tips for Effective Use

### 1. Be Specific

- Too vague: "Make a website"
- Better: "Create an HTML file with a form that collects name and email"

### 2. Break Down Complex Tasks

Instead of asking for everything at once, work step-by-step:
1. "Create the project structure"
2. "Implement the database models"
3. "Add the API endpoints"

### 3. Use the Plan Feature

For complex projects:
```
Create a plan to build a todo list application with React and Express
```

Then:
```
Implement step 1 of the plan
```

### 4. Review Tool Logs

The tool logs area shows exactly what the AI is doing. Check it to:
- Verify file operations
- See command outputs
- Understand execution results

### 5. Iterative Development

You can refine the AI's work:
```
The function is good but add error handling
```

### 6. Ask for Explanations

```
Explain what this code does
```

or

```
Why did you use this approach?
```

## Session Management

### View Statistics

Type `/stats` to see:
- Session duration
- Tokens used
- Number of requests
- Tools executed
- Average tokens per request

### Scroll Through History

- Use **Up/Down** to scroll through conversation
- Use **PgUp/PgDn** for faster scrolling
- Use **Home/End** to jump to top/bottom

### Clear the Plan

If you want to start a new plan:
```
Clear the current plan
```

## Troubleshooting

### "Config file not found"

**Solution:** Create a `config.toml` file in the directory where you run the app.

```bash
cp config_example.toml config.toml
# Edit with your API key
```

### "API key invalid"

**Solution:** Check your API key in `config.toml`:
- OpenAI keys start with `sk-`
- Anthropic keys start with `sk-ant-`
- Ensure no extra spaces or quotes

### "Connection refused"

**Solution:** Check your `api_base_url`:
- OpenAI: `https://api.openai.com/v1`
- Anthropic: `https://api.anthropic.com`
- Local: Ensure your local server is running

### "Tool execution failed"

**Solution:** Check the tool logs for details. Common issues:
- File permissions
- Missing dependencies (Python, Node, etc.)
- Invalid file paths

### Terminal Display Issues

**Solution:** 
- Ensure your terminal supports UTF-8
- Try a different terminal emulator
- Check terminal size (minimum 80x24 recommended)

### Garbled Text

**Solution:**
- Your terminal may not support all features
- Try running: `export TERM=xterm-256color`

## Advanced Configuration

### Custom Model Parameters

Model parameters such as temperature are not currently exposed in `config.toml`, but you can state preferences directly in your prompts:
```
Please be concise in your responses
```

### Working Directory

The app operates in the directory where you launch it. To work on a specific project:

```bash
cd /path/to/your/project
rct
```

### Multiple Projects

Create a `config.toml` in each project directory, or use environment variables:

```bash
cd project1
LLM_MODEL_NAME="gpt-3.5-turbo" rct
```

## Next Steps

Now that you're set up, explore these resources:

- **[README.md](README.md)** - Full feature documentation
- **[ARCHITECTURE.md](ARCHITECTURE.md)** - System design details
- **[API.md](API.md)** - API reference
- **[EXAMPLES.md](EXAMPLES.md)** - More usage examples

## Getting Help

### Check Documentation
- Read the README for detailed features
- Check ARCHITECTURE.md for system internals
- Review API.md for technical details

### Common Questions

**Q: How much does it cost?**
A: Cost depends on your LLM provider and usage. Check with OpenAI/Anthropic for pricing.

**Q: Can I use it offline?**
A: Yes, with a local model (Ollama, LM Studio).

**Q: Is my code safe?**
A: Code is processed by your chosen LLM provider. Read their privacy policies.

**Q: Can I customize the tools?**
A: Currently, tools are built-in. Custom tools require modifying the source code.

**Q: What languages can I execute?**
A: Python, Bash, Node.js, and Ruby are supported out of the box.

## Support

- **Issues**: Report bugs on GitHub
- **Features**: Suggest features via GitHub issues
- **Documentation**: All docs are in the `docs/` folder

Happy coding!