Metadata-Version: 2.4
Name: sandflare
Version: 0.1.7
Summary: Sandflare Python SDK for sandboxes and managed Postgres
License: MIT
Project-URL: Homepage, https://sandflare.io
Project-URL: Documentation, https://docs.sandflare.io
Keywords: sandflare,sandbox,microvm,postgres,firecracker
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Requires-Python: >=3.9
Description-Content-Type: text/markdown

# Sandflare Python SDK

The official Python SDK for creating and managing Sandflare sandboxes and managed Postgres databases.

## Install

```bash
pip install sandflare
```

For local development:

```bash
pip install -e sdk/python/
```

## Quick start

```python
from sandflare import Sandbox

sandbox = Sandbox.create("agent")
sandbox.run_code("value = 10")
result = sandbox.run_code("value += 5\nprint(value)")
print(result.stdout)
```

## Environment variables

- `SANDFLARE_API_KEY`: preferred API key
- `SANDFLARE_API_URL`: optional API base URL override
- `PANDAAGENT_API_KEY`: legacy API key alias
- `PANDAAGENT_BASE_URL`: legacy base URL alias
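
A minimal sketch of how a client could resolve these variables, assuming the `SANDFLARE_*` names take precedence over the legacy `PANDAAGENT_*` aliases; the helper name and default endpoint below are illustrative, not part of the SDK:

```python
import os

def resolve_credentials():
    """Resolve (api_key, base_url) from the environment.

    Assumed precedence: SANDFLARE_* variables win over the
    legacy PANDAAGENT_* aliases.
    """
    api_key = (
        os.environ.get("SANDFLARE_API_KEY")
        or os.environ.get("PANDAAGENT_API_KEY")
    )
    base_url = (
        os.environ.get("SANDFLARE_API_URL")
        or os.environ.get("PANDAAGENT_BASE_URL")
        or "https://api.sandflare.io"  # hypothetical default
    )
    return api_key, base_url

if __name__ == "__main__":
    key, url = resolve_credentials()
    print(f"key={key!r} url={url!r}")
```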

## Sandbox methods

### exec / run_python / run_node

```python
result = sb.exec("ls /home/agent")
result = sb.run_python("print('hello')")
result = sb.run_node("console.log('hello')")
```

### stream / exec_stream

`stream()` buffers the full SSE output and then yields `StreamEvent` objects (kept for backward compatibility):

```python
for event in sb.stream("python3 train.py"):
    if event.event in ("stdout", "stderr"):
        print(event.line, end="", flush=True)
    elif event.event == "done":
        print(f"\nexited {event.exit_code}")
```

`exec_stream()` is a true SSE generator that yields events as they arrive:

```python
for event in sb.exec_stream("python3 train.py", timeout=120):
    if event.type in ("stdout", "stderr"):
        print(event.data, flush=True)
    elif event.type == "done":
        print(f"\nexited {event.exit_code}")
```

### metrics()

```python
m = sb.metrics()
print(f"CPU: {m.cpu_used_pct:.1f}%  RAM: {m.mem_used}/{m.mem_total} bytes")
```

### kill_process(pid)

```python
sb.kill_process(1234)
```

### git_clone(repo, ...)

```python
result = sb.git_clone(
    "https://github.com/org/repo",
    path="/home/agent/repo",
    branch="main",
    depth=1,
)
print(result.output)
```

### File I/O

```python
sb.write_file("/home/agent/data.csv", csv_text)
content = sb.read_file("/home/agent/output.txt")
sb.upload("local.png", "/home/agent/img.png")
raw_bytes = sb.download("/home/agent/img.png")
entries = sb.ls("/home/agent")
```

## Template class

Build custom sandbox templates from Dockerfiles:

```python
from sandflare import Template

# Submit a build (returns immediately)
job = Template.build(
    name="my-template",
    dockerfile="FROM ubuntu:22.04\nRUN apt-get update && apt-get install -y python3",
    description="Custom Python env",
)
print(job.id, job.status)  # e.g. "tmpl-abc123", "building"

# Poll until ready (up to 30 minutes)
job = Template.wait_for_build(job.id, timeout_seconds=600)
print(job.status)  # "ready"

# Use template when creating a sandbox
sb = Sandbox.create("agent", template_id=job.id)

# List all templates
jobs = Template.list()

# Check build status manually
job = Template.get_build_status("tmpl-abc123")

# Delete a template
Template.delete("tmpl-abc123")
```

## Dataclasses

| Class | Fields |
|---|---|
| `ProcessInfo` | `pid`, `command`, `cpu_percent`, `memory_percent` |
| `SandboxMetrics` | `sandbox_id`, `status`, `cpu_count`, `cpu_used_pct`, `mem_total`, `mem_used`, `disk_total`, `disk_used`, `ts` |
| `GitCloneResult` | `path`, `repo`, `branch`, `output` |
| `TemplateBuildJob` | `id`, `name`, `status`, `description`, `error`, `team_id`, `created_at`, `updated_at`, `logs` |
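
For reference, a sketch of what two of these dataclasses might look like, with field types inferred from their names (the actual definitions live in the SDK and may differ):

```python
from dataclasses import dataclass

@dataclass
class ProcessInfo:
    # Fields as listed in the table above; types are assumptions.
    pid: int
    command: str
    cpu_percent: float
    memory_percent: float

@dataclass
class GitCloneResult:
    path: str
    repo: str
    branch: str
    output: str

# Construct an instance the way the SDK might return it
proc = ProcessInfo(
    pid=1234,
    command="python3 train.py",
    cpu_percent=12.5,
    memory_percent=3.2,
)
print(proc.pid, proc.command)
```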

## Build locally

```bash
cd sdk/python
python3 -m pip wheel . --no-deps -w dist/
```

## Publish to PyPI

```bash
cd sdk/python
rm -rf dist/
python3 -m pip wheel . --no-deps -w dist/
python3 -m twine upload dist/*
```

Use a PyPI API token via `TWINE_USERNAME=__token__` and `TWINE_PASSWORD=<pypi-token>`.
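
For example, with the token stored in a local environment variable (`PYPI_TOKEN` is an illustrative name, not something the SDK defines):

```bash
cd sdk/python
TWINE_USERNAME=__token__ TWINE_PASSWORD="$PYPI_TOKEN" python3 -m twine upload dist/*
```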
