You want to let your agent or application execute arbitrary Python — data processing, math, string manipulation, quick scripts. You don't want to manage containers, configure Lambda, or explain to your infra team why you need a new sandbox environment. Here's the one-endpoint answer.
Send code as a string. Get stdout, stderr, exit code, and execution time back. The sandbox enforces a timeout and captures all output.
curl -X POST https://slopshop.gg/v1/exec-python \
  -H "Authorization: Bearer demo_key_slopshop" \
  -H "Content-Type: application/json" \
  -d '{
    "code": "import math\nresult = math.sqrt(144) * 3.14159\nprint(f\"Result: {result:.4f}\")",
    "timeout": 10
  }'
{
"ok": true,
"stdout": "Result: 37.6991\n",
"stderr": "",
"exit_code": 0,
"execution_time_ms": 38,
"timed_out": false,
"credits_used": 5,
"_engine": "real"
}
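From an application, the request body and response are simple enough to wrap in a few lines. A minimal sketch in Python — the helper names are ours, and the 1-second lower bound is an assumption; the endpoint fields, the 10-second default, and the 30-second cap come from this doc:

```python
import json

def build_payload(code: str, timeout: int = 10) -> str:
    """Serialize a request body for POST /v1/exec-python.

    `code` and `timeout` are the documented fields; the 10s default and
    30s cap match the docs. The 1s lower bound is an assumption.
    """
    if not 1 <= timeout <= 30:
        raise ValueError("timeout must be between 1 and 30 seconds")
    return json.dumps({"code": code, "timeout": timeout})

def classify(result: dict) -> str:
    """Map a response body onto one of three outcomes, using the
    timed_out and exit_code fields shown above."""
    if result.get("timed_out"):
        return "timeout"
    if result.get("exit_code") != 0:
        return "error"
    return "success"
```

One nice side effect: `json.dumps` handles the newline escaping in the code string, so you can write ordinary multi-line Python and never hand-escape `\n` yourself.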
Pass a timeout in seconds (default: 10, max: 30). If the code exceeds the limit, execution is killed and you get timed_out: true with whatever stdout was captured up to that point.
curl -X POST https://slopshop.gg/v1/exec-python \
  -H "Authorization: Bearer demo_key_slopshop" \
  -d '{
    "code": "import time\nfor i in range(100):\n    print(i)\n    time.sleep(1)",
    "timeout": 3
  }'
{
"ok": true,
"stdout": "0\n1\n2\n",
"stderr": "",
"exit_code": -1,
"timed_out": true,
"execution_time_ms": 3001,
"credits_used": 5
}
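When a run comes back with `timed_out: true`, the stdout captured before the kill is still usable, and one reasonable client policy is to retry with a larger timeout up to the 30-second cap. A sketch — the doubling policy is a suggestion, not part of the API:

```python
def next_timeout(current: int, cap: int = 30) -> int:
    """Double the timeout on retry, clamped to the API's 30s maximum."""
    return min(current * 2, cap)

def partial_output(result: dict) -> list[str]:
    """Lines captured before the kill; the API returns them even on
    a timeout, as the example above shows."""
    return result["stdout"].splitlines()
```

For the example above, a retry would go out with `timeout: 6`, and the three lines already captured are available either way.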
Runtime errors come back in stderr with a non-zero exit_code. Your calling code doesn't need to guess whether execution failed.
curl -X POST https://slopshop.gg/v1/exec-python \
  -H "Authorization: Bearer demo_key_slopshop" \
  -d '{ "code": "x = 1 / 0\nprint(x)" }'
{
"ok": true,
"stdout": "",
"stderr": "Traceback (most recent call last):\n File \"\", line 1, in \nZeroDivisionError: division by zero\n" ,
"exit_code": 1,
"timed_out": false,
"credits_used": 5
}
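Because the full traceback arrives in `stderr`, the calling side can surface just the exception line instead of the whole dump. A small helper (the name is ours; it relies on CPython's convention that the final traceback line is `ExceptionType: message`):

```python
def error_summary(stderr: str) -> str:
    """Return the last non-empty line of a traceback, which in CPython
    is the 'ExceptionType: message' line."""
    lines = [line for line in stderr.splitlines() if line.strip()]
    return lines[-1] if lines else ""
```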
Execution is killed at the timeout you specify. No runaway processes.
Code runs in a read-only filesystem. It can't write files to disk or access host paths.
The sandbox has no outbound network access. That's the safe default for untrusted code, and data transforms rarely need it.
Hard memory limit prevents OOM from crashing the host. Code gets killed, not the server.
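If you want to verify these guarantees yourself, submit a probe that attempts a file write and a network connection and reports what happens. A sketch of building that probe payload — the read-only and no-network guarantees are from this doc, but the exact exception type the sandbox raises (`OSError` below) is our assumption:

```python
import json

# Probe code to run *inside* the sandbox: attempt a file write and a
# socket connection, and print the outcome of each.
PROBE = """\
import socket
try:
    open("/tmp/probe", "w")
    print("write: allowed")
except OSError as exc:
    print("write: blocked:", type(exc).__name__)
try:
    socket.create_connection(("example.com", 80), timeout=2)
    print("net: allowed")
except OSError as exc:
    print("net: blocked:", type(exc).__name__)
"""

# POST this as the request body; the probe's prints come back in stdout.
payload = json.dumps({"code": PROBE, "timeout": 10})
```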
| Option | Setup time | Per-execution cost | Timeout control | stdout/stderr | Cold starts |
|---|---|---|---|---|---|
| Slopshop | 0 — one API key | 5 credits (~$0.0045) | Yes, per-request | Yes, both captured | None |
| AWS Lambda | IAM roles, VPC, packaging, deploy pipeline | ~$0.0000002/req + duration | Function-level only | CloudWatch only | Yes, 100-500ms+ |
| E2B | SDK install, sandbox config | $0.001/min compute | Yes | Yes | ~1-2s warm-up |
| Replit | Account, repl setup, persistent cost | Subscription-based | No per-request control | Partial | Yes, if sleeping |
Cost comparisons are approximate as of March 2026.
Python execution costs 5 credits per run. You get 500 free credits on signup — that's 100 executions before you pay anything. Credits start at $9 for 10,000. See full pricing.
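The arithmetic behind those numbers, for sanity-checking against your own volume:

```python
CREDITS_PER_RUN = 5
FREE_CREDITS = 500
PACK_PRICE_USD = 9.00   # smallest credit pack
PACK_CREDITS = 10_000

# How many runs the signup credits cover, and what each run costs
# once you're buying packs.
free_runs = FREE_CREDITS // CREDITS_PER_RUN                     # 100 executions
usd_per_run = PACK_PRICE_USD / PACK_CREDITS * CREDITS_PER_RUN   # $0.0045
```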
5 credits per run. 500 free on signup. No containers to configure.