Together AI hosts the best open models at scale. Add Slopshop tools across 82 categories via function_calling — same openai SDK, just point base_url at api.together.xyz.
Free memory. Every result verified.
These Together AI models support function_calling with Slopshop tools. Use the full Together model ID as the model parameter.
Together AI's API is OpenAI-compatible. Set base_url="https://api.together.xyz/v1" and your Together API key — the rest is identical to the OpenAI integration.
Use the openai SDK directly. Together AI's OpenAI-compatible endpoint means zero new dependencies.
Create an OpenAI client with Together's base_url. Use the full Together model ID — e.g. meta-llama/Llama-3.3-70B-Instruct-Turbo.
Pull Slopshop tool definitions from /v1/openapi.json and pass them in the tools array. The model handles the rest.
Full agentic loop. The model ID is the only Together-specific part — everything else is standard OpenAI SDK.
```python
import os, json, requests
from openai import OpenAI

# Together AI is OpenAI-compatible
client = OpenAI(
    api_key=os.environ["TOGETHER_API_KEY"],
    base_url="https://api.together.xyz/v1"
)

SLOP_KEY = os.environ["SLOPSHOP_API_KEY"]
SLOP_URL = "https://slopshop.gg"

# Models with strong function_calling support on Together:
MODEL = "meta-llama/Llama-3.3-70B-Instruct-Turbo"
# Alternatives:
# "Qwen/Qwen2.5-72B-Instruct-Turbo"
# "mistralai/Mixtral-8x22B-Instruct-v0.1"
# "meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo"

def get_tools(slugs: list[str]) -> list[dict]:
    schema = requests.get(
        f"{SLOP_URL}/v1/openapi.json",
        headers={"Authorization": f"Bearer {SLOP_KEY}"}
    ).json()
    tools = []
    for slug in slugs:
        path = schema["paths"].get(f"/v1/{slug}", {}).get("post", {})
        if path:
            tools.append({
                "type": "function",
                "function": {
                    "name": slug,
                    "description": path.get("summary", ""),
                    "parameters": path["requestBody"]["content"]["application/json"]["schema"]
                }
            })
    return tools

def call_tool(name: str, args: dict) -> str:
    res = requests.post(
        f"{SLOP_URL}/v1/{name}",
        json=args,
        headers={"Authorization": f"Bearer {SLOP_KEY}"}
    )
    return json.dumps(res.json())

def run_agent(user_message: str) -> str:
    tools = get_tools([
        "dns-lookup", "ssl-check", "http-request",
        "whois", "hash", "memory-set", "memory-get"
    ])
    messages = [{"role": "user", "content": user_message}]
    while True:
        response = client.chat.completions.create(
            model=MODEL,
            messages=messages,
            tools=tools,
            tool_choice="auto"
        )
        msg = response.choices[0].message
        if response.choices[0].finish_reason == "stop":
            return msg.content
        messages.append(msg)
        for tc in (msg.tool_calls or []):
            result = call_tool(tc.function.name, json.loads(tc.function.arguments))
            messages.append({
                "role": "tool",
                "tool_call_id": tc.id,
                "content": result
            })

answer = run_agent("Audit slopshop.gg: DNS records, SSL cert, HTTP headers, save all to memory")
print(answer)
```
Together AI gives you the best open models at scale. Slopshop gives them real tools to act with, spanning 82 categories: no simulated outputs, no hallucinated function results.
DNS, SSL, HTTP, crypto, hashing, code execution, data transforms. All real. Verified with _engine: real on every response.
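The `_engine: real` marker can be checked programmatically before an agent trusts a tool result. A minimal sketch: the `_engine` field comes from the description above, but the `assert_real` helper itself is hypothetical, not part of any SDK.

```python
def assert_real(result: dict) -> dict:
    """Raise if a tool result is not marked as genuinely executed.

    Assumes each Slopshop response carries the `_engine: real` field
    described above; this helper is an illustration, not a shipped API.
    """
    if result.get("_engine") != "real":
        raise ValueError(f"unverified tool result: _engine={result.get('_engine')!r}")
    return result

# Example with a mock response shaped like a verified tool result:
mock = {"_engine": "real", "records": [{"type": "A", "value": "203.0.113.7"}]}
checked = assert_real(mock)
print(checked["records"][0]["type"])  # → A
```

Dropping a check like this into the tool loop turns "verified" from a marketing claim into a hard invariant of the agent.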
Together agents can write to Slopshop's key-value store at zero credit cost. State persists across API calls automatically.
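The `memory-set` and `memory-get` tool names come from the integration example above; the request body shape below (a `key`/`value` JSON object) is an assumption — confirm the exact fields against the schema in `/v1/openapi.json` before relying on it.

```python
import json
import requests

SLOP_URL = "https://slopshop.gg"
SLOP_KEY = "your-slopshop-api-key"  # normally os.environ["SLOPSHOP_API_KEY"]

def memory_payload(key: str, value) -> dict:
    # Assumed body shape for memory-set: {"key": ..., "value": ...}.
    # Non-string values are JSON-encoded so they round-trip cleanly.
    return {
        "key": key,
        "value": value if isinstance(value, str) else json.dumps(value),
    }

def memory_set(key: str, value) -> dict:
    # Direct call to the (assumed) memory-set endpoint, outside the model loop.
    res = requests.post(
        f"{SLOP_URL}/v1/memory-set",
        json=memory_payload(key, value),
        headers={"Authorization": f"Bearer {SLOP_KEY}"},
    )
    return res.json()
```

Because memory writes cost zero credits, an agent can checkpoint intermediate results (audit findings, scan output) as often as it likes and recover them in a later run with `memory-get`.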
Llama 3.3, Llama 3.1, Mixtral, Qwen 2.5 — any Together model with function calling support works with Slopshop tools.
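The family names above correspond to full Together model IDs (the IDs below are the ones listed in the integration example; model availability changes over time, so treat the mapping as illustrative):

```python
# Together model IDs with function calling support, as used in the
# integration example; availability and naming can change.
FUNCTION_CALLING_MODELS = {
    "Llama 3.3 70B": "meta-llama/Llama-3.3-70B-Instruct-Turbo",
    "Llama 3.1 405B": "meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo",
    "Mixtral 8x22B": "mistralai/Mixtral-8x22B-Instruct-v0.1",
    "Qwen 2.5 72B": "Qwen/Qwen2.5-72B-Instruct-Turbo",
}

def model_id(name: str) -> str:
    """Look up the full Together model ID for a friendly model name."""
    return FUNCTION_CALLING_MODELS[name]

print(model_id("Llama 3.3 70B"))  # → meta-llama/Llama-3.3-70B-Instruct-Turbo
```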
Audit logs, per-request tracing, credit usage per tool call. Every invocation logged with inputs, outputs, and latency.
Run Slopshop on your own infrastructure. Only Together API calls go external — all compute tools run locally.