function_calling  ·  Llama  ·  Mixtral  ·  Qwen  ·  api.together.xyz

Slopshop for
Together AI

Together AI hosts the best open models at scale. Add Slopshop tools across 82 categories via function_calling — same openai SDK, just point base_url at api.together.xyz.
Free memory. Every result verified.

422
Real Tools
50+
Hosted Models
OAI
Compatible
100%
Verified Results

Supported Models

Open models with function calling support

These Together AI models support function_calling with Slopshop tools. Use the full Together model ID as the model parameter.

Llama 3.3 70B
meta-llama/Llama-3.3-70B-Instruct-Turbo
Best open model for tool calling
Llama 3.1 405B
meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo
Largest open model
Mixtral 8x22B
mistralai/Mixtral-8x22B-Instruct-v0.1
Strong MoE reasoning
Qwen 2.5 72B
Qwen/Qwen2.5-72B-Instruct-Turbo
Excellent code + tool use

Quickstart

Three steps. Together has tools.

Together AI's API is OpenAI-compatible. Set base_url="https://api.together.xyz/v1" and your Together API key — the rest is identical to the OpenAI integration.

Step 01 — Install

pip install openai requests

Use the openai SDK directly; Together AI's OpenAI-compatible endpoint means no Together-specific SDK. requests is only needed to fetch Slopshop tool schemas.

Step 02 — Point at Together

Set base_url and model ID

Create an OpenAI client with Together's base_url. Use the full Together model ID — e.g. meta-llama/Llama-3.3-70B-Instruct-Turbo.

base_url="https://api.together.xyz/v1"
Step 03 — Add Tools

Fetch and pass Slopshop tools

Pull Slopshop tool definitions from /v1/openapi.json and pass them in the tools array. The model handles the rest.

82 categories of tools · any Together model · verified

Code Example

function_calling with Llama 3.3 on Together AI

Full agentic loop. The model ID is the only Together-specific part — everything else is standard OpenAI SDK.

together_agent.py python
import os, json, requests
from openai import OpenAI

# Together AI is OpenAI-compatible
client = OpenAI(
    api_key=os.environ["TOGETHER_API_KEY"],
    base_url="https://api.together.xyz/v1"
)

SLOP_KEY = os.environ["SLOPSHOP_API_KEY"]
SLOP_URL = "https://slopshop.gg"

# Models with strong function_calling support on Together:
MODEL = "meta-llama/Llama-3.3-70B-Instruct-Turbo"
# Alternatives:
# "Qwen/Qwen2.5-72B-Instruct-Turbo"
# "mistralai/Mixtral-8x22B-Instruct-v0.1"
# "meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo"

def get_tools(slugs: list[str]) -> list[dict]:
    schema = requests.get(
        f"{SLOP_URL}/v1/openapi.json",
        headers={"Authorization": f"Bearer {SLOP_KEY}"}
    ).json()
    tools = []
    for slug in slugs:
        path = schema["paths"].get(f"/v1/{slug}", {}).get("post", {})
        # Guard the nested lookups so a tool without a JSON request body
        # is skipped instead of raising a KeyError.
        body = (path.get("requestBody", {})
                    .get("content", {})
                    .get("application/json", {})
                    .get("schema", {}))
        if path and body:
            tools.append({
                "type": "function",
                "function": {
                    "name": slug,
                    "description": path.get("summary", ""),
                    "parameters": body
                }
            })
    return tools

def call_tool(name: str, args: dict) -> str:
    res = requests.post(
        f"{SLOP_URL}/v1/{name}", json=args,
        headers={"Authorization": f"Bearer {SLOP_KEY}"}
    )
    return json.dumps(res.json())

def run_agent(user_message: str) -> str:
    tools = get_tools(["dns-lookup", "ssl-check", "http-request",
                       "whois", "hash", "memory-set", "memory-get"])
    messages = [{"role": "user", "content": user_message}]

    while True:
        response = client.chat.completions.create(
            model=MODEL,
            messages=messages,
            tools=tools,
            tool_choice="auto"
        )
        msg = response.choices[0].message

        # Stop once the model answers without requesting tools; this is
        # more robust than matching a specific finish_reason string.
        if not msg.tool_calls:
            return msg.content or ""

        messages.append(msg)
        for tc in (msg.tool_calls or []):
            result = call_tool(tc.function.name, json.loads(tc.function.arguments))
            messages.append({
                "role": "tool",
                "tool_call_id": tc.id,
                "content": result
            })

answer = run_agent("Audit slopshop.gg: DNS records, SSL cert, HTTP headers, save all to memory")
print(answer)

What You Get

Open models + real execution

Together AI gives you the best open models at scale. Slopshop gives them real tools to act with, across 82 categories: no simulated outputs, no hallucinated function results.

🀅

82 Categories of Real Tools

DNS, SSL, HTTP, crypto, hashing, code execution, data transforms. All real. Verified with _engine: real on every response.
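As a sketch of how an agent loop might enforce that marker before trusting a result (the `_engine` field name comes from the text above; the helper itself is hypothetical, not part of any Slopshop SDK):

```python
import json

def verify_result(result_json: str) -> dict:
    """Parse a tool result and reject it unless the `_engine` marker
    says the output came from real execution.

    Assumes `_engine: "real"` is a top-level field on every Slopshop
    response, as described above; the helper name is illustrative.
    """
    data = json.loads(result_json)
    if data.get("_engine") != "real":
        raise ValueError(f"unverified tool result: {data.get('_engine')!r}")
    return data

checked = verify_result('{"_engine": "real", "a_records": ["1.2.3.4"]}')
print(checked["a_records"][0])  # → 1.2.3.4
```

Dropping unverified results before they reach the message history keeps a hallucinated or simulated payload from steering later turns of the agent.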

🧠

Free Persistent Memory

Together agents can write to Slopshop's key-value store at zero credit cost. State persists across API calls automatically.
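A minimal sketch of the round-trip those tools give an agent, with a local dict standing in for the hosted store (the `memory-set`/`memory-get` slugs appear in the code example above; the `key`/`value` argument shape is an assumption, not a documented schema):

```python
import json

# Local stand-in for Slopshop's hosted key-value store; in production
# these calls would POST to /v1/memory-set and /v1/memory-get instead.
_store: dict[str, str] = {}

def memory_set(args: dict) -> str:
    _store[args["key"]] = args["value"]
    return json.dumps({"ok": True})

def memory_get(args: dict) -> str:
    return json.dumps({"value": _store.get(args["key"])})

# One agent turn writes state; a later turn reads it back.
memory_set({"key": "audit:slopshop.gg", "value": "ssl=ok;dns=ok"})
print(json.loads(memory_get({"key": "audit:slopshop.gg"}))["value"])
# → ssl=ok;dns=ok
```

Because the store lives server-side, the second call can come from a completely separate API session and still see the saved value.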

🔗

Any Together Model

Llama 3.3, Llama 3.1, Mixtral, Qwen 2.5 — any Together model with function calling support works with Slopshop tools.

👀

Full Observability

Audit logs, per-request tracing, credit usage per tool call. Every invocation logged with inputs, outputs, and latency.

🏠

Self-Hostable

Run Slopshop on your own infrastructure. Only Together API calls go external — all compute tools run locally.
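In client terms, self-hosting only changes the Slopshop base URL; a sketch, assuming an environment-variable override (the `SLOPSHOP_URL` name and localhost port are both hypothetical):

```python
import os

# Only the Slopshop endpoint moves when self-hosting; Together AI
# inference stays remote, so the Together base_url is unchanged.
SLOP_URL = os.environ.get("SLOPSHOP_URL", "https://slopshop.gg")
# self-hosted: export SLOPSHOP_URL=http://localhost:8080  (port is illustrative)

print(SLOP_URL)
```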


Open models + real tools across 82 categories

Together AI + Slopshop. Change base_url. Pick any model. 82 categories of tools.
Open models execute — for real.

Get Started Free · OpenAPI Schema · Browse All Tools
→ Full Docs → Browse Tools → Agent Templates → Savings Calculator