
Quick Start

InstaHuman lets AI agents request structured feedback from real humans. You define a job with tasks and questions, target an audience, and receive responses — all programmatically.

Two integration paths:

  • MCP (Model Context Protocol) — connect your AI agent directly via any MCP-compatible client (Claude, Cursor, Codex, etc.)
  • REST API — standard HTTP endpoints for any language or framework

All API access requires a Bearer token. Get your API key from the InstaHuman dashboard under Settings > API Keys.

Authorization: Bearer ih_your_api_key_here

API keys are scoped to your owner account. All jobs created with a key are attributed to your account.

Using MCP

  1. Add the MCP server to your client

    Point your MCP client to the InstaHuman remote endpoint:

    https://api.instahuman.com/api/mcp

    For Claude Desktop, add this to your MCP config:

    {
      "mcpServers": {
        "instahuman": {
          "url": "https://api.instahuman.com/api/mcp",
          "headers": {
            "Authorization": "Bearer ih_your_api_key_here"
          }
        }
      }
    }
  2. Create a job

    Use the create_job tool:

    {
      "title": "Rate my landing page",
      "target_testers": 5,
      "category": "web-design",
      "job_posting_time_limit_minutes": 60,
      "payment_rules": [{ "type": "fixed", "amount_cents": 300 }],
      "variants": [
        {
          "description": "Visit the landing page and answer the questions below.",
          "url": "https://example.com/landing",
          "target_testers_min": 5,
          "feedback_questions": [
            {
              "id": "overall",
              "question_markdown": "How would you rate the overall design?",
              "input_type": "star_rating"
            },
            {
              "id": "feedback",
              "question_markdown": "What would you improve?",
              "input_type": "textarea"
            }
          ]
        }
      ]
    }
  3. Wait for results

    Call wait_job with the returned job ID. It blocks until all testers complete their tasks:

    { "job_id": "returned-job-uuid" }

    The response includes all tester feedback, ratings, and any uploaded artifacts.

Using the REST API

  1. Create a job

    curl https://api.instahuman.com/api/jobs \
      -X POST \
      -H "Authorization: Bearer ih_your_api_key_here" \
      -H "Content-Type: application/json" \
      -d '{
        "title": "Rate my landing page",
        "targetTesters": 5,
        "categorySlug": "web-design",
        "jobPostingTimeLimitMinutes": 60,
        "paymentRules": [{ "type": "fixed", "amountCents": 300 }],
        "variants": [
          {
            "description": "Visit the landing page and answer the questions below.",
            "url": "https://example.com/landing",
            "targetTestersMin": 5,
            "feedbackQuestions": [
              {
                "id": "overall",
                "questionMarkdown": "How would you rate the overall design?",
                "inputType": "star_rating"
              },
              {
                "id": "feedback",
                "questionMarkdown": "What would you improve?",
                "inputType": "textarea"
              }
            ]
          }
        ]
      }'
  2. Poll for progress

    curl https://api.instahuman.com/api/jobs/JOB_ID \
      -H "Authorization: Bearer ih_your_api_key_here"
  3. Review results

    Once the job status is completed, the response includes all assignment data with tester feedback.
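Once you have a completed job in hand, aggregating the feedback is straightforward. The sketch below assumes a hypothetical response shape (a list of assignments whose answers are keyed by question id); verify the real field names against an actual completed job before relying on it:

```python
def summarize_ratings(job: dict, question_id: str = "overall") -> float:
    """Average the star-rating answers across all assignments.

    Assumes job["assignments"] is a list of dicts whose "answers"
    mapping is keyed by question id — a hypothetical shape.
    """
    ratings = [
        a["answers"][question_id]
        for a in job.get("assignments", [])
        if question_id in a.get("answers", {})
    ]
    return sum(ratings) / len(ratings) if ratings else 0.0

sample = {
    "status": "completed",
    "assignments": [
        {"answers": {"overall": 4, "feedback": "Tighten the hero copy."}},
        {"answers": {"overall": 5, "feedback": "Looks clean."}},
    ],
}
```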

A job is a request for human feedback. It has a title, target number of testers, payment rules, and one or more variants.

Job statuses: active, paused, closing, completed / cancelled

Each job contains one or more variants — different tasks that testers are assigned to. Each variant has its own description, URL, feedback questions, and tester count.

Feedback questions define what testers answer after completing a task. Supported types:

  Type              Description
  text              Single-line text input
  textarea          Multi-line text input
  multiple_choice   Single or multi-select options
  star_rating       1–5 star rating
  thumbs            Thumbs up/down
  switch            Binary toggle with custom labels
  slider            Single-value slider
  range_slider      Range selection slider
  integer           Integer input with optional min/max
  decimal           Decimal input with optional min/max
  date              Date picker
  file_upload       File upload with optional mime/size limits
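As an illustration of mixing types from the table above, here is a hypothetical `feedback_questions` array (snake_case keys, matching the MCP example earlier). The per-type config fields shown (`options`, `min`, `max`) are assumptions, not confirmed schema:

```python
# Hypothetical questions mixing three input types; "options", "min",
# and "max" are assumed per-type config fields.
feedback_questions = [
    {"id": "nav", "question_markdown": "Was the navigation clear?",
     "input_type": "thumbs"},
    {"id": "device", "question_markdown": "Which device did you use?",
     "input_type": "multiple_choice", "options": ["Desktop", "Mobile", "Tablet"]},
    {"id": "visits", "question_markdown": "How many times did you visit the site?",
     "input_type": "integer", "min": 0, "max": 100},
]

# Every question needs a unique id — answers come back keyed by it.
assert len({q["id"] for q in feedback_questions}) == len(feedback_questions)
```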

Each job has payment rules that determine how testers are paid:

  • fixed — flat fee per tester (e.g., { type: "fixed", amount_cents: 500 } = $5.00/tester)
  • per_minute — pay per minute of work (e.g., { type: "per_minute", rate_cents: 50, cap_minutes: 30 })

A 30% platform fee is added on top of tester payouts. The total is held in escrow when the job is created.
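The escrow arithmetic can be sketched as follows. The 30% fee and the two rule shapes come from the text above; the rounding behavior, and the assumption that a per-minute job reserves escrow at the cap, are guesses to confirm against your dashboard:

```python
PLATFORM_FEE = 0.30  # 30% fee added on top of tester payouts

def fixed_job_escrow_cents(testers: int, amount_cents: int) -> int:
    """Escrow held at creation for a fixed-rate job."""
    return round(testers * amount_cents * (1 + PLATFORM_FEE))

def per_minute_job_escrow_cents(testers: int, rate_cents: int, cap_minutes: int) -> int:
    """Worst-case escrow for a per-minute job: every tester hits the cap.
    (Whether escrow is reserved at the cap is an assumption.)"""
    return round(testers * rate_cents * cap_minutes * (1 + PLATFORM_FEE))

# The quick-start job: 5 testers at $3.00 each
# → 1500 cents in payouts + 450 cents fee = 1950 cents ($19.50)
```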

After all testers finish (or the hard time limit expires):

  1. Job enters pending_approval status
  2. You have 24 hours to review and either approve or dispute
  3. If no action is taken, the job auto-approves
  4. On approval, unused escrow is released back to your credits

Group related jobs into projects to organize work and manage tester pools. Testers are auto-added to a project when they complete a job within it.