Export a competitor's store in 60 seconds
Zero-to-CSV walkthrough — take any Shopify store domain and end up with a full product catalog on disk in under a minute.
Overview
This guide takes you from "I want to export competitor X's catalog" to "products.csv is on my disk" in four steps, using either the web UI or the API. By the end you'll have a Shopify-compatible CSV, a products JSON file, and a downloadable report URL you can share with your team.
Prerequisite: A ShopSniffer API key (free to create once you sign up) or $9.99 in crypto for the x402 path.
Path A — web UI (no code)
Go to the homepage
Visit shopsniffer.com and enter the competitor's store domain in the input box. For example, gymshark.com.
Click 'Get full report'
The free preview (first 100 products) loads immediately so you can verify it's the right store. Then click the full-report CTA.
Pay
Choose Sign in and pay with card (Stripe) or Pay with crypto (AnySpend accepts any token on any chain). See buying a report for the full payment walkthrough.
Download
The job runs in ~30-60 seconds. When the report page shows "Complete", click Download CSV to get products.csv — the Shopify-import-ready file. JSON files for collections and pages are on the same page.
Done. Import the CSV directly into your own Shopify store via Admin → Products → Import if you're migrating, or open it in a spreadsheet for analysis.
Path B — API (script it)
If you're exporting multiple stores, automating the flow is faster than clicking. This is the minimum viable script.
```bash
#!/usr/bin/env bash
set -euo pipefail

DOMAIN="gymshark.com"
API="https://shopsniffer.com/api"
KEY="ss_your_key_here"

# 1. Create the job
JOB_ID=$(curl -sX POST "$API/jobs" \
  -H "X-API-Key: $KEY" \
  -H "Content-Type: application/json" \
  -d "{\"domain\": \"$DOMAIN\"}" | jq -r '.job.id')
echo "Created job $JOB_ID"

# 2. Poll until complete
while true; do
  STATUS=$(curl -s "$API/jobs/$JOB_ID/status" | jq -r '.status')
  echo "Status: $STATUS"
  [[ "$STATUS" == "completed" ]] && break
  [[ "$STATUS" == "errored" ]] && { echo "Job failed"; exit 1; }
  sleep 5
done

# 3. Download the CSV
curl -s "$API/downloads?jobId=$JOB_ID&key=products.csv" -o "$DOMAIN.csv"
echo "Saved $DOMAIN.csv ($(wc -l < "$DOMAIN.csv") lines)"
```
```python
import requests, time

DOMAIN = "gymshark.com"
API = "https://shopsniffer.com/api"
KEY = "ss_your_key_here"
headers = {"X-API-Key": KEY, "Content-Type": "application/json"}

# 1. Create the job
r = requests.post(f"{API}/jobs", json={"domain": DOMAIN}, headers=headers)
r.raise_for_status()
job_id = r.json()["job"]["id"]
print(f"Created job {job_id}")

# 2. Poll until complete
while True:
    status = requests.get(f"{API}/jobs/{job_id}/status").json()
    print(f"Status: {status['status']}")
    if status["status"] == "completed":
        break
    if status["status"] == "errored":
        raise RuntimeError("Job failed")
    time.sleep(5)

# 3. Download the CSV
csv = requests.get(
    f"{API}/downloads",
    params={"jobId": job_id, "key": "products.csv"},
).text
with open(f"{DOMAIN}.csv", "w") as f:
    f.write(csv)
print(f"Saved {DOMAIN}.csv")
```
```typescript
const DOMAIN = "gymshark.com";
const API = "https://shopsniffer.com/api";
const KEY = "ss_your_key_here";
const headers = { "X-API-Key": KEY, "Content-Type": "application/json" };

// 1. Create the job
const createRes = await fetch(`${API}/jobs`, {
  method: "POST",
  headers,
  body: JSON.stringify({ domain: DOMAIN }),
});
const { job } = await createRes.json();
console.log(`Created job ${job.id}`);

// 2. Poll until complete
let status: { status: string };
do {
  await new Promise((r) => setTimeout(r, 5_000));
  status = await fetch(`${API}/jobs/${job.id}/status`).then((r) => r.json());
  console.log(`Status: ${status.status}`);
  if (status.status === "errored") throw new Error("Job failed");
} while (status.status !== "completed");

// 3. Download the CSV
const csv = await fetch(
  `${API}/downloads?jobId=${job.id}&key=products.csv`,
).then((r) => r.text());
await Bun.write(`${DOMAIN}.csv`, csv);
console.log(`Saved ${DOMAIN}.csv`);
```
For faster completion notification, pass webhook_url when creating the job and skip the polling loop entirely. See webhooks.
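With a webhook, job creation is the same POST as in Path B plus one extra field. A minimal sketch of the request body (the callback URL is a placeholder, and the exact callback contract is defined in the webhooks docs — the snippet below only builds the request without sending it):

```python
import json
import requests

API = "https://shopsniffer.com/api"
KEY = "ss_your_key_here"

# Build (but don't send) the job-creation request, this time with a
# webhook_url; the service POSTs to that URL when the job finishes,
# so the polling loop in Path B can be dropped.
req = requests.Request(
    "POST",
    f"{API}/jobs",
    headers={"X-API-Key": KEY, "Content-Type": "application/json"},
    json={
        "domain": "gymshark.com",
        "webhook_url": "https://example.com/hooks/shopsniffer",  # placeholder
    },
).prepare()

print(req.url)
print(json.loads(req.body)["webhook_url"])
# Send it for real with: requests.Session().send(req)
```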
Batching multiple stores
To export a list of competitors, wrap the above in a loop and parallelize. Rate-wise, a few dozen jobs submitted concurrently is fine; if you need hundreds per hour, contact support to bump your concurrency cap.
```python
COMPETITORS = ["gymshark.com", "lululemon.com", "alo.com", "vuori.com"]

for domain in COMPETITORS:
    export_store(domain)  # the function from Path B above
```
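The sequential loop above waits for each job to finish before submitting the next. Since jobs run server-side, submitting them concurrently brings total wall time closer to a single export. A sketch using the standard-library thread pool (the `export_store` body here is a placeholder for the Path B function, which is not defined in this guide):

```python
from concurrent.futures import ThreadPoolExecutor

COMPETITORS = ["gymshark.com", "lululemon.com", "alo.com", "vuori.com"]

def export_store(domain: str) -> str:
    # Placeholder: wrap the Path B script (create job, poll, download)
    # in a function that returns the saved filename.
    return f"{domain}.csv"

# Submit every store at once; map() preserves input order in its results.
with ThreadPoolExecutor(max_workers=8) as pool:
    saved = list(pool.map(export_store, COMPETITORS))

print(saved)
```

Threads are appropriate here because the work is network-bound; the pool size just caps how many jobs are in flight at once, which keeps you under the concurrency limits mentioned above.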
What you get
Every $9.99 report includes the full files — see downloads for the complete list. For a single-store export you typically want products.csv (for Shopify re-import) or products.json (for data analysis).
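For the analysis use case, the CSV opens cleanly with the standard-library `csv` module. A sketch computing average variant price — the column names (`Handle`, `Title`, `Variant Price`) follow Shopify's product-CSV format, which the export targets, but verify them against your actual download; the inline sample rows stand in for a real `products.csv` here:

```python
import csv
import io
import statistics

# Sample rows in Shopify product-CSV shape (stand-in for products.csv).
sample = """Handle,Title,Variant Price
vital-seamless-leggings,Vital Seamless Leggings,54.00
vital-seamless-leggings,Vital Seamless Leggings,54.00
crest-hoodie,Crest Hoodie,40.00
"""

# For a real file, replace io.StringIO(sample) with open("products.csv").
prices = [float(row["Variant Price"]) for row in csv.DictReader(io.StringIO(sample))]
print(f"{len(prices)} variants, average price ${statistics.mean(prices):.2f}")
# → 3 variants, average price $49.33
```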