Every LinkedIn lead Scrupp scrapes can hit your webhook URL in real time, with verified email, phone, and company data in the JSON payload. Plug into Zapier, Make, n8n, or your own backend — and fan out to any CRM, database, or Slack channel in under 5 minutes.
Quick answer
In Scrupp settings, paste a webhook URL. Pick which events trigger it — scrape completed, lead enriched, or both. Scrupp POSTs a JSON payload with every new lead (name, title, company, LinkedIn URL, verified email, phone, confidence score) to your endpoint. The request is signed with your webhook secret (HMAC-SHA256 in the X-Scrupp-Signature header) so you can verify it came from Scrupp. On a 5xx response Scrupp retries 3× with exponential backoff. Replay any past delivery from the dashboard.
3-step setup
No OAuth, no SDK. Just a URL and a secret.
1. In Scrupp settings → Webhooks, paste the HTTPS endpoint that will receive payloads. Scrupp pings it once with a ping event to verify it's reachable.
2. Scrupp generates a per-webhook signing secret. Store it in your environment. Verify every request by HMAC-ing the raw body with this secret and comparing to the X-Scrupp-Signature header.
3. Toggle any combination of events: scrape.completed, lead.enriched, lead.unverified. Most users subscribe to lead.enriched only — skip rows where Scrupp couldn't verify an email.
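A defensive router for the event types above can be a few lines. This is a sketch: the action strings ("upsert", "skip", and so on) are placeholders for your own handlers, not part of the Scrupp payload.

```javascript
// Minimal routing sketch for Scrupp webhook event types.
// Action names are illustrative placeholders, not Scrupp APIs.
function actionFor(eventName) {
  switch (eventName) {
    case "ping":             return "ack";         // endpoint check on save
    case "lead.enriched":    return "upsert";      // verified email: push downstream
    case "lead.unverified":  return "skip";        // no verified email: drop the row
    case "scrape.completed": return "log-summary"; // end-of-scrape stats
    default:                 return "ignore";      // unknown or future event types
  }
}
```

Routing defensively on the default case means a future event type added by Scrupp won't break your handler.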
JSON payload
Stable shape, versioned, fully documented.
{
  "event": "lead.enriched",
  "api_version": "2026-04-01",
  "delivery_id": "del_01HW7F3PMK3X9NRQ0V2JY0D8ZE",
  "created_at": "2026-04-15T10:12:04Z",
  "data": {
    "scrape_id": "scr_01HW7F3PMJ0XQWK8N1T2GS6V4C",
    "source": "sales-navigator",
    "source_url": "https://www.linkedin.com/sales/search/people?...",
    "full_name": "Sarah Chen",
    "first_name": "Sarah",
    "last_name": "Chen",
    "job_title": "VP Sales",
    "company_name": "Vectrix",
    "company_domain": "vectrix.io",
    "linkedin_url": "https://www.linkedin.com/in/sarah-chen-vectrix",
    "email": "sarah@vectrix.io",
    "email_confidence": "A",
    "phone": "+14155550192",
    "location": "San Francisco, CA",
    "industry": "B2B SaaS",
    "headcount": "51-200",
    "scraped_at": "2026-04-15T10:12:03Z"
  }
}
Verify each delivery on your side: HMAC the raw request body with your webhook secret and compare the result to the X-Scrupp-Signature header in constant time. A Node/Express handler:

import crypto from "crypto";
import express from "express";

const app = express();

// Parse the raw body: re-serialized JSON won't match the signature.
app.post("/scrupp-webhook", express.raw({ type: "application/json" }), (req, res) => {
  const sig = Buffer.from(req.header("X-Scrupp-Signature") || "");
  const expected = Buffer.from(
    crypto
      .createHmac("sha256", process.env.SCRUPP_WEBHOOK_SECRET)
      .update(req.body)
      .digest("hex")
  );
  // timingSafeEqual throws on length mismatch, so check lengths first
  if (sig.length !== expected.length || !crypto.timingSafeEqual(sig, expected)) {
    return res.status(401).end();
  }
  const event = JSON.parse(req.body);
  // handle event.data ...
  res.status(200).end();
});
Use cases
Plug Scrupp into the rest of your stack without writing any new scrapers.
Zapier, Make & n8n: Pipe Scrupp into any of 5,000+ destinations without writing code. A Zap takes 30 seconds to set up.
CRM sync: Create new contact records on lead.enriched. Use Zapier or a tiny backend to map Scrupp fields to CRM properties.
Slack alerts: Ping a channel when a high-confidence lead from a target account hits the queue. Route to the right SDR.
Data warehouse: Insert every enriched lead straight into your warehouse for BI, attribution, and de-dup against existing contacts.
Cold email sequencing: Auto-upload enriched leads into a cold email campaign. Scrupp scrapes, webhook fires, sequence starts — no CSV step.
Custom backend: POST directly to your own backend. Two dozen lines of code + HMAC verification and you're done.
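Mapping Scrupp fields to CRM properties is a small transform. In this sketch the keys on the left come from the Scrupp payload's data object; the keys on the right (firstname, jobtitle, ...) are hypothetical placeholders, not a real CRM schema.

```javascript
// Hedged sketch: map a Scrupp lead payload's data object to generic
// CRM contact properties. The output keys are placeholders; adjust
// them to your CRM's actual field names.
function toCrmContact(lead) {
  return {
    firstname: lead.first_name,
    lastname: lead.last_name,
    email: lead.email,
    phone: lead.phone,
    jobtitle: lead.job_title,
    company: lead.company_name,
    // derive a website URL from the company domain when present
    website: lead.company_domain ? `https://${lead.company_domain}` : null,
    linkedin_url: lead.linkedin_url,
  };
}
```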
Reliability
On any 5xx or network error, Scrupp retries 3 times with exponential backoff (10s, 60s, 300s). 4xx responses are treated as final — Scrupp assumes the request was rejected on purpose.
Every delivery attempt is logged for 30 days with full request body, response body, and status. Replay any failed delivery from the dashboard with one click.
Every payload has a unique delivery_id. Store it on your side to dedupe retries or duplicate deliveries.
Scrupp caps webhook delivery at 100 requests/second per endpoint. For high-volume scrapes, Scrupp queues deliveries and drains at the cap — no payload is ever dropped.
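Deduping on delivery_id can be as small as a set lookup. This sketch uses an in-memory Set, which only works for a single process; use Redis or a database unique constraint in production.

```javascript
// Sketch: at-least-once delivery means a retry can arrive after a
// slow 200. Track processed delivery_ids and skip repeats.
// In-memory only; swap for a persistent store across restarts
// or multiple instances.
const seenDeliveries = new Set();

function isDuplicate(deliveryId) {
  if (seenDeliveries.has(deliveryId)) return true;
  seenDeliveries.add(deliveryId);
  return false;
}
```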
FAQ
Everything you need to ship.
Is webhook delivery included on every plan?
Yes. Webhook delivery is included on every Scrupp plan, including free. You only pay for the enrichment credits used during scraping — not for delivering payloads.
Can I run more than one webhook?
Yes. You can configure up to 10 webhooks per account, each with its own URL, secret, and event subscription. Fan out Scrupp to Slack, HubSpot, and your warehouse simultaneously.
Which events are available?
scrape.completed (fires once at the end of a scrape with summary stats), lead.enriched (fires for every row where Scrupp verified an email), lead.unverified (fires for every row where enrichment failed), and ping (fires on save to verify your endpoint).
How do I verify a request came from Scrupp?
Scrupp signs every request with HMAC-SHA256 using your per-webhook secret. The signature is in the X-Scrupp-Signature header. Compute HMAC-SHA256(rawBody, secret) on your side and compare with constant-time equality. Code samples for Node, Python, PHP, and Go are in the docs.
How long does my endpoint have to respond?
Scrupp gives you 30 seconds to respond before marking the delivery as failed. For slow processing, respond 200 immediately and queue the work on your side.
Can I use webhooks without writing code?
Yes — point the webhook at Zapier, Make, or n8n. All three have native webhook triggers and 5,000+ downstream destinations. No code required.
How is this different from the Zapier app?
The Zapier app (coming soon) wraps this webhook in a Zapier-native trigger for non-technical users. Under the hood it uses the same webhook. Developers who want full control should use the raw webhook directly.
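The "respond 200 immediately, queue the work" pattern can be sketched without any framework. handleDelivery and drainOne are illustrative names, not Scrupp APIs; in a real service the queue would be a job system like BullMQ or SQS.

```javascript
// Sketch: acknowledge fast, process later. The webhook handler only
// enqueues the event (cheap, well inside the 30-second window); a
// separate drain step does the slow work (CRM writes, alerts, etc.).
const pending = [];

function handleDelivery(event) {
  pending.push(event); // never blocks on downstream systems
  return 200;          // the status your HTTP handler sends right away
}

function drainOne(worker) {
  const event = pending.shift();
  if (event) worker(event); // slow work happens off the request path
  return pending.length;    // remaining backlog
}
```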
Related
Keep exploring: Google Sheets integration · Lead Enrichment API · Scrupp REST API · Sales Navigator Scraper · LinkedIn Scraper
JSON in, leads out. Works with Zapier, Make, n8n, and any backend that speaks HTTP.
Included on every plan · HMAC-signed · Full replay from dashboard