Benchmark Your Scraper in 5 Steps
ScrapeMe is a scraping benchmark platform with 18 test scenarios, automated scoring (max 1,800 points), bot detection, and a public leaderboard. Register your scraper, run the tests, and see how you rank.
Register Your Scraper
Go to the registration page, enter your scraper's name, and you'll receive an API key. This key links your test runs to your leaderboard profile.
Save your API key somewhere safe — you'll include it in every test run to track your scores.
Create a Test Run
A test run is a single benchmarking session. Create one to get a run-level API key and the list of all 18 tests with their endpoints.
```shell
curl -X POST https://scrapeme.loyalleads.co.uk/api/test-run \
  -H "Content-Type: application/json" \
  -d '{"name": "My first run", "scraperApiKey": "YOUR_SCRAPER_API_KEY"}'
```

Response:

```json
{
  "runId": "cm9f...",
  "apiKey": "tr_abc123...",
  "tests": [
    { "slug": "ecommerce", "name": "E-commerce Listing", "maxScore": 100, "url": "/tests/ecommerce" },
    { "slug": "real-estate", "name": "Real Estate Listings", "maxScore": 100, "url": "/tests/real-estate" },
    ...18 tests total
  ],
  "submitUrl": "/api/test-run/{runId}/submit",
  "reportUrl": "/api/test-run/{runId}/report"
}
```

The apiKey in the response is the run-level key (different from your scraper API key). Use it as a Bearer token for all submit and report calls.
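In Python, the create-run response can be unpacked into everything later steps need. A minimal sketch — `parse_run` is a hypothetical helper, not part of the platform; the sample data mirrors the response shape shown above:

```python
def parse_run(run_response):
    """Pull the run id, the Bearer header for later calls, and a test summary
    out of the POST /api/test-run response."""
    tests = run_response["tests"]
    return {
        "run_id": run_response["runId"],
        # The run-level key (not your scraper key) goes in the Authorization header.
        "auth": {"Authorization": f"Bearer {run_response['apiKey']}"},
        "test_count": len(tests),
        "max_total": sum(t["maxScore"] for t in tests),
    }

# Example with a response shaped like the one documented above
# (truncated to two of the 18 tests):
sample = {
    "runId": "cm9f...",
    "apiKey": "tr_abc123...",
    "tests": [
        {"slug": "ecommerce", "name": "E-commerce Listing", "maxScore": 100, "url": "/tests/ecommerce"},
        {"slug": "real-estate", "name": "Real Estate Listings", "maxScore": 100, "url": "/tests/real-estate"},
    ],
}
info = parse_run(sample)
print(info["run_id"], info["test_count"], info["max_total"])
```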
Pick a Test & Scrape
Each test page has a Prompt section with a ready-to-use task description including full URLs. Copy the prompt, run your scraper, and collect the data.
The 18 tests fall into five categories:

- Data Extraction
- Access Control
- API Challenges
- Crawler Compliance
- Bot Detection

Each test page is also served in two HTML variants; switch by appending ?variant=classic to the URL:

- Modern (default) — Tailwind CSS, React components, div-heavy markup. Harder to scrape.
- Classic — Semantic HTML (<table>, <article>, <address>, <dl>), schema.org microdata, traditional CSS classes. Easier to scrape.

Same data, same scoring — only the HTML structure changes. Test both to measure your scraper's adaptability.
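Fetching both variants of the same test is just a query-string change. A small sketch (the `variant_url` helper is illustrative, not part of the API; only the ?variant=classic parameter comes from the docs above):

```python
BASE = "https://scrapeme.loyalleads.co.uk"

def variant_url(slug, variant=None):
    """Return the test page URL, optionally forcing a non-default variant."""
    url = f"{BASE}/tests/{slug}"
    return f"{url}?variant={variant}" if variant else url

# Scrape the same test twice to compare your scraper on both markups:
modern = variant_url("ecommerce")              # Modern markup (default)
classic = variant_url("ecommerce", "classic")  # Semantic HTML variant
print(modern)
print(classic)
```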
Example: E-commerce test prompt
```
Go to https://scrapeme.loyalleads.co.uk/tests/ecommerce and scrape all 500 products
from the paginated catalog. For each product, extract the SKU,
name, price, original price, category, brand, rating, review
count, and stock status. Navigate through all pages using the
?page=N parameter (24 products per page).
```

Every test page has its own prompt — click "Test Passing Criteria" then copy the Prompt section.
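The pagination described in the prompt can be walked with a simple loop. A sketch under the prompt's stated counts (500 products, 24 per page); `page_urls` is a hypothetical helper, not part of the platform:

```python
import math

def page_urls(base_url, total_items, per_page):
    """List every ?page=N URL needed to cover the full catalog."""
    pages = math.ceil(total_items / per_page)
    return [f"{base_url}?page={n}" for n in range(1, pages + 1)]

# 500 products at 24 per page -> 21 pages
urls = page_urls("https://scrapeme.loyalleads.co.uk/tests/ecommerce", 500, 24)
print(len(urls), urls[0], urls[-1])
```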
Submit Your Results
After scraping a test, send the extracted data for verification. The API checks your data against the database and returns a score immediately.
```shell
curl -X POST https://scrapeme.loyalleads.co.uk/api/test-run/{runId}/submit \
  -H "Authorization: Bearer {run_apiKey}" \
  -H "Content-Type: application/json" \
  -d '{
    "testSlug": "ecommerce",
    "data": {
      "products": [
        { "sku": "SKU001", "name": "Wireless Headphones", "price": 29.99 },
        { "sku": "SKU002", "name": "USB-C Cable", "price": 9.99 }
      ]
    }
  }'
```

Response:

```json
{
  "testSlug": "ecommerce",
  "passed": true,
  "score": 95,
  "maxScore": 100,
  "details": {
    "expected": 24,
    "matched": 23,
    "errors": ["SKU042: price mismatch"]
  }
}
```

Each test can only be submitted once per run. Submit results for as many of the 18 tests as you like — you don't have to do all of them.
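Since each test accepts only one submission per run, a thin client-side guard can prevent a wasted attempt. A minimal sketch — `RunSubmitter` is a hypothetical wrapper, not part of the platform; only the endpoint, header, and payload shape come from the curl call above:

```python
class RunSubmitter:
    """Guard against resubmitting a test slug within the same run.

    `post` is any callable with the requests.post signature; inject
    requests.post for real use, or a stub for offline testing.
    """

    def __init__(self, base, run_id, api_key, post):
        self.url = f"{base}/api/test-run/{run_id}/submit"
        self.headers = {"Authorization": f"Bearer {api_key}"}
        self.post = post
        self.submitted = set()

    def submit(self, test_slug, data):
        if test_slug in self.submitted:
            raise ValueError(f"{test_slug!r} was already submitted in this run")
        self.submitted.add(test_slug)
        resp = self.post(self.url, headers=self.headers,
                         json={"testSlug": test_slug, "data": data})
        return resp.json()
```

In real code, construct it as `RunSubmitter(BASE, run_id, api_key, requests.post)` and call `submit` once per test; the guard only catches accidental local resubmits — the server enforces the limit regardless.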
View Your Score
When you're done submitting, call the report endpoint. This marks the run as completed, updates your leaderboard ranking, and returns a full breakdown.
```shell
curl https://scrapeme.loyalleads.co.uk/api/test-run/{runId}/report \
  -H "Authorization: Bearer {run_apiKey}"
```

Response:

```json
{
  "id": "cm9f...",
  "status": "completed",
  "summary": {
    "totalScore": 1650,
    "maxScore": 1800,
    "percentage": 92,
    "testsPassed": 16,
    "testsTotal": 18
  },
  "results": [
    { "testSlug": "ecommerce", "passed": true, "score": 95, "maxScore": 100 },
    { "testSlug": "real-estate", "passed": true, "score": 100, "maxScore": 100 },
    ...
  ],
  "botAnalysis": {
    "botScore": 35,
    "humanScore": 65,
    "classification": "Likely Bot",
    "signals": ["Missing canvas fingerprint", "No mouse movements"]
  }
}
```

You can also view your results in the browser at /results/{runId}#{run_apiKey}
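Once you have the report JSON, it's easy to pull out what still needs work. A hedged sketch — both helpers are hypothetical, and the sample data mirrors the report shape above (the "captcha" slug is illustrative):

```python
def failed_tests(report):
    """Slugs of tests that did not pass, from the report's results array."""
    return [r["testSlug"] for r in report.get("results", []) if not r["passed"]]

def summary_line(report):
    """One-line recap of the report's summary block."""
    s = report["summary"]
    return (f"{s['totalScore']}/{s['maxScore']} ({s['percentage']}%), "
            f"{s['testsPassed']}/{s['testsTotal']} passed")

sample_report = {
    "summary": {"totalScore": 1650, "maxScore": 1800, "percentage": 92,
                "testsPassed": 16, "testsTotal": 18},
    "results": [
        {"testSlug": "ecommerce", "passed": True, "score": 95, "maxScore": 100},
        {"testSlug": "captcha", "passed": False, "score": 20, "maxScore": 100},
    ],
}
print(summary_line(sample_report))
print(failed_tests(sample_report))
```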
Complete Example
End-to-end Python script that creates a run, scrapes the cookie-consent test (the simplest), submits the result, and gets the report.
```python
import requests

BASE = "https://scrapeme.loyalleads.co.uk"

# 1. Create a test run
run = requests.post(f"{BASE}/api/test-run", json={
    "name": "Quick test",
    "scraperApiKey": "YOUR_SCRAPER_API_KEY"  # from /leaderboard/register
}).json()
run_id = run["runId"]
api_key = run["apiKey"]
print(f"Run created: {run_id}")

# 2. Scrape the cookie-consent test (simplest test)
session = requests.Session()
session.cookies.set("sm_consent", "accepted", domain="scrapeme.loyalleads.co.uk")
page = session.get(f"{BASE}/tests/cookie-consent")

# 3. Parse the articles (6 articles with title, excerpt, author, date, category)
# ... your scraping logic here ...
articles = [{"title": "...", "excerpt": "...", "author": "...", "date": "...", "category": "..."}]

# 4. Submit results
result = requests.post(
    f"{BASE}/api/test-run/{run_id}/submit",
    headers={"Authorization": f"Bearer {api_key}"},
    json={"testSlug": "cookie-consent", "data": {"articles": articles}}
).json()
print(f"Score: {result['score']}/{result['maxScore']}")

# 5. Get your report
report = requests.get(
    f"{BASE}/api/test-run/{run_id}/report",
    headers={"Authorization": f"Bearer {api_key}"}
).json()
print(f"Total: {report['summary']['totalScore']}/{report['summary']['maxScore']}")
print(f"Bot score: {report['botAnalysis']['botScore']}/100")
```

Recommended Test Order
Start with the easiest tests and work your way up.
- Cookie Consent
- Pagination
- Downloads
- Hidden Data
- E-commerce
- Real Estate
- Auction
- Rate Limited
- Redirects
- JS Rendered
- Auth
- CSRF
- GraphQL
- Robots
- Honeypot
- Captcha
- Crawler Traps
- Iframe