What Separates a 94-Scoring Lead From a 67-Scoring One
Published 7 April 2026 · 7 min read · LeadTYZER Intelligence
Every lead delivered by LeadTYZER has been scored 1-100 by AI across 12 dimensions. A score of 70 is the minimum for delivery. But there is a massive difference between a 70 and a 94. Here is exactly what creates that gap — and why it should change how you prioritise your outreach.
Why lead scoring matters more than lead volume
The B2B sales industry has spent a decade optimising for volume. More leads, more emails, more calls. The result is a market where decision-makers receive an average of 121 cold emails per week and reply to fewer than 3% of them. The volume game is over. The quality game is just beginning.
Internal data from LeadTYZER clients shows a clear pattern: leads scoring 85-100 generate reply rates of 8-14%. Leads scoring 70-84 generate reply rates of 4-7%. Leads scoring below 70 — which LeadTYZER does not deliver — generate reply rates under 2%, comparable to industry averages on unverified lists.
The compounding effect is significant. An SDR working 50 leads at 90+ score will book more meetings than the same SDR working 200 leads at 72 average score. The difference is not the SDR. It is the data.
The 12 dimensions explained
LeadTYZER's AI scoring engine evaluates every lead on exactly 12 dimensions before it can be delivered to a client. Each dimension is weighted based on its empirical correlation with positive outreach outcomes. Here is what each dimension measures and what separates a high score from a low one.
1. Contact completeness (15% weight)
Does the lead have name, title, email, phone, and LinkedIn?
A lead with a verified email, full name, specific job title, direct phone, and LinkedIn URL scores 9-10. A lead with only an email and company name scores 3-4. Missing the phone alone drops a lead by 2 points. Missing LinkedIn drops it by 1. This dimension ties email deliverability for the highest weight because incomplete contacts cannot be reached across multiple channels.
2. Email deliverability (15% weight)
Will the email actually land in the inbox?
Verified as valid by our engine with 90%+ confidence scores 10. Verified as valid with 70-89% confidence scores 7-8. Verification failed or bounced scores 0 — this lead never passes the quality gate regardless of other scores. Email deliverability is a binary gate as well as a scoring dimension.
3. Sector relevance (12% weight)
Does the company actually match the requested sector?
An exact match to the requested sector scores 10. A closely related sector scores 6-8. An unrelated sector scores 0-2. A SaaS company requesting finance leads and receiving a manufacturing company scores 0 on this dimension. The AI cross-references sector classification against multiple sources to prevent false positives.
4. Company data quality (12% weight)
Is the company data complete and consistent?
Company with verified website, Companies House or equivalent registry entry, sector classification from two sources, employee count, and LinkedIn company page scores 9-10. A company with only a name and website scores 5-6. Government registry confirmation adds 2-3 points to this dimension.
5. Geographic match (10% weight)
Is the company actually in the requested country?
Confirmed match via registry data or website geolocation scores 10. Probable match based on TLD and address format scores 7-8. Uncertain match scores 4-5. A company registered in the requested country but headquartered elsewhere scores 6.
6. Source reliability (8% weight)
How authoritative is the data source?
Government business registry (Companies House, SIRENE, FSC) scores 10. Our verification engine domain search scores 8. Europages or Kompass directory scores 6. Crawled website data scores 5. Manual entry or unverified source scores 1-2. Source reliability is why government registry leads consistently outperform directory leads.
7. Website authority (8% weight)
Does the company have a legitimate, active web presence?
HTTPS website with domain age 3+ years, no broken pages, and matching company name scores 9-10. HTTP or recently registered domain scores 4-6. No website or parked domain scores 0. A working website signals an active company with real operations.
8. Company size match (8% weight)
Does the employee count match what the client requested?
Exact bracket match scores 10. Adjacent bracket scores 6-7. Two brackets away scores 3-4. Completely outside requested range scores 0-1. If a client requests 51-200 employees and the company has 12 employees, it scores 1 on this dimension regardless of other quality signals.
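The bracket logic above can be sketched as a simple distance function. The bracket boundaries below are illustrative (LeadTYZER's actual brackets are not published here), and where the article gives a band (e.g. 6-7), the sketch picks one value:

```python
# Hypothetical employee-count brackets, ordered smallest to largest.
BRACKETS = ["1-10", "11-50", "51-200", "201-500", "501-1000", "1000+"]

def size_match_score(requested: str, actual: str) -> int:
    """Score the company-size match by how many brackets apart
    the actual headcount is from the requested one."""
    distance = abs(BRACKETS.index(requested) - BRACKETS.index(actual))
    if distance == 0:
        return 10   # exact bracket match
    if distance == 1:
        return 7    # adjacent bracket
    if distance == 2:
        return 4    # two brackets away
    return 1        # completely outside the requested range
```

So a client requesting 51-200 and receiving a 12-person company (bracket 11-50, one step away) would see a 7 here, while a 1000+ company would score 1.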
9. Duplicate risk (8% weight)
Has this contact been sold to this client before?
Never delivered to this client scores 10. Already in this client's inventory scores 0 — hard block regardless of other scores. This dimension guarantees clients never receive the same contact twice.
10. Data freshness (7% weight)
How recently was this data collected and verified?
Collected within 7 days scores 10. Within 30 days scores 8. Within 90 days scores 6. Within 180 days scores 3. Beyond 180 days triggers automatic re-verification before the lead can be delivered. Stale data is a major cause of bounced emails and dead phone numbers.
11. LinkedIn presence (7% weight)
Do the company and contact have active LinkedIn profiles?
Both company page and contact LinkedIn URL verified scores 10. Company page only scores 6. Contact LinkedIn only scores 5. Neither scores 0. LinkedIn presence is a strong signal of professional activity and increases reply rates by approximately 40% when used in personalised sequences.
12. Intent signals (5% weight)
Is the company showing buying signals?
Active hiring for relevant roles (e.g. VP Sales, Growth Manager), recent funding announcement, or expansion news scores 8-10. Recent industry news mention scores 5-7. No signals scores 3 (neutral, not penalised). Intent signals are the lowest-weight dimension because they are the most difficult to verify consistently.
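Putting the twelve dimensions together: each is scored 0-10, weighted, and combined into the 1-100 composite, with email deliverability and duplicate risk acting as hard gates. This is a minimal sketch, not LeadTYZER's actual engine; the dimension keys are shorthand, and the composite normalizes by the total weight so a perfect lead maps to exactly 100:

```python
# Dimension weights as listed above (shorthand keys).
WEIGHTS = {
    "contact_completeness": 15, "email_deliverability": 15,
    "sector_relevance": 12, "company_data_quality": 12,
    "geographic_match": 10, "source_reliability": 8,
    "website_authority": 8, "company_size_match": 8,
    "duplicate_risk": 8, "data_freshness": 7,
    "linkedin_presence": 7, "intent_signals": 5,
}

def composite_score(dims: dict[str, float]) -> float:
    """Combine per-dimension 0-10 scores into a 0-100 composite.
    A failed deliverability check or a duplicate is a hard block:
    the lead scores 0 regardless of its other dimensions."""
    if dims["email_deliverability"] == 0 or dims["duplicate_risk"] == 0:
        return 0.0
    total_weight = sum(WEIGHTS.values())
    weighted = sum(WEIGHTS[d] * dims[d] for d in WEIGHTS)
    # Normalize by total weight so all-10s yields exactly 100.
    return round(weighted * 10 / total_weight, 1)
```

A lead scoring 10 everywhere comes out at 100; a lead scoring 7 across the board lands at 70, the delivery threshold.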
Score benchmarks by source type
Based on LeadTYZER delivery data across 248,431 verified leads, average scores by primary source are: government registry leads average 84.7, Hunter.io domain search leads average 78.3, Europages/Kompass directory leads average 71.2, and enrichment-sourced leads average 73.6. This is why LeadTYZER always checks verified inventory and runs targeted collection before using any paid enrichment source — registry and domain-search leads consistently outscore enrichment-sourced ones.
How to use score data in your sequences
The score and breakdown are delivered with every lead in the CSV export. Use the breakdown to personalise your outreach. A lead with a 10 on intent signals (active hiring for a sales role) should receive a different opening line than a lead with a 10 on source reliability (government registry confirmed) but a 3 on intent signals. The breakdown tells you what is compelling about each specific contact.
The most effective sequence strategy is to tier your outreach by score. Leads scoring 90-100 receive a fully personalised email with a specific call to the intent signal or company detail that drove the high score. Leads scoring 80-89 receive a templated email with company name and title personalisation. The effort matches the quality signal.
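The tiering above can be encoded as a simple lookup. The label for the 70-79 band is an assumption (the article only specifies the 90+ and 80-89 tiers), so treat it as a placeholder for your own baseline sequence:

```python
def outreach_tier(score: int) -> str:
    """Map a delivered lead score to an outreach effort tier."""
    if score >= 90:
        return "fully personalised"   # call out the signal that drove the score
    if score >= 80:
        return "templated + name/title personalisation"
    if score >= 70:
        return "standard sequence"    # assumed tier for the minimum band
    raise ValueError("leads scoring below 70 are never delivered")
```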
Frequently Asked Questions
What is a good B2B lead quality score?
A score of 70-79 is acceptable. 80-89 is good. 90-100 is excellent. LeadTYZER delivers only leads scoring 70 or above with verified email. The platform average across all delivered leads is 84.2.
Can I filter leads by score after delivery?
Yes. The CSV export includes the full score and a breakdown JSON for every lead. You can sort by score in Excel or import directly to your CRM with score as a field.
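A minimal sorting pass over the export might look like this. The column names (`score`, `score_breakdown`) are assumptions about the CSV layout, so adjust them to match your actual export headers:

```python
import csv
import json

def leads_by_score(csv_path: str) -> list[dict]:
    """Read a lead export CSV and return rows sorted by score,
    highest first, with the breakdown JSON parsed into a dict."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        leads = list(csv.DictReader(f))
    for lead in leads:
        lead["score"] = float(lead["score"])
        lead["score_breakdown"] = json.loads(lead["score_breakdown"])
    return sorted(leads, key=lambda lead: lead["score"], reverse=True)
```

From there, the parsed breakdown lets you route each lead into the right sequence tier before it ever reaches the CRM.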
Does a high score guarantee a reply?
No score model guarantees replies. A 95-scoring lead is the highest-quality contact data available, but the reply depends on your message, timing, and offer. What it guarantees is that the email address works, the person is who we say they are, and they match your criteria exactly.
How does the scoring handle leads from different countries?
Geographic match is scored relative to the client's criteria, not an absolute standard. A lead in Mauritius scores 10 on geographic match for a client who requested Mauritius, regardless of the absolute size of the Mauritian market.