PLATFORM

The Deepinfo Platform brings together five integrated modules that share the same data foundation, the same continuous monitoring engine, and the same intelligence layer. Each module delivers specialist depth in its domain. Together, they cover your entire external threat exposure.
GO TO DETAILS

SOLUTIONS

All Domain Names Feed

The complete list of all registered domain names across every active TLD.
Access over 400 million registered domain names in a single dataset. Continuously maintained as new domains are registered, existing domains are modified, and expired domains are removed. Delivered in CSV and JSON formats, ready for integration into your infrastructure.
ABOUT

What Is the All Domain Names Feed?

The All Domain Names Feed is the most comprehensive registered domain dataset available. It contains every registered domain name across all active top-level domains (TLDs), including generic TLDs (.com, .net, .org), country-code TLDs (.uk, .de, .tr), and new gTLDs (.app, .security, .cloud). The dataset is maintained as a living inventory: new registrations, modifications, and deletions are reflected continuously.

This feed is designed for organizations that need a complete, current view of the global domain namespace. Security companies use it to build threat detection engines. Risk scoring platforms use it as a data foundation. Researchers use it for large-scale internet measurement. Brand protection teams use it to monitor for look-alike domains at scale.
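As an illustration of the brand-protection use case, the feed can be scanned for look-alike registrations with nothing more than the standard library. This is a minimal sketch, assuming newline-delimited JSON records with a `domain` field; the `PROTECTED_BRANDS` list and the similarity threshold are hypothetical values, not part of the feed:

```python
import json
from difflib import SequenceMatcher

PROTECTED_BRANDS = ["deepinfo", "examplebank"]  # hypothetical brands to monitor
THRESHOLD = 0.8  # similarity ratio above which a domain is flagged

def lookalike_candidates(feed_lines, brands=PROTECTED_BRANDS, threshold=THRESHOLD):
    """Yield (domain, brand) pairs where the second-level label closely
    resembles a protected brand without matching it exactly."""
    for line in feed_lines:
        record = json.loads(line)
        label = record["domain"].split(".")[0]  # second-level label, e.g. "deepinf0"
        for brand in brands:
            if label != brand and SequenceMatcher(None, label, brand).ratio() >= threshold:
                yield record["domain"], brand

sample = ['{"domain": "deepinf0.com"}', '{"domain": "weather.org"}']
print(list(lookalike_candidates(sample)))  # → [('deepinf0.com', 'deepinfo')]
```

In production this comparison would run against the daily registered-domains delta rather than the full 400-million-record feed.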

Unlike zone file access or WHOIS scraping, which cover only a fraction of the namespace, the Deepinfo All Domain Names Feed draws from multiple collection methods to achieve the broadest coverage available. The result is a single, unified dataset that reflects the current state of the internet's domain landscape.

 

{
  "domain": "example.com",
  "tld": "com",
  "registration_date": "2005-03-14",
  "expiration_date": "2026-03-14",
  "registrar": "GoDaddy.com LLC",
  "nameservers": [
    "ns1.example.com",
    "ns2.example.com"
  ],
  "dns_status": "active",
  "whois_status": "available",
  "registrant_country": "US",
  "created_at": "2024-01-15T10:32:00Z",
  "updated_at": "2026-03-10T08:22:14Z"
}
METHODOLOGY

How We Collect This Data

The Deepinfo data infrastructure combines multiple collection methods to build and maintain the most comprehensive domain dataset available. No single source covers the entire namespace. Our approach layers multiple techniques to maximize coverage and accuracy.

Zone File Processing
We process TLD zone files from registries that provide them, covering hundreds of TLDs. Zone files are ingested daily and cross-referenced with our existing inventory.
Active DNS Probing
Continuous DNS resolution across known and predicted domain names. We resolve billions of DNS queries to discover domains that don't appear in zone files, including many ccTLDs.
Certificate Transparency
Real-time monitoring of Certificate Transparency logs to discover domains at the moment SSL certificates are issued, often before a domain goes live.
Cross-Source Validation
Every discovered domain is validated across multiple signals: DNS resolution, Whois availability, web response, and certificate status. Invalid or parked entries are flagged accordingly.
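The cross-source validation step can be pictured as a simple decision over the collected signals. The `Signals` fields and status labels below are illustrative only, not Deepinfo's actual pipeline logic:

```python
from dataclasses import dataclass

@dataclass
class Signals:
    """Validation signals gathered for a candidate domain (illustrative)."""
    dns_resolves: bool
    whois_found: bool
    has_certificate: bool
    web_responds: bool

def classify(signals: Signals) -> str:
    """Sketch of a cross-source decision: a domain is accepted only if at
    least one independent source confirms it; weaker evidence is flagged."""
    confirmations = [signals.dns_resolves, signals.whois_found, signals.has_certificate]
    if not any(confirmations):
        return "invalid"   # no source confirms the domain exists
    if signals.dns_resolves and not signals.web_responds:
        return "flagged"   # resolves but serves nothing: likely parked or inactive
    return "active"

print(classify(Signals(True, True, True, True)))     # → active
print(classify(Signals(True, False, False, False)))  # → flagged
print(classify(Signals(False, False, False, False))) # → invalid
```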
DATA COVERAGE

What's Included in Every Record

Each record in the All Domain Names Feed contains the following fields. This is a summary of the primary fields. Full schema documentation with all available attributes is provided with your subscription.

Field              | Type     | Description                                           | Example
domain             | string   | The fully qualified domain name                       | example.com
tld                | string   | The top-level domain                                  | com
registration_date  | date     | Date the domain was first registered                  | 2005-03-14
expiration_date    | date     | Date the domain registration expires                  | 2026-03-14
registrar          | string   | The registrar through which the domain is registered  | GoDaddy.com LLC
nameservers        | array    | Authoritative nameservers for the domain              | ["ns1.example.com"]
dns_status         | string   | Current DNS resolution status                         | active
whois_status       | string   | Whois record availability status                      | available
registrant_country | string   | Country of the registrant (where available)           | US
created_at         | datetime | Timestamp when the record was first added to the feed | 2024-01-15T10:32:00Z
updated_at         | datetime | Timestamp of the most recent update to this record    | 2026-03-10T08:22:14Z
SAMPLE DATA

Sample Data

A representative sample record from the All Domain Names Feed. Actual records may contain additional fields. Full schema documentation is provided with your subscription.

{
  "domain": "example.com",
  "tld": "com",
  "registration_date": "2005-03-14",
  "expiration_date": "2026-03-14",
  "registrar": "GoDaddy.com LLC",
  "nameservers": [
    "ns1.example.com",
    "ns2.example.com"
  ],
  "dns_status": "active",
  "whois_status": "available",
  "registrant_country": "US",
  "created_at": "2024-01-15T10:32:00Z",
  "updated_at": "2026-03-10T08:22:14Z"
}
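Reading such a record requires only the standard library. This sketch parses the date fields from the sample above and computes the registration span:

```python
import json
from datetime import date, datetime

# A trimmed copy of the sample record shown above.
record = json.loads("""{
  "domain": "example.com",
  "registration_date": "2005-03-14",
  "expiration_date": "2026-03-14",
  "dns_status": "active",
  "updated_at": "2026-03-10T08:22:14Z"
}""")

expires = date.fromisoformat(record["expiration_date"])
registered = date.fromisoformat(record["registration_date"])
# fromisoformat() in older Python versions rejects the trailing "Z",
# so normalize it to an explicit UTC offset first.
updated = datetime.fromisoformat(record["updated_at"].replace("Z", "+00:00"))

print(record["domain"], "expires", expires)
print("days registered:", (expires - registered).days)
```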
COVERAGE

Data Coverage and Scale

The All Domain Names Feed covers the broadest scope of registered domains available in a single dataset.

Total Records: Over 400 million registered domain names and growing. The count increases daily as new domains are registered globally.
TLD Coverage: All active TLDs including generic TLDs (.com, .net, .org, .info), country-code TLDs (.uk, .de, .fr, .tr, .jp, and 200+ others), and new gTLDs (.app, .dev, .cloud, .security, and 1,200+ others).
Geographic Scope: Global. Domain names from every country and territory with an active TLD are included.
Historical Depth: The feed represents the current state of the domain namespace. For historical registration and ownership data, see the Historical Whois Records Feed.
Exclusions: Internal-use and reserved domains, unresolvable test entries, and domains that have never been publicly registered are excluded.
Data Freshness and Update Cycle

The All Domain Names Feed is not a static snapshot. It is a continuously maintained dataset that reflects changes in the domain namespace as they happen.

Update Frequency: Continuous. New registrations, modifications, and deletions are processed and reflected in the feed on an ongoing basis.
New Domain Latency: Newly registered domains typically appear in the feed within hours of registration, depending on the TLD and registry.
Modification Detection: Changes to Whois records, DNS configuration, and registration status are detected through regular rescanning and reflected in the updated_at timestamp.
Deletion Handling: Expired, deleted, or dropped domains are marked accordingly. They remain in the dataset with an updated status rather than being silently removed.
Full Rescan Cycle: The entire dataset undergoes a complete validation pass every 3 days to ensure accuracy across all records.
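The updated_at timestamp together with the 3-day rescan cycle gives consumers a simple staleness check for local copies. A hedged sketch: the `RESCAN_CYCLE` constant mirrors the stated cycle, everything else is illustrative:

```python
from datetime import datetime, timedelta, timezone

RESCAN_CYCLE = timedelta(days=3)  # full validation pass interval stated by the feed

def is_stale(updated_at: str, now: datetime) -> bool:
    """A record whose updated_at is older than one full rescan cycle has not
    been revalidated recently; treat it as a candidate for re-checking."""
    last = datetime.fromisoformat(updated_at.replace("Z", "+00:00"))
    return now - last > RESCAN_CYCLE

now = datetime(2026, 3, 14, tzinfo=timezone.utc)
print(is_stale("2026-03-10T08:22:14Z", now))  # → True  (almost 4 days old)
print(is_stale("2026-03-13T00:00:00Z", now))  # → False (within one cycle)
```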

Why Deepinfo's Domain Data

Most domain data providers rely on a single collection method, typically zone file access. This creates fundamental coverage gaps that no amount of processing can fix. Deepinfo takes a different approach.

 

Capability                | Zone File Providers               | WHOIS Scraping Services          | Passive DNS Providers        | Deepinfo
gTLD Coverage             | Partial (only participating TLDs) | Partial (rate-limited)           | Partial (traffic-dependent)  | All active gTLDs
ccTLD Coverage            | Very limited                      | Partial (varies by jurisdiction) | Partial (resolver-dependent) | Broad (multi-method discovery)
New gTLD Coverage         | Varies by registry                | Limited                          | Limited                      | Comprehensive
Discovery Method          | Single source                     | Single source                    | Single source                | Four layered methods
Registration Metadata     | Limited (NS records only)         | Detailed (when accessible)       | None                         | Full (Whois, DNS, certificate)
Update Latency            | Daily at best                     | Hours to weeks                   | Real-time but incomplete     | Continuous + full rescan every 3 days
IDN Support               | Depends on registry               | Inconsistent                     | Varies                       | Full UTF-8 and Punycode
Data Validation           | Minimal (trusts source)           | Minimal                          | None (observational)         | Multi-source cross-validation
Estimated Global Coverage | 40-60% of registered domains      | 30-50% depending on access       | 20-40% depending on traffic  | 90%+ of publicly registered domains
Broadest Coverage Available
Zone files cover only the TLDs that provide them. Many ccTLDs and some new gTLDs do not share zone files at all. Deepinfo combines zone file processing, active DNS probing, Certificate Transparency monitoring, and cross-source validation to cover domains that no single method can reach. The result is over 400 million domains across 1,500+ TLDs.
Not Just Names, Full Context
Zone files typically contain only the domain name and nameserver records. Deepinfo enriches every record with registration dates, expiration dates, registrar information, Whois status, DNS status, registrant country, and metadata. You get a complete record, not just a name.
Validated, Not Assumed
Every domain in the feed has been validated against multiple independent signals. Deepinfo does not blindly trust any single source. Domains must be confirmed through DNS resolution, Whois lookup, or certificate records before inclusion. Invalid, parked, and reserved entries are flagged, not silently mixed into the dataset.

Delivery and Format

The feed is designed for easy integration into any data pipeline or security platform.

File Formats: CSV and JSON. Both formats contain identical data. Choose based on your pipeline requirements.
Delivery Method: Secure download. Files are available for direct download from your Deepinfo account.
File Size: The full dataset is delivered as compressed files. Total uncompressed size varies with record count but typically exceeds 50 GB for the complete feed.
Compression: Files are compressed using gzip. Compressed file sizes are significantly smaller than uncompressed.
Encoding: UTF-8. All text fields, including internationalized domain names (IDNs), are encoded in UTF-8.
Incremental Updates: For ongoing synchronization, use the Daily Registered Domain Names Feed, Daily Updated Domain Names Feed, and Daily Deleted Domain Names Feed to apply incremental changes to your local copy of the full dataset.
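Because the feed is UTF-8 while DNS infrastructure uses the ASCII-compatible Punycode form, converting IDNs between the two representations is a common integration step. Python's built-in idna codec (which implements the older IDNA 2003 mapping; strict IDNA 2008 needs a third-party library) handles the basic case:

```python
# Convert an internationalized domain name between its Unicode form,
# as stored in the feed's UTF-8 text fields, and its Punycode (ACE) form.
unicode_name = "bücher.de"
ace_name = unicode_name.encode("idna").decode("ascii")
print(ace_name)                                 # → xn--bcher-kva.de
print(ace_name.encode("ascii").decode("idna"))  # → bücher.de
```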

Integration Guide

The All Domain Names Feed is designed to fit into any data pipeline with minimal effort. Below are common integration patterns used by our customers.

Pattern 1: Initial Full Load

Download the full compressed feed. Decompress and ingest into your database, data warehouse, or search index. The CSV format works with standard ETL tools. The JSON format is ready for document stores and streaming pipelines.

# Download and decompress
wget https://data.deepinfo.com/feeds/all-domains-latest.json.gz
gunzip all-domains-latest.json.gz

# Load into PostgreSQL (CSV example; run inside psql -- \copy reads the
# file client-side, so it works without server filesystem access)
\copy domains FROM '/path/to/all-domains-latest.csv' WITH (FORMAT csv, HEADER true)

# Load into Elasticsearch (JSON example; the Bulk API expects newline-delimited
# JSON with an action line before each document and a trailing newline)
curl -X POST "localhost:9200/domains/_bulk" \
  -H "Content-Type: application/x-ndjson" \
  --data-binary @all-domains-latest.json

Pattern 2: Incremental Daily Sync

After the initial load, use the Daily Registered, Daily Updated, and Daily Deleted feeds to keep your local copy current. Apply inserts, updates, and deletes in sequence each day.

# Daily sync workflow (run once per day)
# Step 1: Add newly registered domains
wget https://data.deepinfo.com/feeds/daily-registered-latest.json.gz
gunzip daily-registered-latest.json.gz
# → INSERT new records into your database
# Step 2: Update modified domains
wget https://data.deepinfo.com/feeds/daily-updated-latest.json.gz
gunzip daily-updated-latest.json.gz
# → UPDATE existing records with new field values
# Step 3: Mark or remove deleted domains
wget https://data.deepinfo.com/feeds/daily-deleted-latest.json.gz
gunzip daily-deleted-latest.json.gz
# → DELETE or MARK records as inactive
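The three steps above can be sketched in Python as one function applying a day's deltas to a local copy keyed by domain name. Field names follow the sample record; marking deleted records rather than dropping them matches the feed's own deletion handling:

```python
import json

def apply_daily_deltas(inventory, registered, updated, deleted):
    """Apply one day's incremental feeds (lists of JSON lines) to a local
    dict of records keyed by domain name, and return the updated dict."""
    for line in registered:            # step 1: insert newly registered domains
        rec = json.loads(line)
        inventory[rec["domain"]] = rec
    for line in updated:               # step 2: merge changed field values
        rec = json.loads(line)
        inventory.setdefault(rec["domain"], {}).update(rec)
    for line in deleted:               # step 3: mark rather than silently drop
        rec = json.loads(line)
        if rec["domain"] in inventory:
            inventory[rec["domain"]]["dns_status"] = "deleted"
    return inventory

inv = apply_daily_deltas(
    {},
    registered=['{"domain": "new-site.app", "dns_status": "active"}'],
    updated=['{"domain": "new-site.app", "registrar": "ExampleRegistrar"}'],
    deleted=['{"domain": "new-site.app"}'],
)
print(inv["new-site.app"]["dns_status"])  # → deleted
```

Ordering matters: inserts must land before updates so that a domain registered and modified on the same day is not lost.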

Pattern 3: Periodic Full Refresh

For simpler architectures, download the full feed on a regular schedule (weekly or monthly) and replace your local copy entirely. This avoids incremental sync complexity at the cost of slightly higher bandwidth usage.

# Weekly full refresh (cron example)
# Runs every Sunday at 2:00 AM
0 2 * * 0 wget -q https://data.deepinfo.com/feeds/all-domains-latest.json.gz \
  -O /data/all-domains-latest.json.gz && \
  gunzip -f /data/all-domains-latest.json.gz && \
  /scripts/reload-domains.sh
Full API documentation, authentication details, and code examples in Python, Go, and JavaScript are provided with your subscription. View Full Documentation