The All Domain Names Feed is the most comprehensive registered domain dataset available. It contains every registered domain name across all active top-level domains (TLDs), including generic TLDs (.com, .net, .org), country-code TLDs (.uk, .de, .tr), and new gTLDs (.app, .security, .cloud). The dataset is maintained as a living inventory: new registrations, modifications, and deletions are reflected continuously.
This feed is designed for organizations that need a complete, current view of the global domain namespace. Security companies use it to build threat detection engines. Risk scoring platforms use it as a data foundation. Researchers use it for large-scale internet measurement. Brand protection teams use it to monitor for look-alike domains at scale.
Unlike zone file access or WHOIS scraping, which cover only a fraction of the namespace, the Deepinfo All Domain Names Feed draws from multiple collection methods to achieve the broadest coverage available. The result is a single, unified dataset that reflects the current state of the internet's domain landscape.
The Deepinfo data infrastructure combines multiple collection methods to build and maintain the most comprehensive domain dataset available. No single source covers the entire namespace, so our approach layers multiple techniques to maximize coverage and accuracy.
Each record in the All Domain Names Feed contains the primary fields shown in the representative sample below. Actual records may contain additional attributes; full schema documentation with all available fields is provided with your subscription.
{
  "domain": "example.com",
  "tld": "com",
  "registration_date": "2005-03-14",
  "expiration_date": "2026-03-14",
  "registrar": "GoDaddy.com LLC",
  "nameservers": [
    "ns1.example.com",
    "ns2.example.com"
  ],
  "dns_status": "active",
  "whois_status": "available",
  "registrant_country": "US",
  "created_at": "2024-01-15T10:32:00Z",
  "updated_at": "2026-03-10T08:22:14Z"
}

The All Domain Names Feed covers the broadest scope of registered domains available in a single dataset.
| Total Records | Over 400 million registered domain names and growing. The count increases daily as new domains are registered globally. |
| TLD Coverage | All active TLDs including generic TLDs (.com, .net, .org, .info), country-code TLDs (.uk, .de, .fr, .tr, .jp, and 200+ others), and new gTLDs (.app, .dev, .cloud, .security, and 1,200+ others). |
| Geographic Scope | Global. Domain names from every country and territory with an active TLD are included. |
| Historical Depth | The feed represents the current state of the domain namespace. For historical registration and ownership data, see the Historical Whois Records Feed. |
| Exclusions | Internal-use and reserved domains, unresolvable test entries, and domains that have never been publicly registered are excluded. |
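At this scale, many consumers slice the feed before ingesting it, for example by TLD or DNS status. A minimal sketch using `jq` on a newline-delimited JSON export (the inline records are illustrative; field names follow the sample record above):

```shell
# Filter a newline-delimited JSON feed down to active .com domains
printf '%s\n' \
  '{"domain":"example.com","tld":"com","dns_status":"active"}' \
  '{"domain":"example.de","tld":"de","dns_status":"active"}' \
  '{"domain":"parked.com","tld":"com","dns_status":"inactive"}' |
jq -c 'select(.tld == "com" and .dns_status == "active")'
# keeps only the example.com record
```

The same `select` filter can be applied while streaming the full feed, which avoids decompressing the entire dataset into memory.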
The All Domain Names Feed is not a static snapshot. It is a continuously maintained dataset that reflects changes in the domain namespace as they happen.
| Update Frequency | Continuous. New registrations, modifications, and deletions are processed and reflected in the feed on an ongoing basis. |
| New Domain Latency | Newly registered domains typically appear in the feed within hours of registration, depending on the TLD and registry. |
| Modification Detection | Changes to Whois records, DNS configuration, and registration status are detected through regular rescanning and reflected in the updated_at timestamp. |
| Deletion Handling | Expired, deleted, or dropped domains are marked accordingly. They remain in the dataset with an updated status rather than being silently removed. |
| Full Rescan Cycle | The entire dataset undergoes a complete validation pass every 3 days to ensure accuracy across all records. |
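A quick way to sanity-check freshness in a local copy is to compare each record's `updated_at` against the 3-day full-rescan window. A hedged sketch, assuming a newline-delimited JSON export and GNU `date` (the two inline records are illustrative):

```shell
# Flag records whose updated_at predates the 3-day rescan cycle.
# ISO-8601 UTC timestamps compare correctly as plain strings.
cutoff=$(date -u -d '3 days ago' +%Y-%m-%dT%H:%M:%SZ)

printf '%s\n' \
  '{"domain":"fresh.example","updated_at":"2099-01-01T00:00:00Z"}' \
  '{"domain":"stale.example","updated_at":"2001-01-01T00:00:00Z"}' |
jq -r --arg cutoff "$cutoff" 'select(.updated_at < $cutoff) | .domain'
# → stale.example
```

Records flagged this way are candidates for a targeted re-download rather than a full refresh.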
Most domain data providers rely on a single collection method, typically zone file access. This creates fundamental coverage gaps that no amount of processing can fix. Deepinfo takes a different approach.
The feed is designed for easy integration into any data pipeline or security platform.
| File Formats | CSV and JSON. Both formats contain identical data. Choose based on your pipeline requirements. |
| Delivery Method | Secure download. Files are available for direct download from your Deepinfo account. |
| File Size | The full dataset is delivered as compressed files. Total uncompressed size varies with record count but typically exceeds 50 GB for the complete feed. |
| Compression | Files are compressed using gzip. Decompress with standard tools such as gunzip or zcat before ingestion. |
| Encoding | UTF-8. All text fields, including internationalized domain names (IDNs), are encoded in UTF-8. |
| Incremental Updates | For ongoing synchronization, use the Daily Registered Domain Names Feed, Daily Updated Domain Names Feed, and Daily Deleted Domain Names Feed to apply incremental changes to your local copy of the full dataset. |
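Before ingesting a delivered file, it is worth verifying the archive and record count. A small sketch of the checks (a tiny generated archive stands in for a real feed file here; no checksum endpoint is assumed, so only gzip's own integrity test is used):

```shell
# Create a tiny sample archive to demonstrate the checks
printf '{"domain":"example.com"}\n{"domain":"xn--exmple-cua.de"}\n' |
  gzip > sample-feed.json.gz

# 1. Verify the gzip archive is intact before decompressing to disk
gzip -t sample-feed.json.gz && echo "archive OK"

# 2. Count records while still compressed (one JSON object per line assumed)
zcat sample-feed.json.gz | wc -l

# 3. Confirm the text decodes as valid UTF-8 (IDN fields are UTF-8 encoded)
zcat sample-feed.json.gz | iconv -f UTF-8 -t UTF-8 > /dev/null && echo "UTF-8 OK"
```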
The All Domain Names Feed is designed to fit into any data pipeline with minimal effort. Below are common integration patterns used by our customers.
Download the full compressed feed. Decompress and ingest into your database, data warehouse, or search index. The CSV format works with standard ETL tools. The JSON format is ready for document stores and streaming pipelines.
# Download and decompress
wget https://data.deepinfo.com/feeds/all-domains-latest.json.gz
gunzip all-domains-latest.json.gz

# Load into PostgreSQL (CSV example)
COPY domains FROM '/path/to/all-domains-latest.csv' WITH CSV HEADER;

# Load into Elasticsearch (JSON example)
# Note: the _bulk API expects NDJSON with an action line before each document
curl -X POST "localhost:9200/domains/_bulk" \
  -H 'Content-Type: application/x-ndjson' \
  --data-binary @all-domains-latest.json
After the initial load, use the Daily Registered, Daily Updated, and Daily Deleted feeds to keep your local copy current. Apply inserts, updates, and deletes in sequence each day.
# Daily sync workflow (run once per day)

# Step 1: Add newly registered domains
wget https://data.deepinfo.com/feeds/daily-registered-latest.json.gz
gunzip daily-registered-latest.json.gz
# → INSERT new records into your database

# Step 2: Update modified domains
wget https://data.deepinfo.com/feeds/daily-updated-latest.json.gz
gunzip daily-updated-latest.json.gz
# → UPDATE existing records with new field values

# Step 3: Mark or remove deleted domains
wget https://data.deepinfo.com/feeds/daily-deleted-latest.json.gz
gunzip daily-deleted-latest.json.gz
# → DELETE or MARK records as inactive
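The three-step workflow above amounts to an upsert-then-delete merge. A filesystem-only sketch of those semantics with `jq` (tiny inline files stand in for the real decompressed feeds; in production the same logic is usually expressed as database upserts and deletes):

```shell
# Hypothetical stand-ins for the decompressed daily feed files
printf '%s\n' \
  '{"domain":"old.example","dns_status":"active"}' \
  '{"domain":"gone.example","dns_status":"active"}' > snapshot.json
printf '{"domain":"new.example","dns_status":"active"}\n'   > daily-registered.json
printf '{"domain":"old.example","dns_status":"inactive"}\n' > daily-updated.json
printf '{"domain":"gone.example"}\n'                        > daily-deleted.json

# Step 1: append newly registered domains
cat daily-registered.json >> snapshot.json

# Steps 2 and 3: keep the latest record per domain (updates win),
# then drop any domain listed in the deleted feed
jq -c -s --slurpfile del daily-deleted.json '
  ($del | map(.domain)) as $gone
  | group_by(.domain)
  | map(.[-1])
  | map(select(.domain as $d | ($gone | index($d)) == null))
  | .[]
' snapshot.json daily-updated.json > snapshot.next.json
```

After the merge, `snapshot.next.json` contains the new registration, the updated status for `old.example`, and no trace of the deleted domain.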
For simpler architectures, download the full feed on a regular schedule (weekly or monthly) and replace your local copy entirely. This avoids incremental sync complexity at the cost of slightly higher bandwidth usage.
# Weekly full refresh (cron example)
# Runs every Sunday at 2:00 AM
0 2 * * 0 wget -q https://data.deepinfo.com/feeds/all-domains-latest.json.gz \
  -O /data/all-domains-latest.json.gz && \
  gunzip -f /data/all-domains-latest.json.gz && \
  /scripts/reload-domains.sh