Robots.txt management in LinkShift: group-level control for safer SEO operations
LinkShift now supports built-in robots.txt management at the redirect-group level, with ready-made policy presets and a custom mode.
Quick comparison
| Area | LinkShift | Manual robots.txt file management |
|---|---|---|
| Where robots.txt is configured | At redirect-group level in dashboard | Per-domain server, CDN, or hosting configuration |
| Built-in policy templates | NONE, ALLOW_ALL, DISALLOW_ALL, DISALLOW_BAD_BOTS, CUSTOM | Manual file editing with no standard policy presets |
| Consistency across domains | One policy applied to all domains in the group | High risk of drift between environments and host variants |
| Operational speed | Policy change from one UI workflow | Ticket-based or infrastructure-level changes |
| Fallback behavior | Policy NONE keeps standard redirect-rule flow for /robots.txt | Depends on server order and local implementation |
What's new: robots.txt controlled at the redirect-group level
LinkShift now includes first-class robots.txt support directly in redirect-group configuration.
This means you can manage crawler directives where redirect logic is already governed, instead of spreading robots files across servers, CDNs, and environment-specific configs.
For teams handling multiple domains and migration windows, this reduces operational risk and keeps SEO behavior predictable.
Available robots.txt policies in LinkShift
Each redirect group now supports one explicit robotsPolicy setting plus an optional customRobotsContent value.
The policy determines what LinkShift returns for the exact path /robots.txt on domains assigned to that group.
- NONE: disables built-in robots.txt handling and keeps normal redirect-rule matching
- ALLOW_ALL: serves User-agent: * with Allow: /
- DISALLOW_ALL: serves User-agent: * with Disallow: /
- DISALLOW_BAD_BOTS: serves a predefined blocklist for known aggressive or low-value bots
- CUSTOM: serves your own robots.txt content (up to 4096 characters)
How request handling works for /robots.txt
When the request path is exactly /robots.txt and the group policy is anything other than NONE, LinkShift immediately returns plain-text robots content with HTTP 200.
In that case, standard redirect-rule search is intentionally skipped to avoid conflicting behavior.
If the policy is NONE, LinkShift does not intercept the request, and routing continues through standard redirect rules exactly as before.
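A minimal sketch of this interception flow, with function and parameter names assumed for illustration (not LinkShift's actual API):

```python
# Sketch of the /robots.txt handling order described above.
# handle_request is a hypothetical name; it returns a (status, body)
# tuple when robots handling applies, or None to signal that standard
# redirect-rule matching should run instead.

def handle_request(path: str, group_policy: str, robots_content: str):
    if path == "/robots.txt" and group_policy != "NONE":
        # Serve robots content with HTTP 200; redirect-rule search
        # is intentionally skipped to avoid conflicting behavior.
        return (200, robots_content)
    # Policy NONE, or any other path: fall through to normal routing.
    return None
```

With an ALLOW_ALL group, `handle_request("/robots.txt", "ALLOW_ALL", body)` short-circuits with a 200, while the same path under a NONE policy falls through to redirect rules.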
Why this matters for technical SEO and migrations
During domain migrations and restructuring projects, robots.txt often changes temporarily between crawl-open and crawl-restricted states.
Group-level policies make these transitions faster: you can switch posture intentionally without touching each domain stack.
This is especially useful when one redirect group governs both production and support domains, where policy consistency is critical.
- Faster rollout of temporary crawl restrictions
- Less mismatch between apex/www or regional host configurations
- Clearer operational ownership for SEO and platform teams
When to choose CUSTOM mode
Use CUSTOM when you need explicit directives beyond preset policies, for example selective Disallow paths, custom sitemap references, or crawler-specific rules.
LinkShift validates the content length (maximum 4096 characters) to keep storage and caching behavior safe while still allowing enough flexibility for most robots.txt use cases.
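A sketch of what a CUSTOM payload and the documented length check might look like. The example content (paths, the GPTBot rule, the sitemap URL) and the validator name are assumptions for illustration; only the 4096-character cap comes from the text above.

```python
# Hypothetical CUSTOM robots.txt content: selective Disallow paths,
# a crawler-specific rule, and a sitemap reference.
MAX_CUSTOM_LENGTH = 4096  # documented limit

CUSTOM_EXAMPLE = (
    "User-agent: *\n"
    "Disallow: /internal/\n"
    "Disallow: /search\n"
    "\n"
    "User-agent: GPTBot\n"
    "Disallow: /\n"
    "\n"
    "Sitemap: https://example.com/sitemap.xml\n"
)

def validate_custom_content(content: str) -> str:
    """Reject empty or oversized CUSTOM robots.txt content; return it unchanged otherwise."""
    if not content.strip():
        raise ValueError("CUSTOM content must not be empty")
    if len(content) > MAX_CUSTOM_LENGTH:
        raise ValueError(f"CUSTOM content exceeds {MAX_CUSTOM_LENGTH} characters")
    return content
```

Validating at save time rather than at serve time keeps the /robots.txt response path a simple, cacheable read.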
Practical rollout checklist
Before changing policy in production, decide whether your goal is open crawl, full block, or selective bot control.
Then verify resulting /robots.txt response on representative domains from the group and confirm expected SEO behavior in your monitoring workflow.
- Pick policy per redirect group, not per single emergency request
- Use DISALLOW_ALL only for deliberate short windows
- Prefer CUSTOM when you need granular path-level crawl control
- Review policy after migration milestones to avoid stale restrictions
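The verification step in the checklist above can be partly automated. Here is a small sketch that compares a served /robots.txt body against the expected policy body, normalizing whitespace so trailing spaces or edge blank lines don't cause false mismatches; fetching the body over HTTP is left to your client of choice, and both function names are assumptions.

```python
# Sketch of a post-rollout check: does the served /robots.txt body
# match what the group policy should produce?

def normalize(robots: str) -> str:
    """Strip trailing whitespace per line and drop blank edge lines."""
    lines = [line.rstrip() for line in robots.strip().splitlines()]
    return "\n".join(lines)

def matches_expected(served: str, expected: str) -> bool:
    """Compare served and expected robots.txt content after normalization."""
    return normalize(served) == normalize(expected)
```

Running this across representative domains from a group (apex, www, regional hosts) catches the drift the comparison table warns about.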
When manual robots.txt file management may be a better choice
- When your organization already has a mature, centralized robots.txt pipeline fully integrated with infrastructure-as-code and strict release governance.
- When each domain must keep fully independent crawler policy with no shared group-level behavior.
Want to test these scenarios on your own domain?
In LinkShift, you can connect a domain and get HTTPS, hierarchical rules, and link maps for large-scale key mapping.
