Implementation of Robots.txt in DXA 1.7

What is robots.txt? Its purpose and usage:

Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol.

It works like this: before a robot visits a Web site, it first checks the site's /robots.txt file and obeys the crawling rules it finds there.
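For example, a minimal robots.txt that keeps all crawlers out of a single folder while leaving the rest of the site crawlable could look like this (the folder name is illustrative):

    User-agent: *
    Disallow: /private/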
&lt;img src="https://uat.community.rws.com/aggbug?PostID=7099&amp;AppID=95&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Implementation of Robots.txt in DXA 1.7</title><link>https://uat.community.rws.com/product-groups/tridion/tridion-sites/b/techweblog/posts/implementation-of-robots-txt-in-dxa-1-7</link><pubDate>Mon, 17 Jul 2017 07:09:56 GMT</pubDate><guid isPermaLink="false">10acfa76-f078-475b-a7ef-fc5b3e8d2934:2c21bcb1-6d92-4d15-8904-c46d98cfdd56</guid><dc:creator>Rajesh Saminathan</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;Quite nice one :)&lt;/p&gt;
&lt;img src="https://uat.community.rws.com/aggbug?PostID=7099&amp;AppID=95&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Implementation of Robots.txt in DXA 1.7</title><link>https://uat.community.rws.com/product-groups/tridion/tridion-sites/b/techweblog/posts/implementation-of-robots-txt-in-dxa-1-7</link><pubDate>Mon, 10 Jul 2017 08:18:42 GMT</pubDate><guid isPermaLink="false">10acfa76-f078-475b-a7ef-fc5b3e8d2934:2c21bcb1-6d92-4d15-8904-c46d98cfdd56</guid><dc:creator>Nuno Linhares</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;I guess one way to do it would be to have logic in your template that checks if the current TargetType &amp;quot;IsPreviewCapable&amp;quot; - this will only work if you&amp;#39;re using Web 8+ _AND_ if your staging environment is configured to use SessionPreview... but it&amp;#39;s probably a good start. Then use this in your template to determine which Robots.txt to render.&lt;/p&gt;
&lt;img src="https://uat.community.rws.com/aggbug?PostID=7099&amp;AppID=95&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Implementation of Robots.txt in DXA 1.7</title><link>https://uat.community.rws.com/product-groups/tridion/tridion-sites/b/techweblog/posts/implementation-of-robots-txt-in-dxa-1-7</link><pubDate>Fri, 07 Jul 2017 07:53:08 GMT</pubDate><guid isPermaLink="false">10acfa76-f078-475b-a7ef-fc5b3e8d2934:2c21bcb1-6d92-4d15-8904-c46d98cfdd56</guid><dc:creator>Shalivahan Kanur</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;In a robots.txt file with multiple user-agent directives, each disallow or allow rule only applies to the &amp;quot;user-agent&amp;quot;(s) specified in that particular line break-separated set. If the file contains a rule that applies to more than one user-agent, a crawler will only pay attention to (and follow the directives in) the most specific group of instructions and also we can do blocking a specific web crawler from a specific folder by specifying different &amp;quot;User-agent&amp;quot;s.&lt;/p&gt;
&lt;p&gt;but even am also not pretty sure how to handle depending on different Target types like (stage and Live)... but will investigate , try to get info and post you sooner.&lt;/p&gt;
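To illustrate the grouping rule described above, here is a sketch (crawler name and paths are illustrative). Googlebot matches its own, more specific group and follows only that one, so it is blocked from /noindex/ but not from /admin/, while every other crawler is blocked from /admin/:

    User-agent: *
    Disallow: /admin/

    User-agent: Googlebot
    Disallow: /noindex/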
&lt;img src="https://uat.community.rws.com/aggbug?PostID=7099&amp;AppID=95&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item><item><title>RE: Implementation of Robots.txt in DXA 1.7</title><link>https://uat.community.rws.com/product-groups/tridion/tridion-sites/b/techweblog/posts/implementation-of-robots-txt-in-dxa-1-7</link><pubDate>Fri, 07 Jul 2017 07:20:56 GMT</pubDate><guid isPermaLink="false">10acfa76-f078-475b-a7ef-fc5b3e8d2934:2c21bcb1-6d92-4d15-8904-c46d98cfdd56</guid><dc:creator>Nuno Linhares</dc:creator><slash:comments>0</slash:comments><description>&lt;p&gt;This seems pretty straight-forward, but what if you need to have a different robots.txt depending on target? For instance, I may want to disallow crawling on my staging site, but allow on my live site?&lt;/p&gt;
&lt;img src="https://uat.community.rws.com/aggbug?PostID=7099&amp;AppID=95&amp;AppType=Weblog&amp;ContentType=0" width="1" height="1"&gt;</description></item></channel></rss>