My site is built in Drupal 7 and I am using GTranslate Pro on it. I found that robots.txt doesn't restrict bots when URLs are prefixed with a language code.
I went to Webmaster Tools and tested whether /admin/config is crawlable, and it showed that it is not. Then I tried /hi/admin/config and it showed that the URL is allowed. Drupal's default robots.txt ships with a bunch of URLs that should not be crawled. Is there a way to keep these URLs from being crawled even when they are visited with a language code, or do I have to list every language code in robots.txt, i.e. /hi/admin/config, /de/admin/config, and so on?
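One possible direction, assuming your crawlers of interest honor it: major bots such as Googlebot and Bingbot support the `*` wildcard in robots.txt paths (an extension, not part of the original robots.txt standard), so each default Disallow rule could be duplicated with a wildcard first segment to also match the language-prefixed variants. A minimal sketch:

```
User-agent: *
# Default Drupal rule
Disallow: /admin/
# Hypothetical wildcard rule matching language-prefixed copies
# such as /hi/admin/ or /de/admin/. Note this also matches any
# other single-segment prefix, and wildcard support is a
# Google/Bing extension that not all crawlers implement.
Disallow: /*/admin/
```

Rules like this can be checked against specific URLs (e.g. /hi/admin/config) with the robots.txt tester in Webmaster Tools before relying on them.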