I followed the installation steps and installed the Pro version, and it works fine.
I have created sitemap.txt with the languages I want to be indexed, but I didn't understand step 5 in the readme file: "modify robots.txt...".
1- I can't find a robots.txt file in the root or anywhere else on my site. Should I create one myself?
2- Apart from the line Sitemap: yourdomain.com/sitemap.txt, what else should be inside?
For example, I don't know what should be disallowed and what should be allowed from a security standpoint.
Should I use (*) for User-agent, allowing all user agents, or limit it, for example, to only Googlebot?
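For comparison, this is what a minimal robots.txt commonly looks like — just a sketch, with yourdomain.com as a placeholder; note that the Sitemap line usually takes the full URL, not just the path:

```
# Rules for all crawlers (Googlebot included)
User-agent: *
# An empty Disallow means nothing is blocked
Disallow:

Sitemap: https://yourdomain.com/sitemap.txt
```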
Is it correct to put the following in the file: