



Update robots.txt standards Community Group

Robots.txt currently works on an opt-out basis: you list what you do not want your website to be part of.

This is hard to maintain (almost a full-time job right now) if you do not want your website's content to be used for, e.g., training AI, market research (such as price robots), inclusion in non-search-engine databases, and more.

This proposal is to update the type of instructions robots.txt supports so that it is instead treated as an opt-in, where you give instructions based on the intent of a robot rather than by wildcard or in granular detail.

Example: Agent-group: searchengines

This applies to all robots that seek to update, process or maintain websites for search engine databases. It does not grant permission to use scraped data for AI purposes (that should have its own Agent-group).
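
A minimal sketch of what an opt-in file under this proposal might look like follows below; the Agent-group directive and the group name searchengines come from the example above, the Allow and comment lines reuse existing robots.txt syntax, and the block as a whole is illustrative rather than adopted syntax:

    # Opt in search-engine crawlers only.
    Agent-group: searchengines
    Allow: /

    # No other Agent-group is listed, so every other robot is treated as not opted in.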

Also, the absence of instructions should be treated as not having opted in, and robots working on behalf of AI might need additional instructions (e.g. max-snippet, and whether you require a citation when your content is used to provide an answer).
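
A hypothetical continuation of the same file for AI-related agents is sketched below; the group names ai-training and ai-answers and the Require-citation directive are assumptions made for illustration, while Max-snippet mirrors the existing robots meta directive of that name:

    # No opt-in for AI training at all.
    Agent-group: ai-training
    Disallow: /

    # AI answer engines may use articles, but only with a short snippet and a citation.
    Agent-group: ai-answers
    Allow: /articles/
    Max-snippet: 160
    Require-citation: yes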

Shortname
robotstxt


Leadership

Chairs
  • Drew Nelson
  • Salih ÖNDER

Links

Mailing List
public-robotstxt