Indexing for AI or Searching is No Different
The phrase "when Google processed and indexed the entire internet without permission" caught my attention. Technically, parsing text found on the internet and "copying" it into a token database for use by a search algorithm is no different from parsing and copying that same text into a token database for use by an AI algorithm. Yet somehow there is a gut reaction that use of your tokenized work by an AI algorithm just "feels" unethical.
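To make the point concrete, here is a minimal sketch (hypothetical names, and a deliberately simplified word-level tokenizer; real systems use subword tokenizers, stemming, and so on) showing that the parse-and-copy step is identical, and only what consumes the tokens afterward differs:

```python
import re
from collections import defaultdict

def tokenize(text):
    # Simplified tokenizer: lowercase word/number tokens only.
    return re.findall(r"[a-z0-9]+", text.lower())

# The shared step: copy a document's text into a token sequence.
doc_id = "example-page"
text = "Parsing text found on the internet into tokens."
tokens = tokenize(text)

# Search pipeline: the tokens populate an inverted index,
# mapping each token back to the documents containing it.
inverted_index = defaultdict(set)
for tok in tokens:
    inverted_index[tok].add(doc_id)

# AI pipeline: the very same tokens are appended to a training corpus.
training_corpus = []
training_corpus.extend(tokens)
```

In both branches the copyrighted text has already been parsed and copied into tokens by the time the "search" or "AI" label applies; the gut reaction attaches to the downstream use, not to the copying itself.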