AI definitions: Data Poisoning
Data Poisoning – An attack on a machine-learning system in which malicious actors insert incorrect or misleading information into the data used to train an AI model, polluting its results. It can also serve as a defensive tool, helping creators reassert some control over how their work is used.
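
As a rough illustration of the idea, the sketch below flips the labels on a fraction of a toy training set and shows how a simple classifier's accuracy degrades as the poisoned share grows. The dataset, classifier, and `poison_fraction` parameter are hypothetical choices made for this example, not part of any specific real-world attack.

```python
# Minimal sketch of label-flipping data poisoning on a toy classifier.
# Names like poison_fraction are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy two-class dataset: two Gaussian blobs.
X = np.vstack([rng.normal(-2, 1, (500, 2)), rng.normal(2, 1, (500, 2))])
y = np.array([0] * 500 + [1] * 500)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def poison_labels(labels, poison_fraction, rng):
    """Flip the labels of a random subset of training examples."""
    poisoned = labels.copy()
    n_poison = int(len(labels) * poison_fraction)
    idx = rng.choice(len(labels), size=n_poison, replace=False)
    poisoned[idx] = 1 - poisoned[idx]  # flip 0 <-> 1
    return poisoned

# Train on increasingly poisoned labels and report test accuracy.
for frac in (0.0, 0.2, 0.4):
    model = LogisticRegression().fit(X_train, poison_labels(y_train, frac, rng))
    print(f"poison fraction {frac:.0%}: test accuracy {model.score(X_test, y_test):.2f}")
```
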
More AI definitions here.