AI Definitions: Red Teaming
/Red Teaming - Testing an AI by trying to force it to act in unintended or undesirable ways, thus uncovering potential harms. The term comes from a military practice of taking on the role of an attacker to devise strategies.
More AI definitions here.