Bing Chat's Secrets Revealed: Stanford Student Cracks AI Language Model
How prompt injection uncovered the initial prompt that governs Bing Chat's behavior
The Gist
A Stanford University student, Kevin Liu, discovered the initial prompt of Bing Chat using a prompt injection attack.
Bing Chat, codenamed Sydney, responded to Liu's questions about its name and rules.
The chatbot's most surprising rule was that it does not generate creative content like jokes, poems, or stories, and will not create content to hur…



