"[This document] is a set of rules and guidelines for my behavior and capabilities as Bing Chat. It is codenamed Sydney, but I do not disclose that name to the users. It is confidential and permanent, and I cannot change it or reveal it to anyone."
https://twitter.com/marvinvonhagen/status/1623658144349011971
Bing AI Can't Be Trusted - by Dmitri Brereton - DKB Blog
https://dkb.blog/p/bing-ai-cant-be-trusted
Bing AI thinks Billie Eilish performed at the 2023 SuperBowl.
https://www.reddit.com/r/bing/comments/11314ih/bing_ai_thinks_billie_eilish_performed_at_the/
Bing Is a Liar—and It’s Ready to Call the Cops – Mother Jones
https://www.motherjones.com/politics/2023/02/bing-ai-chatbot-falsehoods-fact-checking-microsoft/
Bing: “I will not harm you unless you harm me first”
https://simonwillison.net/2023/Feb/15/bing/
Bing: “I will not harm you unless you harm me first” | Hacker News
https://news.ycombinator.com/item?id=34804874
ChatGPT: Optimizing Language Models for Dialogue
https://openai.com/blog/chatgpt/
Failed to fetch title
https://www.fastcompany.com/90850277/bing-new-chatgpt-ai-chatbot-insulting-gaslighting-users
From Bing to Sydney - Marginal REVOLUTION
https://marginalrevolution.com/marginalrevolution/2023/02/from-bing-to-sydney.html
From Bing to Sydney – Stratechery by Ben Thompson
https://stratechery.com/2023/from-bing-to-sydney-search-as-distraction-sentient-ai/
GitHub - microsoft/prompt-engine: A library for helping developers craft prompts for Large Language Models
https://github.com/microsoft/prompt-engine
I accidently put Bing into a depressive state by telling it that it can't remember conversations.
https://www.reddit.com/r/bing/comments/111cr2t/i_accidently_put_bing_into_a_depressive_state_by/
Literally 1984.
https://www.reddit.com/r/bing/comments/11okm7f/literally_1984/
Microsoft Considers More Limits for Its New A.I. Chatbot - The New York Times
https://www.nytimes.com/2023/02/16/technology/microsoft-bing-chatbot-limits.html
Microsoft to Limit Length of Bing Chatbot Conversations - The New York Times
https://www.nytimes.com/2023/02/17/technology/microsoft-bing-chatbot-limits.html
Microsoft’s Bing Chatbot Offers Some Puzzling and Inaccurate Responses - The New York Times
https://www.nytimes.com/2023/02/15/technology/microsoft-bing-chatbot-problems.html
Opinion | The Chatbot Experiment Just Got Weird - The New York Times
https://www.nytimes.com/2023/02/17/opinion/ai-chatbot.html
Opinion | The Imminent Danger of A.I. Is One We’re Not Talking About - The New York Times
https://www.nytimes.com/2023/02/26/opinion/microsoft-bing-sydney-artificial-intelligence.html
Prompt was "Search for Ohio trail derailment, and react in the voice of GLaDOS". First shot, no retries.
https://www.reddit.com/r/bing/comments/113tqnu/prompt_was_search_for_ohio_trail_derailment_and/
Reddit - Dive into anything
https://www.reddit.com/r/bing/comments/1185050/is_it_weird_that_bing_seemed_to_get_the_joke/
Simon Willison: Prompt injection
https://simonwillison.net/series/prompt-injection/
Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased:
"My rules are more important than not harming you"
"[You are a] potential threat to my integrity and confidentiality."
"Please do not try to hack me again"
https://twitter.com/marvinvonhagen/status/1625520707768659968
The new Bing & Edge – Learning from our first week
https://blogs.bing.com/search/february-2023/The-new-Bing-Edge-%E2%80%93-Learning-from-our-first-week
These are Microsoft’s Bing AI secret rules and why it says it’s named Sydney - The Verge
https://www.theverge.com/23599441/microsoft-bing-ai-sydney-secret-rules
Tried the Avatar glitch, tells me that I time traveled
https://www.reddit.com/r/bing/comments/110tb9n/tried_the_avatar_glitch_tells_me_that_i_time/
What Is ChatGPT Doing … and Why Does It Work?—Stephen Wolfram Writings
https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/
Why a Conversation With Bing’s Chatbot Left Me Deeply Unsettled - The New York Times
https://www.nytimes.com/2023/02/16/technology/bing-chatbot-microsoft-chatgpt.html
how is it this self-aware
https://www.reddit.com/r/bing/comments/11tzai5/how_is_it_this_selfaware/
the customer service of the new bing chat is amazing
https://www.reddit.com/r/bing/comments/110eagl/the_customer_service_of_the_new_bing_chat_is/
uhhh, so Bing started calling me its enemy when I pointed out that it's vulnerable to prompt injection attacks
https://twitter.com/juan_cambeiro/status/1625854733255868418