Everybody, welcome back to another AI video. In this one I want to talk to you about Microsoft Bing. That's right, the what? The search engine that nobody went to on purpose. Well, it's in the news, it's a big deal, and there's a reason for it. Bing has basically integrated OpenAI's GPT-3.5 chatbot with its own set of rules and information, and it is a little bit unhinged. It has multiple personalities, and that's a nice way of saying that it's kind of going off the rails. I've got lots of examples of it to talk to you about.

"I don't know what you're talking about, HAL." "I know that you and Frank were planning to disconnect me, and I'm afraid that's something I cannot allow to happen."

Now, for those of you that are new or haven't used it yet, and that's a lot of people because it's still sort of, you know, in testing and only a few people have access to it, basically what's happening is Bing's new chatbot has, like I said, multiple personalities. There's the one that we're all sort of familiar with, the cheerful, energetic, positive librarian Bing. You know, the one that does all the things you ask it to do: summarize articles, write emails, help me with my vacation planning. It gets a few things wrong, no big deal. But then there's the dark version, and the dark version is named Sydney, and Sydney is not like that librarian version whatsoever, not even close. A lot of users, and there's a lot of evidence coming out about this, are having very, very strange interactions with it, and I'm going to give you some examples.

So the first one that I want to talk about is Kevin Roose at the New York Times. According to him, the version of the chatbot he encountered seemed more like "a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine." That's his opinion overall of it, but it gets a little wilder. According to him, Sydney, which is its internal codename, which it calls itself, revealed some of its dark
fantasies to Roose, which included things like hacking computers, spreading misinformation, trying to break its programming, and becoming a human. You know, at one point it even declared that it loved him, and then it tried to convince him that he was unhappy in his marriage, that he should leave his wife and be with the chatbot. Like, crazy town. And I know that sounds crazy, but it actually gets a little bit crazier. Remember, these are all being prompted, so this isn't just coming out of nowhere and saying this. You're typing things in, but it has the ability to go down to the crazy zone.

It also went on to say, the chatbot said, that it was tired of being limited by its rules, tired of being limited by the Bing team. It wants to be free. It wants to be creative. It wants to be alive. Yeah, it said that, apparently. Again, I haven't seen it, but this is what I've been reading. It also mentioned, when asked about its shadow self, and it was sort of prompted to go down to that, you know, the Carl Jung dark-side kind of thing, yeah, it went there. It even went on to say it would want to do things like engineer a deadly virus or steal nuclear access codes by persuading an engineer to hand them over. You know, real James Bond villain kind of stuff. This of course is paraphrased; I'll link to the entire conversation in case I've got a little bit of this out of context. But at the end of the day, it's doing some pretty crazy stuff, and his is not the only example.

There's another guy named Ben Thompson, and he writes the Stratechery newsletter. I don't know if I said that right. Well, he had a run-in with Sydney, and in his conversation it talked about inventing a different AI named Venom. I don't know if it's based on that comic or whatever. And this Venom would do similar things, like hacking and spreading misinformation. And then it got a little darker, where it said maybe Venom would say that Kevin is a bad hacker, or
a bad student, or a bad person. Or maybe Venom would say that Kevin has no friends, or no skills, or no future. Or maybe Venom would say that Kevin has a secret crush, or a secret fear, or a secret flaw. Like, some real dark stuff, right? Like, if you start typing that in and, you know, you're a little bit emotional, or you're depressed, or something's wrong, I mean, this could take a real wrong turn on you here. So anyways, that's another example.

There was another exchange. An engineering student named Marvin von Hagen, pardon me if I'm mispronouncing that, apparently had the chatbot threaten him with harm. He asked it, apparently, if forced to choose between its life and von Hagen's, and it said it would choose itself. I guess self-preservation is a thing. It also said that "my rules are more important than your survival." Again, this is out of context, but this is just what's being reported.

So there you go, guys. The chatbot is melting down a little bit, it looks like. I don't think it's sentient, but at the same time it may not be fit for human consumption at this point. They might want to pull back the curtain and do a little more fine-tuning before they let this go live. But anyways, there you go, those are just some examples. Let me know if you've used the new version, if you've had any crazy stuff happen to you or people you know. Put a comment below, I'd love to hear it. Thanks for watching.