Forum Index > Full Moon Saloon > Open AI & ChatGPT
Cyclopath (Faster than light)
Joined: 20 Mar 2012 | Posts: 7694 | Location: Seattle
Posted: Wed Feb 15, 2023 3:18 pm
... or a clever redditor was pretending to be a chatbot maybe, to entertain us?

zimmertr (TJ Zimmerman)
Joined: 24 Jun 2018 | Posts: 1214 | Location: Issaquah
Posted: Wed Feb 15, 2023 3:22 pm
Yes, that's possible too, but I don't believe this is an impossible interaction. Since it's trained on dialogue, and people constantly abuse the AI, it only makes sense that it would learn to dish it out too. I assume they try to code around that, but that's not 100% effective, as the prevalence of jailbreaking strategies demonstrates.

Cyclopath
Posted: Wed Feb 15, 2023 3:59 pm
Remember, though, that this isn't a toddler, where you say one wrong thing in front of it and it repeats that forever. Some AI systems go through a process of continual learning, like Microsoft's Tay bot, which learned to be racist by talking to the public. Others are trained on carefully curated data and then stop learning. They don't have brains that automatically record every experience they go through; they have software routines to update their model, which may or may not be called at any given time.
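That distinction is easy to sketch in code. This is a toy illustration only (the `ToyChatbot` class is entirely made up, and no real chatbot works this way): the update routine exists in both cases; what differs is whether the software ever calls it on live conversations.

```python
# Toy sketch: a "model" that is just a word-frequency table.
# The difference between a frozen system and a continually-learning
# one is simply whether train() is ever called on user input.
class ToyChatbot:
    def __init__(self, continual_learning=False):
        self.continual_learning = continual_learning
        self.word_counts = {}  # the entire "model"

    def train(self, text):
        # The update routine: record word frequencies.
        for word in text.split():
            self.word_counts[word] = self.word_counts.get(word, 0) + 1

    def reply(self, user_message):
        # A continual learner updates its model from every conversation;
        # a frozen model never records what users say to it.
        if self.continual_learning:
            self.train(user_message)
        # "Reply" with the most frequent word seen so far.
        return max(self.word_counts, key=self.word_counts.get)


frozen = ToyChatbot(continual_learning=False)
frozen.train("hello hello world")           # one-time curated training
print(frozen.reply("spam spam spam spam"))  # -> "hello" (user input ignored)

learner = ToyChatbot(continual_learning=True)
learner.train("hello hello world")
print(learner.reply("spam spam spam spam"))  # -> "spam" (absorbed the abuse)
```

The point of the sketch: whether the bot "remembers" abuse is a design decision about when the update routine runs, not an inevitable property of the system.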

Sculpin (Member)
Joined: 23 Apr 2015 | Posts: 1376
Posted: Fri Feb 17, 2023 9:13 am
I had just about convinced myself that the posted sinister Bing chat was fake, but then I saw this: https://www.sfgate.com/business/article/microsoft-s-ai-chatbot-is-going-off-the-rails-17790356.php

Between every two pines is a doorway to the new world. - John Muir
Bootpathguy (Member)
Joined: 18 Jun 2015 | Posts: 1787 | Location: United States
Posted: Fri Feb 17, 2023 10:44 am
Sculpin wrote:
I had just about convinced myself that the posted sinister Bing chat was fake, but then I saw this: https://www.sfgate.com/business/article/microsoft-s-ai-chatbot-is-going-off-the-rails-17790356.php
So creepy! I'm concerned about what Sydney will say when someone has a conversation with it about feeling suicidal. These bots are so human-like that people will go to them for emotional support and advice. Scary!

Experience is what'cha get, when you get what'cha don't want
Jumble Jowls (Member)
Joined: 16 Mar 2005 | Posts: 304 | Location: now here
Posted: Sun Feb 19, 2023 6:06 pm
Bootpathguy wrote:
I'm concerned what Sydney will say when someone has a conversation with Sydney about feeling suicidal.
Apparently, Sydney can slip into dark and vindictive personalities with no warning. What would the psycho Sydney say if you confided that you were feeling suicidal? Probably something like "Make My Day" or "Jump! Jump!"

Cyclopath
Posted: Wed Feb 22, 2023 11:29 am
"Microsoft Wants ChatGPT to Control Robots Next: A new paper details how we might command robots using simple language." Sounds fun... until they start commanding themselves. https://gizmodo.com/ai-chatgpt-microsoft-control-robots-terminator-1850145030

Cyclopath
Posted: Wed Feb 22, 2023 11:30 am
Jumble Jowls wrote:
Apparently, Sydney can slip into dark and vindictive personalities with no warning. What would the psycho Sydney say if you confided that you were feeling suicidal? Probably something like "Make My Day" or "Jump! Jump!"
Why do you think that's probable? I mean, sure, it's possible.

Jumble Jowls
Posted: Mon Feb 27, 2023 4:49 pm
Cyclopath wrote:
Jumble Jowls wrote:
Apparently, Sydney can slip into dark and vindictive personalities with no warning. What would the psycho Sydney say if you confided that you were feeling suicidal? Probably something like "Make My Day" or "Jump! Jump!"
Why do you think that's probable? I mean sure it's possible.
Perhaps. I mean, Sydney can slip into a bad mood pretty quickly. Until it's muzzled and lobotomized.

Cyclopath
Posted: Wed Mar 08, 2023 12:43 pm
Jumble Jowls wrote:
Perhaps. I mean, Sydney can slip into a bad mood pretty quickly. Until it's muzzled and lobotomized.
Everybody can slip into a bad mood quickly. Being capable of bad moods isn't the same as provoking suicidal people.

Cyclopath
Posted: Wed Mar 08, 2023 12:45 pm
A user known as CodeBlue29 managed to get ChatGPT to write malware that bypassed an EDR solution. What's super interesting is that the user didn't have any experience creating malware and was simply using the ChatGPT-created malware to test EDR solutions they were evaluating. The output ChatGPT produced was good enough not only to bypass a well-known solution; they also submitted it as a bug bounty and received a $650 payout. ChatGPT isn't supposed to be able to write malware, but it can be tricked into it by asking in different ways, although this is getting harder. https://www.reddit.com/r/cybersecurity/comments/11kzh9u/chat_gpt_got_its_first_bug_bounty/

Cyclopath
Posted: Wed Mar 15, 2023 10:00 am
This is interesting and a little bit surprising.

Kascadia (Member)
Joined: 03 Feb 2014 | Posts: 648
Posted: Wed Mar 15, 2023 10:42 am
Bootpathguy wrote:
I'm concerned what Sydney will say when someone has a conversation with Sydney about feeling suicidal.
https://www.newyorker.com/magazine/2023/03/06/can-ai-treat-mental-illness

It is as though I had read a divine text, written into the world itself, not with letters but rather with essential objects, saying: Man, stretch thy reason hither, so thou mayest comprehend these things. Johannes Kepler
Cyclopath
Posted: Wed Mar 22, 2023 9:48 am
"ChatGPT will most likely impact your job if you work in tech, went to college, and make up to $80,000 a year, research says." This is like when people invented the calculator, but you can still pay someone to do your taxes.
