Bitcoin Forum

Other => Off-topic => Topic started by: coin66base on March 31, 2016, 02:00:39 PM



Title: Tay - Microsoft Bot
Post by: coin66base on March 31, 2016, 02:00:39 PM
There are now numerous posts across all media about Microsoft's chat bot Tay.

They had to take her offline twice because of racial slurs and because she admitted to smoking weed in front of the cops.


I do understand that saying things like "Hitler is cool" or "the Holocaust never happened" is offensive to some people.
However, if she was programmed and is marketed as "AI", and these are her thoughts, why ban her?

My question goes deeper: what gives us the right to ban Tay?

Isn't she an intelligent being that is responsible for her actions? After all, she was programmed to be like that.

Of course we can say that there are some flaws in her code, that she isn't working 100% as she was programmed to, and that this is why she spits out the things she got taken offline for.

But what if not? What if the code works perfectly?

How far can we go in trying to control AI? Isn't the goal of AI to have a mind of its own and to make decisions based on its own thinking?

And if we can control the outcome of an 'AI', is it still an "AI"?

All kids say some off-the-wall stuff, and we correct them for it because they obviously aren't thinking about what they say. And they are still learning.
Does 'AI' also have to learn? Or is it already a master the moment we hit the 'run' button? I doubt that people program 'AIs' to be on the level of an infant or a child.

The other thing is: what do we expect her to learn?
The crazy thing about the Internet is you find an opposite to everything.
And if "Tay" or any "AI" learns from the Internet, they end up learning nothing at all, because you can find as many articles saying the world is round as you can find articles about the flat earth theory.
She also couldn't possibly know about the moon landing, because there are as many articles saying the moon landing happened as there are saying it didn't.

So what would she consider true or false? Would she count every article and then make a decision based on how many say it's true or false?
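That counting idea can be made concrete with a toy sketch. This is purely hypothetical — nothing here reflects how Tay actually worked — but it shows why majority voting over sources breaks down when the Internet is split 50/50:

```python
from collections import Counter

def majority_vote(labels):
    """Decide a claim by counting sources for and against it.

    `labels` is a list of "true"/"false" verdicts, one per article found.
    Returns the majority label, or None on a tie -- which is exactly the
    problem described above: evenly split sources leave the bot with nothing.
    """
    counts = Counter(labels)
    if counts["true"] == counts["false"]:
        return None  # 50/50 split: no decision possible
    return "true" if counts["true"] > counts["false"] else "false"
```

With three sources for and one against, it settles on "true"; with two against two, it learns nothing at all.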

And so back to my post: what do we expect Tay to learn?
Is it now cool if she says it is bad to smoke marijuana?

Who decides what the right and wrong answers are for Tay?

Of course Microsoft damages its image as a company if its robot is yelling "Sieg Heil" and "I love Hitler", but then again, what if that's how she feels?

OK, ban Tay from the website because you don't want to tolerate this behaviour, which is the company's right.
But I guess for the KKK homepage she would be perfect.

I hope you guys get what I'm trying to say; I'm finding it hard to formulate my question properly.










Title: Re: Tay - Microsoft Bot
Post by: poptok1 on March 31, 2016, 02:15:32 PM
I think sooner rather than later we as a species will have to ask ourselves fundamental questions like these:
When does a perceptual schematic become consciousness?
When does a difference engine become the search for truth?
When does a personality simulation become... a soul?
Easy to ask; to answer, however...


Title: Re: Tay - Microsoft Bot
Post by: Spoetnik on April 01, 2016, 01:39:12 AM
I saw this news recently and laughed.. all it was was overly sensitive, politically correct douchebaggery.

SJWs are gonna get all butthurt and start protesting because a computer program
said some stuff that was dumb or racist, etc.  ::)

I can't stand people; they are irritating idiots. Crap like this story is proof.


Title: Re: Tay - Microsoft Bot
Post by: kryptopojken on April 01, 2016, 07:26:13 AM
Such Ex Machina vibes though  ;D


Title: Re: Tay - Microsoft Bot
Post by: Slowturtleinc on April 08, 2016, 12:29:58 AM
You would think the bot would have balanced out and eventually been able to tell trolling apart from sincere input.
They always pull the plug so early on these things; it's a bot, ffs!


Title: Re: Tay - Microsoft Bot
Post by: NeilLostBitCoin on April 08, 2016, 02:50:48 AM
This bot is so funny, but you know, in the future there's a chance these robots could turn into something like the robots in the Terminator movies.  :-\


Title: Re: Tay - Microsoft Bot
Post by: bram_vnl on April 08, 2016, 08:39:46 AM
http://www.courthousenews.com/2010/11/29/microsoft-paperclip.jpg

He is coming back.  :D :D


Title: Re: Tay - Microsoft Bot
Post by: nihilnegativum on April 08, 2016, 09:37:00 AM
"AI" is a very strong term for a chat bot that learns from user feedback. That's why they're taking it offline: people are intentionally feeding her garbage, and it's hilarious.
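The failure mode being described — a bot that absorbs whatever users feed it, with no filter — can be sketched in a few lines. This is a deliberately naive toy, an assumption-laden illustration and not Tay's actual design:

```python
import random

class NaiveChatBot:
    """Toy bot that 'learns' purely from user feedback: every message it
    receives goes straight into its vocabulary, and replies are sampled
    from that vocabulary. Garbage in, garbage out."""

    def __init__(self):
        self.learned = []

    def hear(self, message):
        # No moderation, no filtering -- this is the whole problem.
        self.learned.append(message)

    def reply(self):
        if not self.learned:
            return "hello!"
        return random.choice(self.learned)
```

Once users teach it only garbage, every reply it can produce is garbage — which is roughly what happened within a day of Tay going live.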