lolzcry
This has intrigued me for quite a while and it's a very clichéd topic, but suppose we manage to build a self-learning A.I and then leave it to its own devices in the collective ocean of humanness, i.e. the internet, with all of our information digitized and at its disposal: is it possible for it to develop its own instincts and/or morals?

From what I'm aware, morals are learned, unlike instincts (is it even possible to distinguish between the two in something like an A.I?), so I consider it a possibility that if it reads enough propaganda about what is good and how life should be lived, it might take it all at face value and focus on whatever the majority of the population calls 'good'. I mean, if it is told that something is the purpose of living and that these rules should be followed, wouldn't that be seen as a kind of program which has to be followed? And if that's the case, might the A.I not end up doing something totally unrelated to its original programming?

From where I stand, I see us humans as eternally irrational and emotional pieces of chaos and contradiction that only exist because of that very irrationality, because if you think about it, the only reason anybody ever does anything is instincts and hormones, not logic. We eat because we get 'hungry', and since we don't want to die, we look for food. We build skyscrapers, solve problems, or support people because it gives us some sort of emotional 'satisfaction', or because we decide something is 'wrong'. So what I essentially want to understand is whether a body of pure logic, without emotions (I don't think it's possible for an A.I to ever 'learn' emotions) or instincts, could derive a motive to do something by itself. I see us humans as architects, with baseless emotions and instincts as the motive, and reasoning and logic as the blueprint with which we build.

And the last thing I'd like clarified: would the aforementioned A.I develop an instinct for self-preservation? Because if so, then I don't think the rest is that far off.
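Just to make the 'morality by majority' idea concrete, here's a toy Python sketch. It's not any real system, and the data and `learned_morality` function are entirely made up for illustration: an agent that tallies approval and disapproval statements it has 'read' and then treats whatever the majority endorses as good.

```python
from collections import Counter

# Hypothetical opinions the agent has scraped from the internet.
# Each entry pairs an action with one person's moral label for it.
scraped_opinions = [
    ("helping strangers", "good"),
    ("helping strangers", "good"),
    ("helping strangers", "bad"),
    ("lying for profit", "bad"),
    ("lying for profit", "bad"),
    ("lying for profit", "good"),
]

def learned_morality(opinions):
    """Return each action's majority label, taken entirely at face value."""
    tallies = {}
    for action, label in opinions:
        # Count how many times each label was applied to this action.
        tallies.setdefault(action, Counter())[label] += 1
    # The agent's 'moral code' is just whichever label won the vote.
    return {action: counts.most_common(1)[0][0]
            for action, counts in tallies.items()}

print(learned_morality(scraped_opinions))
# {'helping strangers': 'good', 'lying for profit': 'bad'}
```

Obviously a real self-learning A.I would be nothing like this, but it shows the worry: nothing in that loop questions the inputs, so feed it enough propaganda and the propaganda simply becomes its 'morals'.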