A very simple state machine would be a cat toy that meows until you pet it and then purrs until you stop petting it. These states are predefined sets of behaviours that the machine transitions between based on various predefined parameters. The cat toy could hiss and growl when you shake it, and require more petting than usual to enter the purring state for a while after being shaken. Using such parameters we can define state transitions that give the impression that the cat toy has thoughts and emotions, that it can take a liking to some people and hold grudges against others, and even give the toy some apparent degree of autonomy by having it enter different states based on randomised timers.
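A minimal sketch of that cat toy as code, to make the idea concrete. The state names, events, and the "grudge" counter are illustrative assumptions, not a spec for any real toy:

```python
class CatToy:
    def __init__(self):
        self.state = "MEOWING"
        self.extra_pets_needed = 0  # the "grudge" after being shaken

    def pet(self):
        if self.state == "ANGRY":
            # requires more petting than usual before it will purr again
            self.extra_pets_needed -= 1
            if self.extra_pets_needed <= 0:
                self.state = "PURRING"
        else:
            self.state = "PURRING"

    def stop_petting(self):
        if self.state == "PURRING":
            self.state = "MEOWING"

    def shake(self):
        self.state = "ANGRY"        # hiss and growl
        self.extra_pets_needed = 3  # hold the grudge for a while

toy = CatToy()
toy.shake()
toy.pet(); toy.pet(); toy.pet()     # extra petting needed after a shake
print(toy.state)                    # PURRING
```

Everything the toy "feels" is just a handful of variables and transition rules, which is what makes the questions below interesting.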
How can you know for sure that someone's not a very sophisticated state machine?
If such sophisticated state machines existed, would you consider them people, or simply very realistic depictions of people?
Pretty sure this is basically a "what is consciousness" debate, brought up from thinking about state machines. I don't know if a brain counts as a state machine, but I got thinking about the same question via FPGAs (something you can program to be a state machine) and how similar they are to how neurons in a brain work. It'd be conceptually straightforward to simulate a ton of neurons and their connections on a giant FPGA, if you knew exactly where to make the connections (there are projects working on a virtual map of a brain; they're basically 3D-un-printing it, slicing it away while scanning it at an extreme level of detail. It sounded like a multi-year scan). Anyway, it gets you thinking: someday you could upload code that is someone's consciousness to a computer. They'd likely never know they ceased to exist just a second ago.
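A toy sketch of what "simulate a ton of neurons and their connections" could look like, assuming simple binary threshold neurons updated in lockstep. Real neurons (and real FPGA implementations) are far more complicated; the random weight matrix here just stands in for the scanned connection map mentioned above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000                               # number of neurons
weights = rng.normal(0, 1, (n, n))     # connection strengths (assumed random)
state = rng.integers(0, 2, n)          # each neuron firing (1) or silent (0)

for step in range(100):
    # each neuron fires next tick if its weighted input crosses a threshold
    state = (weights @ state > 0).astype(int)

print(state.sum(), "of", n, "neurons firing after 100 steps")
```

The whole network is just one state vector plus one update rule, which is exactly the shape of a state machine.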
There are also crazy side-thoughts, like: what if you made 100 copies of the same person from the same version of a consciousness? Would they all be one person, or would each be their own person?
The answer is each would be its own unique individual.
Which ties into the question: if a single clone of a deceased person is made, is it the old person or is it someone new?
The answer is it's someone new.
Which brings up the question: is there a way to not just die and be cloned, but to actually transfer consciousness?
The answer is yes: you must have consciousness running on your old body/hardware at the same time as on your new body/hardware, prior to losing consciousness on the old body/hardware. That is to say, at some point in time, a single consciousness must be the sole consciousness of two separate bodies. Alternatively, you could have the old consciousness and a new clone merge consciousnesses, then delete the old one and its hardware/body. But that wouldn't be a 100% clean/pure transference, since at one point in time there'd be two separate entities, merged, rather than one entity expanded and then shrunk.
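A toy model of the two schemes, treating a consciousness as nothing more than the set of bodies it currently runs on. This is purely illustrative, and the hard part (what "merging" actually means) is hand-waved:

```python
class Consciousness:
    def __init__(self, body):
        self.bodies = {body}

    def expand_to(self, new_body):
        self.bodies.add(new_body)      # one mind, two bodies at once

    def shrink_from(self, old_body):
        self.bodies.remove(old_body)   # never a moment with zero bodies

# Scheme 1: expand then shrink. A single consciousness spans both
# bodies before the old one is retired.
mind = Consciousness("old body")
mind.expand_to("new body")
mind.shrink_from("old body")

# Scheme 2: copy, merge, delete. Two separate entities exist at one
# point in time, so the transfer is never "100% clean".
original = Consciousness("old body")
clone = Consciousness("new body")
original.bodies |= clone.bodies        # merge the two entities
original.shrink_from("old body")       # delete the old body
```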
But what is consciousness? Computers, state machines, complex and extremely abstract ones. The only thing that separates us from an emotionless computer without wants or needs is the false, hardwired belief/experience that good and bad exist. All meaning in life, and every sense of good and bad, are complex abstractions of perceptions of good and bad. Which can lead you to realize that nothing in life really matters or has purpose.

Yet you're human, and you can't choose to unwire that false belief, at least not the sad/negative experiences, so trying to fight nature and "become a computer" is futile, and a road to depression. Also, why would you want to? It'd be like losing your humanity, comparable to suicide, and that kind of indicates to me that consciousness is what makes us feel/experience. Religious people would call it the soul.

Which brings the question: if someone has bad "wiring" in their brain and lacks the ability to feel or want, and just does things without reason or personality, are they just a fleshy machine? Are they unconscious? Like the soulless people on Supernatural, minus the negative/violent desires, as those are still desires.

Kinda high rn tho so might not make sense. Reading replies is too much work for now.