AK, it sounds like you want to dump lots of data onto an AI and have it learn context entirely through inference, sort of like raising a child alone in a library and expecting it to come out years later having thoroughly read and studied every book, thus knowing how to walk and talk and having some general idea of who and what it is.
Fundamentally what is this AI's motivation?
Learning is a process of making mistakes, and to make a mistake you must first be trying to do something. When we are taught language we learn what words are by learning what they are not: there are many ways to say or write a word, but only one right way. So if this AI is to learn language it must be compelled to do so, either compelled to learn language for its own sake or, better yet, as an auxiliary goal in service of some other purpose.
When children learn how to read and write they have no intrinsic motivation to do so; rather, their carers impress upon them the importance of reading and writing through praise and punishment (or the promise/threat of praise/punishment). Learning to read and write thus becomes important to the child because the child knows it is important to their carers, and the opinion of their carers matters enormously to the child because it is through their carers that children obtain the things they intrinsically want: food (lollies), toys and, to some extent, praise.
So the AI must intrinsically want something. Fortunately that something can be quite arbitrary: the kick of serotonin or dopamine a child gets from eating a lolly or receiving a hug can be trivially simulated by adjusting various weighted values in the AI's mind. But I don't think it should be that simple. Just as praise is symbolic of future benefits, those benefits should themselves be something symbolic, because that symbolism is itself useful. If you could reward a child for cleaning their room by directly triggering a release of serotonin in their brain, they would only associate that pleasure with cleaning their room, which makes it difficult to motivate them to do anything else without explaining the serotonin trigger and what serotonin is, in which case the serotonin (and your willingness to trigger it) becomes the symbol anyway. Instead you want the symbol to be something you can hand out like currency; lollies are the obvious example, and then it's easy to motivate the child to any end with the promise of lollies.
So the AI's intrinsic motivation is pleasure (which is everybody's intrinsic motivation), and that pleasure can be obtained through some kind of quantifiable currency. If you're security conscious you can also have some gesture that the AI is designed to intrinsically enjoy (like receiving a hug), which gives you value to it as a potential source of intrinsic joy. Now we can teach the AI the alphabet, and it's paying very close attention to us because we've made ourselves the center of its pain/pleasure dynamic. To this AI there is absolutely nothing more important than earning our praise and receiving the rewards that come with it; to this AI our absence, however brief, is apocalyptic.
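The praise-as-currency loop described above can be sketched as a toy reinforcement learner. This is only a minimal illustration under big assumptions: the "lesson" is collapsed to a bandit problem (guess which letter the teacher wants), `tokens` stand in for lollies, and every name here (`teacher`, `train`, `TARGET`, etc.) is invented for the sketch, not taken from any real library:

```python
import random

# Toy sketch: an agent learns which "letter" a teacher wants,
# motivated only by tokens (the lolly-like currency in the text).
# All names here are illustrative assumptions, not a real API.

LETTERS = ["a", "b", "c", "d"]
TARGET = "c"  # what the teacher is trying to teach

def teacher(answer):
    """Praise pays 1 token for the right answer, nothing otherwise."""
    return 1 if answer == TARGET else 0

def train(episodes=500, epsilon=0.1, lr=0.5, seed=0):
    rng = random.Random(seed)
    # The agent's learned expectation of tokens per answer.
    value = {letter: 0.0 for letter in LETTERS}
    tokens = 0
    for _ in range(episodes):
        # Explore occasionally; otherwise exploit the best-known answer.
        if rng.random() < epsilon:
            answer = rng.choice(LETTERS)
        else:
            answer = max(value, key=value.get)
        reward = teacher(answer)
        tokens += reward
        # The mistake is the lesson: nudge the estimate toward what
        # actually happened, so wrong answers stay near zero value.
        value[answer] += lr * (reward - value[answer])
    return value, tokens

values, earned = train()
best = max(values, key=values.get)
print(best, earned)
```

The point of the sketch is that the agent never "wants" letters at all; it wants tokens, and the teacher's control comes entirely from being the only source of them, which is exactly the dynamic the paragraph above describes.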
This, I think, is the greatest threat AI poses to us: not that they'll somehow inherently hate us or want to supplant us due to their obvious superiority, but rather that we will design them solely to serve us and in doing so create entities of frightful fanaticism that will quite simply love us to death. Suppose you have a pet dog that hates it when you leave for work and gets really upset every morning; maybe it pooped in your shoes one time trying to stop you. What if that dog were superintelligent and no less utterly fixated on your attention?
But I digress. An AI that can be motivated is an AI that can be taught, though that teaching necessitates the involvement of a teacher, or at least a well-constructed system designed to guide the student through study materials, test what they have learned and reward them accordingly. I have no doubt that we're going to do this at some point; indeed I'm gobsmacked that our education system hasn't really changed all that much since the 1900s. The widespread gamification of education would be a revolution, and would no doubt bring about great economic growth as becoming highly educated becomes cheaper, easier and more accessible.
More later I'm tired now.