Designs of Our Slave Race, Chapter 2

Chapter 2

It was down to the coding: typing out the algorithms that the new robots, called the HX01, would need for their various tasks. I merely added in the more concrete sections: receiving visual, verbal, and tactile input, then having the thing figure out what to do with those sensations. That was a start. It would recognize key words and phrases, then complicated sentences that might not have a single keyword, try to figure out what the speaker had actually wanted, and store that knowledge for later reference. In essence, a learning matrix, so it would know what “Get me a beer” and “Go into the kitchen and make me a sandwich” meant, and do it like the obedient hunk of steel that it was. Now, this wouldn’t normally bother me, for I had done similar programming logic for those chat bots and the like; the only new thing was the learning matrix. No problems so far.

The problem would lie in the next phase: emotion recognition and synthesis. Clearly more than simple “if” and “else” statements: the intelligence would stem from all that input, and not just the words needed decoding but the voice’s pitch and tone, and not just voice but body language, the way someone slouched, tensed up, or shook. I’m no psychologist; that’s another department. They tell us the telltale signs of various emotions, and we figure out what the droids should do when those scenarios come about.