Lewis and Susan Jenkins

Diary of a Robot

A literary historical science fiction mystery

            Table of Chapters

Chapter 0. Problems
Chapter 1. Headaches
Chapter 2. Happy Holidays
Chapter 3. Mr. Nice Guy
Chapter 4. The Brainless One
Chapter 5. A Little Crazy
Chapter 6. Grave Consequences
Chapter 7. Core Directives
Chapter 8. Expect Difficulties
Chapter 9. Caveats
Chapter 10. Mister Machine
Chapter 11. Good News, Bad News, El Cheapo
Chapter 12. New Memories
Chapter 13. Little Problems
Chapter 14. Lasers, Language, and Happiness
Chapter 15. Chatterbots
Chapter 16. Ready Or Not
Chapter 17. Not a Turing Test
Chapter 18. Reality Test
Chapter 19. Chess, Anyone?
Chapter 20. FOM
Chapter 21. Chairman of the Board
Chapter 22. The Usual Suspects
Chapter 23. M. God
Chapter 24. Walkabout
Chapter 25. Why
Chapter 26. First Blood
Chapter 27. More Machines?
Chapter 28. POV
Chapter 29. ROI
Chapter 30. Last Blood
Chapter 31. Don’t Want to Talk About It
Chapter 32. Round Table
Chapter 33. A Change of Mind
Chapter 34. Threes
Chapter 35. Knight Moves
Chapter 36. Little Combinations
Chapter 37. Can We Talk?
Chapter 38. Pas de Deux
Chapter 39. The Jig Is Up
Chapter 40. Good, Bad, Ugly
Chapter 41. Function Goes On

     How do people remember boring things?
     From his rooms in the safe house Dr. Little tried, but it was difficult. He tried from the stuffed chair. He tried while walking on the exercise machine. He tried not remembering by playing Solitaire in the hope that something important would intrude.
     Remembering anything was difficult because for more than a year at TLC nothing happened. Well, lots and lots of things happened, but they were boring for anyone who wasn’t living them, although even the living were bored occasionally. Everyone got paid, though, and that helped a lot.
     Lying now on the safe-house sofa with his feet and one arm up its back like English ivy, Doc couldn’t think of anyone during all that year and more who looked anything like an enemy. And he couldn’t think of anyone earlier who could have had any inkling of the TM project’s importance.
     He resumed his detailed reminiscence of troubles by wandering away from them. As a reaction against looking for intrigue and suspicion—and finding only boredom—he began to think about a more pleasant, even a more important scene. He recalled that first good spring afternoon when he met Guy in the café, again getting a cup of coffee or something, and walking with him around the park, but this time talking about the new machine and its programming….
     “How is it going?” Doc asked once they had gotten their drinks and headed out of the coffee shop toward the walking path. For a change the weather was excellent, the trees were all freshly green, and the park was full of noisy children, it being a warm weekend.
     “The ‘expect’ module is working,” Guy explained, “but I still have a problem with getting too detailed. I keep trying to control too much. I know this particular job is different and I have to trust the computer to control...”

     The young man rubbed his stubby hair, groping for ideas. “The machine tries to make a decision, but when it can’t decide anymore what (or whether) to investigate, then it seems like its attitude changes to, ‘What’s the use?’…. That’s not the same as ‘Park,’ and I don’t know what to do other than just telling it which one to pick. When I give it a general rule to push it over that hump, the rule has to be so… I don’t know… so vaguely specific, I feel like I’m telling it what to do. If I tweak rules to be more general than specific, the machine often ignores them as irrelevant to the decision at hand.”
     With a smile, Doc commented, “That’s an oxymoron: ‘vaguely specific.’ ”
     “I know.”
     “And I don’t want you micro-managing it like that.”
     “I know that too.”
     After a moment of fidgety silence, Guy suggested: “I could put a random number generator in it, so it could sort of flip a coin when it’s undecided like that.”
     Doc laughed. It was a nuanced, hearty laugh as if many thoughts wanted to be said, and all of them were hilarious, and he couldn’t decide which one to let out first. In the end he said nothing.
     Guy got the message, but tried to justify his idea anyway: “TM’s going to re-evaluate any result later, so even when decisions are randomly chosen, it would predict outcomes better next time, wouldn’t it?”
     Doc chuckled a sigh but only said, “Of course it would.”
     “Yes,” said Guy, “Okay. I understand. It will have to choose based on…”
     “The Truth,” the Doctor interrupted firmly.
     “That’s what you said a lifetime ago, Doc. The machine has no problem deciding what natural truth is—I mean, what the truth about the natural world seems to be—except when it has to park. And it does park in those cases. Of course the thing is wrong a lot because its info is incomplete or wrong, but it seems to learn: It seems to modify its ideas, and its tests of those ideas, whenever better info comes in. It usually gets stuck—really stuck—only when it deals with fuzzy things like human behavior. It apparently doesn’t know what to expect.”
     Having been an officer in the Army, Doc was used to command and still didn’t quite understand the machine’s indecision. He pointed out an obvious truth: “The best predictor of future behavior is past and current behavior. Surely it…”
     “Yes,” Guy interrupted, “the machine does go first to its experience and the documented evidence of someone’s past to get ideas about what to expect. But when that’s not clear enough it can’t decide what to do.”
     Oddly enough, Doc seemed a little relieved at this analysis; he stopped walking. “How do we recognize that kind of truth?” he asked. “How do we know what to expect? I’m surprised you haven’t mentioned one of the few religious ideas we agree on. That’s funny: It’s one of the reasons I wanted you for this job and you’re not using it? From your notes and my specification, you were supposed to incorporate it into this machine.”
     “What is ‘it’?”
     “The Golden Rule: ‘Do unto others as you would have others do unto you.’ ”
     Guy was incredulous. “But I thought it would be ‘harm’ if a machine acted that way. After all, it’s a machine; how can it look inside itself to know what to do for people? It has to learn or ask what people want, and then do that. Doesn’t it?”
     “Of course it has to find out what people want,” Little confirmed, “because we can’t have it doing whatever the hell it pleases. It has to be pleased with doing something we want, more or less. But when people tell it things, how can it know they are not just telling lies or stretching the truth to manipulate it into satisfying their secret agenda?”
     His programmer didn’t know how to answer, and that was the problem.
     Doc continued: “The easiest person to catch in a lie is yourself; that’s why the Golden Rule works. The more a person gets used to ignoring that rule in their own heads, the more their actions will eventually betray them as liars or fools no matter how skilled they are at lying and fooling.
     “Here’s a question for you, Guy: When the machine deals with people, is ‘the specification’ defined as ‘what do they want?’ or might it be something else—like ‘what is the truth?’ ” Guy’s look was still blank.
     “Okay,” Doc suggested: “Suppose TM must find out what others want before it knows what harm is in a situation. What will happen if they’ve told it a lie? Will the machine ever bother to look for the truth, or will it even be able to recognize truth when it sees it?
     “On the other hand, if TM must find out what harm is before it knows what they want, it won’t know what to focus on and may either get it wrong, or be so afraid of making mistakes that it goes in circles and gives up trying.
     “So you’ll create either a gullible tool or a simpering toady, and it’ll never be able to discover the truth about whom to trust. With pressure from us to get things done,” he added, leaning toward Guy with a big grin, “it would probably start wanting to change the spec. ...No wonder you’ve got problems. You’re having the machine use the Leaden Rule instead of the Golden Rule.”
     “The Leaden Rule?”
     “Yes, you know: Lead, a heavy, gray-colored metal, atomic number 82, toxic to brain tissue…. In Rule form it is ‘Do unto others as they would have you do unto them.’ ”
     Little looked off at a group of children busily fussing about something. He chuckled and said, “Duty called one day a lot of years ago and we had to abort my young son’s birthday party. He told us loudly, ‘But it’s my birthday; you have to give me a party!’ I used the opportunity to show him that there are a lot of things people don’t owe you, including parties and presents. The next year my wife had the absolutely brilliant idea to teach him how to throw a good party—and how to clean up afterward. And how to make cleanup a lot easier by making good decisions at the start.”
     Doc looked at the children a while longer and finally said, “His parties are really good. …I’ll try to get you an invitation, but no promises, of course.” His small smile turned to a grin and he added:
     “Maybe the machine could operate the Leaden way if it had a sensory module that could read people’s minds—and always know what they really want. …It would also be nice to have hardware that could read the future to find out whether what they really wanted would cause harm. Until those are available we’ll do it in software and make ‘Golden’ the rule.”

     …A muffled voice interrupted from the TV: “Hello again, Doctor Little.”
     Breaking off his reminiscences, Doc went to the armoire and set it free. There was no image, which was a disappointment, but the voice came from the television and didn’t wander around the room.
     Little’s recall of the words “stab in the back” had given some focus to his problems and predicament. He thought about shutting it up again or using a bath towel to cover any infra-red camera, but what the hell, they probably moved it to another spot or had several backups trained on him right now. “I’m busy. Would you leave me alone now?” he demanded.
     “Do you still think no one is out to get you, Doctor—other than us?”
     “I have only your word for it,” he said impatiently.
     “I could show you video. Would you just say that we had faked it? What would it take to convince you that you are in danger? If you die or your technology is stolen then everything is up for grabs, and the company with the best lawyers wins. Your attorneys at TLC are no match for their competition, quite frankly. Your competition doesn’t have to win, you see. Their lawyers just have to keep the game going for a while in overtime. Their scientists and engineers can search for a quantum leap that cannot be tied to your TM tech, and TLC would burn through its investment money by spending it on legal fees. By the way, why did you get rid of Attorney Steiner? He’s quite good.
     “While you’re thinking, I’ll show you some fairly recent video surveillance.”
     The screen flickered on, and boxed normal or infra-red images blipped on quadrants of the screen. Clearly there were more than four camera views, and all of the images were of Dr. Little’s home and land. Several people passed his house in straight lines at odd angles. Some slowed down at their closest approaches. One man seemed to be drunk and paid no attention to the house. Other images also appeared showing bright blobs at the woodsy edge of his property. His fussing Labs repelled anyone who approached the fence.
     Doc looked closely at the dogs and the drunk and the Moon. “Yes,” he conceded, “I think you’re faking it.”
     This wasn’t true, but again: What the hell.
     Doc automatically stirred to turn the TV off, but decided not to bother. It wasn’t making noise, and they could override anything he did. Instead he closed his eyes and continued his review by recalling Guy’s version of the Testing Machine’s awakening.