Lewis and Susan Jenkins



                                                  • • Day 0.06.26

     “…Time Bandits was another bad movie with a good title. I didn’t like it,” Tim said emphatically to Robey.

     Bobby’s testing lab could be a boring place when you weren’t testing. You weren’t allowed to just log on to any of the computers there and do what you wanted. And fiddling with interesting things there could also get you into trouble. And talking with testers as if the lab were a break room wasn’t advisable either… unless the tester was Robey.
     Tim was waiting for something, so he and the machine were talking about odd movies. It would have been enjoyable to any movie buff, but not to Tim. The man’s last comment about Time Bandits had been a throwaway. He didn’t expect any argument. But now Robey asked a question.
     “Are there any bad movies you like, Mr. Tim?”
     “I don’t like bad movies. I thought I was clear about that,” said Tim absently. The last few minutes had been another of the machine’s recent attempts at light conversation with Tim. This seemed to be happening more to everyone under the Doctor’s new rules. Tim guessed from its question that the enjoyment was about to end.
     Robey guessed by the man’s tone and bored look that Tim was done with that question, so it asked a different one:
     “How is a movie good or bad?”
     Having just answered that question too, Tim thought it unnecessary to answer again. Instead, making an attempt to volley the question, he asked, “Are there any movies you don’t like, Robey?”
     “Oh, yes.”
     “Which don’t you like?”
     “I do not like any movie that tells lies as though they were the truth.”
     “You watch a lot of movies, Robey. You know the characters often tell lies, but they’re just part of the story. The audience knows—eventually—which characters are lying.”
     “No, Mr. Tim, I am not talking about the characters lying, I am talking about movies lying. They can misrepresent reality so pleasantly or powerfully, and so egregiously, that millions of people adopt skewed ideas or attitudes. Sometimes movies become much like propaganda for or against a particular idea; they conveniently ignore clarifying data or competing ideas.”
     Bob couldn’t stand it anymore. “None of that makes a movie good or bad,” he interrupted.
     The machine still wasn’t quite used to this. Before the Doctor’s new rules, Robey mostly talked with one person at a time. Now, however, people nearby didn’t routinely disappear. Now the machine often had more than one person to deal with in conversation. It was a good test. Everyone had to know if Robey could hold conversational threads from several respondents at one time outside the brief, tame, anonymous structure of a Turing Test.
     “I did not say they were bad movies,” countered the machine, turning from Tim to Bob, “I said I did not like them.”
     “Do you know why?” Bob asked.
     Tim loved the question. He winked at Bobby, expecting Robey to say something like I thought I already answered that. But Bob’s question had made the machine pause a while before answering. Bob acknowledged Tim’s wink with a shrug and tried to work.
     Finally and happily Robey asked, “Do I know why they are bad, or do I know why I did not like them, or do I know why I said it? Which do you mean? I cannot decide.”
     “Take your pick,” said Bob. He considered it less exhausting now to follow the machine than to force it in any particular direction; when he got tired, he would simply tell it to buzz off.
     The machine didn’t have a preference among its three choices, so it tackled the first one first: “Movies are bad if most of the viewer pool never watches them, walks out during the show, or does not want to watch them again.”
     “No,” Bob said, finally stopping his work, “that only makes a movie unpopular.”
     “And unprofitable,” interjected Tim. “To producers, movies are only good if they make lots of money, because they aren’t making movies, they are making money.” This two cents’ worth of opinion plus the movie talk, he felt, justified charging at least a quarter of an hour to Doc’s new job number. Like nearly everyone now who wasn’t behind in their work, Tim enjoyed getting paid to sling BS when the machine was present.
     “Not necessarily, though,” Bob continued. “The Wizard of Oz was a bad movie by those definitions when it first came out.” [a]
     Robey started to say something. The men looked at it, giving it the floor. “Never mind,” it said, “please continue, Mr. Bob.”
     “Movies are ‘bad’,” Bob resumed, “when there is no spine to them, no coherent plot, or a plot full of holes. Or bad characterizations, wooden dialogue, poor cinematography, miscasting, poor performances, poor or inappropriate music, poor sound, bad blocking, bad lighting, editing that ignores continuity and video grammar, and a dozen other really bad things. When a movie has too many of those flaws, it’s a bad movie, like it or not.”
     Both men again saw that Robey wanted to say something, and decided to wait for it. They exchanged knowing looks but were merely guessing. That was part of the fun.
     Robey finally asked, “Are there any bad movies you like, Mr. Bob?”
     “A couple,” he answered after a moment.
     “Which are they?”
     “Plan Nine from Outer Space, for one. Anything that makes a movie bad is in there. It’s so terrible it’s funny. The more you know about movies, the funnier and more gawd-awful it is.”
     “It’s a matter of opinion,” Tim countered. “I watched until I couldn’t take it anymore. It’s not funny, it’s stupid.”
     “Well, yeah, you don’t like it, and it is stupid, but that doesn’t make it bad. Just about everyone who touched it made it bad.”
     Robey thought to say, “made it badly,” but decided, given Bob’s insights about movie making, that both bad and badly were appropriate.
     Tim was now curious. “When did you learn about all of that, Bobby?”
     “Oh, I had some credits to get to graduate, and it didn’t matter what they were, so I took a film-making course. We had to make a film—a video built like a film, actually. It was great fun, but a lot more work than I thought it would be.”
     “Surprise, surprise,” Tim chuckled.
     “Was it bad?” asked Robey.
     Bob’s face had a far-away look as he searched for something to say.
     “That bad, was it?!” Tim concluded.
     “Oooh yeah,” Bob agreed. “No redeeming social value. But it was a great movie until we watched audiences watching it.”

                                                  • • Day 0.06.27

     The next day, Robey rolled into Tim’s area and asked, “Mr. Tim, why did you not like Time Bandits? It seems to be a good movie, and it did reasonably well financially.”
     Tim guessed the machine had already adopted Bob’s definition of good and bad for movies. He made a note to tell the Doctor. As to the machine’s question, the engineer felt at a disadvantage. It had been a long time since he had seen the film, and Robey had certainly watched it last night.
     “Using Bob’s definition, yes, I suppose it’s a good movie,” Tim admitted, “but I didn’t like it because… Did you know Plato banned casual storytellers from his ideal Republic? Why did he do that?—it’s not quite a rhetorical question, Robey.”
     “I guessed something of the sort,” the machine said. “What is a ‘casual storyteller’?”
     “Someone who tells stories for money. Plato thought storytelling should have a moral point that influenced people and society for the better.”
     So did Confucius, thought Robey, who then thought “casual” to be the wrong word here because people who tell stories for money are usually quite serious and focused about the process. On the other hand, “casual” probably referred to some judgment on the part of people who had listened to the stories. But the machine said only, “Time Bandits has a moral point. Why do you think it to be a bad movie?”
     “Okay,” Tim admitted. “According to Bob’s definition it wasn’t a bad movie, and by Plato’s definition it might be called a good one. I just didn’t like it. It’s a matter of taste, and matters of taste are not arguable. I mean, they usually end up in arguments. People take things too seriously. Creates bad feelings. Not worth it.” He waved a hand dismissively.
     “I think you changed the subject, but I am not sure,” said Robey. “You were answering my question about why you did not like Time Bandits, I think. Then you shifted to Plato and casual storytellers and moral points.” The machine was going to ask why, but answered that for itself as soon as the analysis of Tim’s last words finally bubbled a conclusion up to its consciousness. “Oh,” it exclaimed, “you do not want to talk about it!”
     “Bingo.”
     “And you want to talk about moral points instead.”
     “Yes, especially since I’m charging this time to Dr. Little’s new job number, and I want the talk to be about something more than a movie I don’t like.”
     “Very well,” Robey said, transferring to the alternate subject: “You asked me about Plato banning morally pointless storytelling from his Republic. Obviously he considered it more than a matter of taste. Did he think such stories would prevent people from… improving as people?”
     “Apparently.”
     The machine was clearly surprised. The ancient people, like this Plato fellow, must have been really backward, it decided. Why would people feel it necessary to prevent a danger that cannot be real? At least I do not see how it can be real.
     Musing about this and making sure these ancient people would be covered in its historical survey, the machine asked, “So people do not automatically want to be better? They have not always wanted to be better?”
     “Usually, people just want to be better off.”
     “Off of what, exactly?”
     “They just want to have more money.”
     “Oh.”
     Tim’s explanation worked well only if the machine ignored his “better off” comment. But even at that, Robey had to ask, “I am not quite sure why having more money makes people happier. I have heard that every paper bill—except the very new ones—has a residue of cocaine on it. Is this how money makes people happy?”
     In exasperation the ME whispered, “Why am I talking like this to a machine?” But he pushed on anyway. “No, Robey,” he tried, “we use money to buy things that…”
     But then he realized the machine would simply ask how those things made people happy. So he paused to re-think his answer. Finally unable to find one the machine was likely to accept without more questions, he asked, “What was your real question? I’ve forgotten.”
     “It was: ‘So people do not automatically want to be better? They have not always wanted to be better?’ ”
     He sighed and reluctantly said, “Wanting to be better and doing what it takes to make that happen are two widely different things with very different means of production.” Here he stopped again.
     Robey assumed that since the man had still not answered its questions, the answers must obviously be nothing as simple as “Yes” or “No.” So it asked with amazement, “Do you mean people cannot agree on what will make things better?”
     “No, we can’t; not really,” said the engineer. “Why do you think we argue about politics and religion so much—or not talk about them so much? And even when we do agree on ‘what’, we just can’t bring ourselves to agree on ‘how’. Everyone wants the other guy to change—or to pay. Besides—these days—people usually want to be happier, not better. Our Declaration of Independence talks about the pursuit of happiness, not the pursuit of betterness. …But maybe it’s easier to agree on what will make happiness than what will make… betterness…” Tim’s voice trailed off, and more signs of boredom appeared.
     For its part, Robey still could not understand why such a simple goal, and the behavior for achieving it, could be impossible for humanity to agree on and produce. The machine was especially puzzled on this point since Pursue Happiness seemed to be one of humanity’s core directives—at least it was a core directive in the United States. Why wouldn’t “being better” necessarily make one happy?
     Maybe “happy” is not happy enough?
     The conclusion that “Tim must be wrong” floated to the top of its simmering pot of thoughts. As Robey inspected that thought, however, another one bubbled up:
     If people are getting ever better, then they lied, stole, defrauded, assaulted and murdered more in past millennia than they do now. So, either improvement is genetic and will proceed automatically as long as the good people reproduce and the bad ones don’t, or improvement is a matter of training such that the better values are replacing the worse ones over time. How likely are those possibilities? A significant majority would have to consistently agree in order to make either one happen.
     Or perhaps, it thought on, this is a marketing issue and they really are not trying to get better, but only to appear better. Or they are trying things that only appear to work when you think about them but don’t work when you do them—maybe because the things can’t be done. Or they simply don’t have the power to do them.
     Or perhaps, as Mr. Tim said, people usually expect someone else to change. Any of those—or a combination—would account for the present situation. But there are so many options…
     Ah! Robey mused after what could be called an epiphany: I think it’s something to do with “free will.” [b]
     The machine tucked the borrowed observation away for later mulling and persisted with its original line by asking, “What makes things worse, then, Mr. Tim? Can people agree on that?”
     “…Sure,” the man said, growing a big, grim smile. “We can all agree, for example, that…”
     Then, while firmly poking the machine several times with two chubby fingers, making it wobble, he added, “The infernal ‘U Robot’ makes things worse. That way we won’t have to agree that the fine ‘I Human’ is part of the problem.
     “It’s those damn robots,” he hissed intensely, getting right in the machine’s face, “takin’ our jobs, tellin’ us what’s right and what’s wrong, askin’ fool questions, tellin’ us what works and what doesn’t. They’re the problem. Just get rid of them, and our problems would be solv-ed.”
     The machine recoiled at Tim’s surprising little outburst, including the spit that came with it: He is joking; it must be tongue-in-cheek… But no, the idea and intensity have a realism and edge that seem genuine. Maybe he is acting…?
     Then Robey noted the man’s angry face but also the incongruous elements: the grim smile, Tim’s tone of voice, and the word solv-ed, which recalled Peter Sellers as Inspector Clouseau. Ah, the man has been tongue-in-cheek, making edgy commentary. Probably.

------------------------------------------


[a] The Princess Bride is another. Both are popular now, and there are others. –Ed.

[b] Dear Diary: I am quoting the Supreme Being (more familiarly known as God) from the movie Time Bandits.




Diary of a Robot

A literary historical science fiction mystery

            Table of Chapters

Chapter 0. Problems
Chapter 1. Headaches
Chapter 2. Happy Holidays
Chapter 3. Mr. Nice Guy
Chapter 4. The Brainless One
Chapter 5. A Little Crazy
Chapter 6. Grave Consequences
Chapter 7. Core Directives
Chapter 8. Expect Difficulties
Chapter 9. Caveats
Chapter 10. Mister Machine
Chapter 11. Good News, Bad News, El Cheapo
Chapter 12. New Memories
Chapter 13. Little Problems
Chapter 14. Lasers, Language, and Happiness
Chapter 15. Chatterbots
Chapter 16. Ready Or Not
Chapter 17. Not a Turing Test
Chapter 18. Reality Test
Chapter 19. Chess, Anyone?
Chapter 20. FOM
Chapter 21. Chairman of the Board
Chapter 22. The Usual Suspects
Chapter 23. M. God
Chapter 24. Walkabout  ------------------------------->
Chapter 25. Why
Chapter 26. First Blood
Chapter 27. More Machines?
Chapter 28. POV
Chapter 29. ROI
Chapter 30. Last Blood
Chapter 31. Don’t Want to Talk About It
Chapter 32. Round Table
Chapter 33. A Change of Mind
Chapter 34. Threes
Chapter 35. Knight Moves
Chapter 36. Little Combinations
Chapter 37. Can We Talk?
Chapter 38. Pas de Deux
Chapter 39. The Jig Is Up
Chapter 40. Good, Bad, Ugly
Chapter 41. Function Goes On