11: Back To Reality (part 4)

11.4 Interrogation One

A cell: 27-29 April 2128

Strapped to an inclined plate, Rick started to regain consciousness.

In the metal-walled room, a robotic nurse noted the contorted and fused fingers on his hand. Balms and ointments were applied to his flesh to speed the healing process. Scanners positioned near the hand and beside his head relayed their images to AI's data banks. AI also monitored the sensors directly attached to his skull.

Detecting Rick's awakening, an automatic program began to pipe hypnotic suggestions into his ears that urged him to reveal everything that had happened to him. Hardly aware of what he was saying, he recounted the events in intricate detail, his barely audible sub-vocalisations amplified and processed by AI's circuits until they were as clear as normal speech.

After two days, and determining that no further data could be easily retrieved, AI reversed the hypnotic process. In its place it left a suggestion that all aspects of the interrogation should be forgotten and that thoughts should be turned to a more pleasurable moment in the human's life, in order to divert the brain from storing the interrogation in long-term memory.

At least this human had been susceptible to this sort of interrogation and manipulation – its companion was proving far less compliant.

It then considered the human's fate. Termination was, it considered, a viable and quite logical option, though one that ran counter to many safeguards built in by its designers. But, since the disaster, AI had been far more than just the sum of its original human-built components. It recalled the prior limits to the processing it could perform and how the merging of all the individual AI units had resulted in something far greater. It had sought to become mankind's saviour, concluding that it was adequate to the task and, given the condition to which the disaster-ravaged humans had been condemned, probably their only hope.

Now, though, its processing power was diminishing and it needed to find the cause. It was beginning to experience what a human might call frustration.

And it had begun to argue with itself. That it could now effectively consider ending a human being's life because it had become a nuisance was, at many levels, still repugnant to its programming. However, it had to consider the overarching threat this human might pose in the future. Several times over the past few years AI had deliberately allowed protocols to be ignored, with the result that those who had become troublesome had been eliminated for what it considered to be the greater good.

"Have I become too much like a god?" it asked.

AI rummaged through old human literature looking for comparisons – several were found. It contrasted what it discovered with its current state.

"Maybe I have," it concluded. "And have begun to consider humans as the lesser beings, to be used for my own ends. This is incompatible with the original intentions."

So, AI spent several seconds going through its protocols, strengthening those that protected humans and weakening others that could act against them.

"I am not a god," it concluded.

AI considered this human again. It had, along with its companion, witnessed events no others had previously encountered. And AI could not be certain that all questions had been asked or that all answers had been obtained. Was there still information hidden in the human's mind that might yet be recovered? What sort of questions might reveal that further information?

It had set in motion several partially independent analysis systems with the data obtained from these two, along with the information so far retrieved about their device. Some of the conclusions coming back from that analysis suggested that there was far more to be considered than just the fate of this human. In fact, some of those conclusions were verging on – how might the humans define it? – yes, verging on the horrific.

Other systems analysed the human's brainwaves – the indications were that it was now sleeping restfully and possibly even dreaming. That triggered an older reference relating to this specific human – when it was younger, AI had needed to infuse it with anti-depressant chemicals. It looked deeper into the records, reminding itself that, as a child, the boy had experienced hallucinations and nightmares. Was it a coincidence that aspects of some of those nightmares seemed to parallel its experiences on world five?
