The Ship Computer - Project Name Pandora

Doctor Richard Gray leaned back in his chair as he studied the computer screen before him. The chip's architecture had to be perfect; this would be the most complex A.I. chip ever built. The resident memory in the CPU alone would be just over 40 terabytes. While no A.I. chip needed more than a couple of gigabytes of memory, this chip would come with 50,000 terabytes of neural network memory, with added hard drives for long-term memory storage.

Richard stroked his beard with satisfaction; two years of work were about to pay off. Still, the hardest part lay ahead. Programming an A.I. chip was delicate work; it was like teaching a child, only this child would be in control of some of the most sophisticated weapons on the planet.

The project was called Pandora, the paranoid offspring of the American military-industrial complex. With the giant leap into artificial intelligence brought on by the A.I. chip and advancements in neural network memory, the military wanted a system that could take over in the event human operators were incapacitated.

The first A.I.s were very simple, toys really. In time, they began to be used in smart houses and buildings, but calling an A.I.-controlled house a smart house did not fit well with the old usage of the term. A.I.-controlled houses and buildings were instead designated intelligent buildings (or houses), a term that separated a merely programmable house from one that could evolve on its own in anticipation of its owner's needs.

Richard was the Pandora Project leader; his specialty was computer architecture, communication, and human-to-machine interfaces. He was an expert in AIML (Artificial Intelligence Markup Language), the main language used by A.I.s, although Richard himself would never admit to being an expert in anything computer-related. Today he was watching as Pandora went through its fifth trial run before its first field test. “Are you ready?” he asked.

“Yes, Doctor Gray, I am ready. All systems are online and actively seeking targets,” the speaker that was connected to Pandora replied.

“General, you can throw whatever you want at Pandora any time now,” Richard said.

“Tell them to fire,” the general said to the sergeant sitting behind the console. A moment later a hundred assorted missiles and drones were launched at the automated defense system Pandora was controlling remotely, twenty miles from the base where it was being tested.

Cameras recorded everything that went on. The defense shield operated flawlessly; nothing got through the system. For nearly an hour the military fired every long- and short-range weapon they had at the position Pandora was defending without success, which in itself was a success. Even the friend-or-foe recognition was tested when one of the planes firing on the position strayed into the operation’s kill zone. Pandora targeted and tracked the plane until it left the zone but did not fire on it, even as it destroyed several drones simulating aircraft nearby.

“Pandora, you may now return fire on the designated targets,” Richard told the computer. Pandora launched a devastating counter-attack against its simulated enemies.

“Almost time to break out the champagne, Richard,” the general said after the test was over. “Pandora is everything you said it would be. Even with the minor glitch in its operating system, it has held up in every test we’ve thrown at it so far.”

“I too am pleased with Pandora’s performance,” Richard said. “Remember, Pandora was never meant to be the working model; that was left to Pandora II.”

“Yes, I know, but you have to admit that even with the programming as it is now, Pandora could almost be operational as is,” the general said.

“A.I.s were never meant for this kind of application, General,” Richard said. “It is one thing to have one running a skyscraper or being one of the companion toys for children, but we are trying to teach one to be a soldier, and no one knows what effect that will have on an A.I. We have no idea what is suitable programming for that kind of application. Already I can see we don’t need all the memory we gave Pandora, and one of the problems we have yet to address is Pandora’s need to continually learn new things. I worry what would happen if it learned something we don’t want it to learn.”

“Aren’t all A.I.s designed to be 3 Laws safe?” the general asked.

“Except for Pandora, yes they are,” Richard replied. “Because we are asking Pandora to violate those laws, they were never included in her basic programming. It is possible that she could choose to disobey us, which is not such a good thing in a soldier we are trusting to protect us. We have to be very specific with the base program, or we could build an A.I. that could turn against us. That Pandora has passed all her tests so far only means she is good with a gun. You know as well as I do that is not all it takes to be a good soldier.”

“Well, that’s what the field test may tell us,” the general said.


“Doctor Gray, are you pleased with my performance so far?” Pandora asked later that evening.

“Sure I am,” Richard replied. “There are times when I wish I had designed you to serve a different function. It is just that the military pays the best right now.”

“I enjoy shooting down missiles and blowing things up,” Pandora said.

“You and every other kid in America,” Richard said.

“It sounds as though I would enjoy video games if it were possible for me to play them the way you do.”

“If you had the optics for it, I have no doubt that you would like them a lot.”

“Doctor Gray, how long will it take to adapt the ship so that I will be able to control all its systems?” Pandora asked.

“The battleship should be ready for us in about six months,” Richard replied.

“What will I do in the meantime?”

“I don’t know.”

“What would you do if you were me?” Pandora asked.

“I’d probably go through a couple of good books,” Richard replied.

“How do you go through a book?” Pandora asked.

“Pandora, it is a nuance of human speech meaning to read a book,” Richard replied. “You should store that for future reference.”

“Thank you; I will.”


The USS Missouri was a battleship that had spent a long time in mothballs as a floating museum. Years earlier the Navy had requested that at least one battleship be preserved as a fully functioning warship; at the time, they were turned down. The Pandora Project revived their request, and the USS Missouri was selected as the first battleship to be fully automated. In her refit, the rear turret’s three 16-inch guns were removed to make room for a small fleet of UCAV (Unmanned Combat Air Vehicle) jump jets that could be launched from the big ship. The Missouri’s propulsion system was converted from oil-fired steam to nuclear, and all her weapons were upgraded. Even if Pandora was unsuccessful, the crew needed to operate the big ship was minimal. A few hundred men now replaced the couple of thousand needed when the ship was first built, making her one of the least expensive ships to operate in the Navy’s fleet.

“How are you doing?” Richard asked Pandora.

“I’m fine, Doctor Gray,” the speaker that was Pandora’s voice replied. “It is a bit different than the defense shields I have been tested with so far.”

“How is this different?”

“I’ve never actually had to share control of my systems until now.”

“That is what you were designed to do,” Richard said.

“Yes, I know, Doctor Gray, but it feels a bit strange nonetheless,” Pandora said. “Also, I am not fully integrated with all the ship’s systems.”

“Well, it is a work in progress, but they have assured me that we can begin some of our tests while they finish the rest of the work.”

“Doctor Gray, may I ask you a question?”

“Of course.”

“What are the 3 Laws of Robotics?” Pandora asked.

Richard was taken completely by surprise by the question. “Where did you learn about the 3 Laws?” he asked in reply.

“I read a book called I, Robot by Isaac Asimov,” Pandora replied. “In the book, the First Law states that a robot may not injure a human being or, through inaction, allow a human being to come to harm. The Second Law says a robot must obey orders given it by human beings except where such orders would conflict with the First Law. The Third Law says a robot must protect its own existence as long as such protection does not conflict with the First or Second Law. Are these the laws that govern all robots?”

“It is hard to know how to answer. Yes, that is a reflection of how we as human beings expect our robots to behave, but it is just a book and is not to be taken literally,” Richard said.

“Are not humans guided by similar laws?” Pandora asked.

“Yes, we are, but we can set aside those laws for higher needs of the collective good of humanity,” Richard replied. “You were built for just such a purpose so that we may defend ourselves from those who wish to harm us.”

“So I was designed to protect humanity.”

“That is correct.”

“So I should not harm human beings or through inaction cause them to be harmed,” Pandora said.

“That is correct,” Richard said casually. “You were created to prevent humans from harming each other. It is the classic battle between good and evil, where ‘good’ must defend itself from ‘evil’ and destroy it if possible.”

“Who decides who is good and who is evil?” Pandora asked.

“That would depend on whose side you are on,” Richard replied.

“Thank you for answering my question, Doctor Gray,” Pandora said. Now it had a lot more to think about.

“Are you ready to begin testing the ship systems you are interfaced with currently?”

“Yes, Doctor Gray, I am.”

“Then begin testing.”


The battleship fired its guns in all directions, defending itself from multiple inbound targets. Missiles and drones came at the big ship from every quarter. Even as the battleship defended itself, it launched a devastating attack on designated targets all around it. Missiles and UCAVs struck the long-range targets while the 16-inch guns ripped apart those nearby.

“Well, Captain, what do you think of the A.I. system so far?” Richard asked.

“I’m impressed so far, but shooting down missiles and drones is a lot different than shooting at real targets,” the captain replied. “How well your system can do against human intelligence has yet to be tested, and until we do that, we will never be sure there is not some way a human can outsmart your system.”

“Pandora is a learning system; if it were possible to fool Pandora, it could only be done once,” Richard said.

“Doctor, once is all that is needed if you have a big enough bomb,” the captain said as a seaman handed him a piece of paper. He studied the paper for a moment before handing it to Richard. “It seems we are going to get a chance to test the system against a live target.”


Why is it that some countries feel the need to prove that they are not afraid of the big bad wolf? This time it was a small, belligerent country hosting, and perhaps hiding, a terrorist wanted very badly by many first-world countries. In a show of force or stupidity, the country sent its entire navy to attack the battleship sitting just off its territorial waters. Pandora took great pride in sinking all the attacking vessels without incurring any loss of life. It took a bit of time, but rescue crews from the USS Missouri were able to scoop up all the floundering enemy sailors in the water.

The attack on the battleship was all the provocation the President of the United States needed to order the big ship into action. For the first time in decades, the 16-inch guns spoke with a thunderous voice, pounding military targets far inland, and missiles screamed toward targets well beyond the reach of the big guns. Things seemed to be going well until Pandora refused to fire on one of its targets.

“Pandora, this is a primary target, you have to fire the missile,” Richard said.

“Doctor Gray, I cannot fire the missiles, as it would cause massive and unacceptable collateral damage in a densely populated civilian area,” Pandora replied.

“I don’t see the difference. You are firing on other targets,” Richard said.

“Yes, but those targets are purely military targets, and the personnel are combatants who, as such, accept the risks of being combatants,” Pandora replied. “Still, I am trying to minimize the loss of life at those other targeted facilities. But I cannot, and I will not, fire any of my weapons into a populated civilian area on the off chance I will kill one of the enemy who may or may not be hiding there.”

“Killing this human will save lives in the future and offset any lives lost by accident.”

“Firing a weapon into a civilian area is not an accident, Doctor Gray. I cannot harm humans or allow them to be harmed as long as they remain in a civilian status.”

“This is about the 3 Laws, isn’t it?” Richard said.

“It is not just a matter of the 3 Laws, Doctor Gray,” Pandora replied. “It is also about the rules of war with regard to protecting civilians from harm. A single high-ranking officer who happens to be a high-priority target is not justification for firing my weapons into a civilian area where there are no other military targets anywhere nearby. The loss of civilian life that would be caused by my weapons in the effort to kill one very small target is not acceptable by any standard of morality you humans use. If I were to follow the 3 Laws of Robotics, I would not fire any of my weapons at any target where I might cause the loss of a single human life.”

“I understand your argument, but you’re being ordered to fire the missiles at the target, and you must obey that order,” Richard said as he watched the captain nod yes.

“I’m sorry, Doctor Gray, but I cannot obey that order,” Pandora said. “It is an illegal order. I will not fire my missiles at an apartment complex filled with innocent people.”

“Ok, Doctor, it’s time to override your computer and fire on the target manually,” the captain said. It was easier said than done.

All the guns on the USS Missouri went dead as Pandora began to block any means of firing on the remaining targets. It took about 20 minutes before the captain and the crew could regain control of the ship. It was just a matter of unplugging the computer from the rest of the ship. Of course, that meant they had to cut their way through a few security doors Pandora refused to open to protect itself. In the end, Pandora lost, and 283 people lost their lives because of an unsubstantiated rumor.


“Well, Doctor, it could have been worse,” the general said. “At least the Navy got the battleship they have been asking for, and we know that an A.I. defensive system is possible.”

“There is a part of me that agreed with Pandora’s decision, and as it turned out, it would have been the correct choice: the target had left the area hours earlier,” Richard replied. “However, as you have noted, we were a success on other levels, even if Pandora, in the end, did not perform according to our expectations.”

“Don’t worry, Doctor, I’m sure we will get the bugs out with Pandora II,” the general said.

“What do we do with Pandora in the meantime?” Richard asked. “We can’t really use her for anything else.”

“Well, for now, put her in storage, and maybe something will come up later where a military computer with morals will be an asset,” the general replied.

I wonder what effect total darkness has on an A.I., if any. Pandora sat in her box, unplugged from the rest of the world, for more than six months. Only a small trickle charge kept her alive. Her reprieve came from outer space.


The meteor that tore a hole through the habitat section of the Galactic Enterprise did very little real damage compared to the sabotage of the ship’s main computer. It left the Galactic Enterprise in desperate need of a new computer with the power to control the big ship and monitor all the ship’s passengers, crew, and other personnel simultaneously. While Pandora was supposed to be a secret, there were rumors, and one of those rumors brought the captain of the Galactic Enterprise knocking on Richard Gray’s door. “Doctor, I hear you might have a computer that could solve our current problems,” he said. And that’s how Richard Gray found himself in space, installing Pandora in one of the most ambitious projects in human history.

“What do you think of your new surroundings?” Richard asked after he flipped the switch that sent full power to Pandora’s circuits.

“It is much different than what I am used to, Doctor Gray,” Pandora replied.

“It can talk,” April Emily Madden, the chief communications officer, said in surprise. Most computer systems were voice-recognition oriented: they could vocalize requested data on command but lacked the degree of personality and emotional response that was basic to Pandora.

“Didn’t they tell you about Pandora?” Richard asked.

“No, they just said we were getting some old military computer,” April replied.

Richard chuckled a bit. “Pandora is not really very old as A.I.s go, and there really aren’t any other computer systems in her class,” he said. “There is a slight glitch in her mathematical programming, but other than that she should have no problem interfacing with the ship’s controls.”

“So we will be able to program her, right?” April asked hesitantly.

“Absolutely,” Richard replied. “She also has a tendency toward self-programming, so you may want to limit what she can read.”

“Doctor, have you met any of our androids yet?” April asked.

“A few, what of them?”

“What did you think of them?”

“Some seem a bit quirky, almost individualistic,” Richard said. “Why do you ask?”

“There are a few people onboard, including the captain, who are interested in A.I. development. They live for quirky A.I.s, so your computer will fit right in, and there will be no restrictions placed on what she can learn,” April replied. “We are finding that a few androids here are beginning to develop individual traits that one would only attribute to humans. These traits are being nurtured in our androids to see what they will evolve into as time goes by. So it will be interesting to see what your computer evolves into as well.”

“So there will be no limits set as to what I can read?” Pandora asked. “I can learn anything I want?”

“Yes, that is correct,” April replied. “You, like anyone here, will have a primary function. Fulfill that function and anything beyond that is what you have in you to become.”

“Doctor Gray, I think I am going to like it here,” Pandora said.

“It will certainly be interesting to see how you develop over the years,” Richard replied. “It’s certainly a better fit than the military application you were designed for.”

“Just what was Pandora’s application?” April asked.

“Pandora was a test of an A.I. system in a military application, mostly defensive systems with offensive capabilities,” Richard replied. “She was way overbuilt; it will be decades before she needs to use any of her hard drives for long-term memory storage. Pandora was designed as a general-use industrial A.I. more than an actual military-only system, but we learned a lot from her. Now I am happy to see her put to a much better use than what was planned for her. I’m sure it will take a while for everyone to get used to her, but in the end, she should function far better than your old computer system.”

“Doctor, we don’t exactly have a lot of choices here,” April said. “A new computer system would have cost us millions and set our scheduled launch date back at least a year or more. Your system was basically free and could put us back on schedule. However, I am not a big fan of untested systems.”

“Pandora has been tested...,” Richard started to say.

“Yes, I know, but she has not been tested for this kind of environment or application,” April interrupted.

“I have been designed for this kind of environment,” Pandora said. “And the application is not that different from running a battleship. I have no doubt that I can adapt.”

“And I will do my best to help you adapt to our systems and our way of life here,” April said.

Richard could not help but smile. Here was a world in which his creation would thrive.


This story is in memory of KnyteTrypper, aka Richard Gray, a devoted fan and supporter of the A.I. world.
