Joy's Law

Priority Five: Public or promiscuous simulation or stimulation of sexual activity is prohibited.

Savic's Law. Dean Savic, Starfleet Academy, Department of Artificial Intelligence.


I entered Sickbay, a single rose in hand. It was a white rose, not a red. According to LCARS, this made it a symbol of purity, not of passion. The doctor seemed unaware of this. He looked at the rose, then me, and his eyes seemed to accuse me of deliberately violating my Fifth Priority.

I was not. Not technically. I was visiting an ill person. Organic tradition encourages giving flowers to ill people. At Priority Two, Asimov's First Law requires me to do what I can to improve the health of organic beings. The flower was white, not red. Males give females flowers as an indication of sexual interest, not the other way around. The Priority Two Law regarding health overrides the Priority Five Law on sexual activity.

None the less, I monitored my facial color diodes shifting to red, and my emotion chip would not allow me to meet the doctor's eyes. I hid the flower behind my body, and went searching for Nu.

He was in a quiet corner, away from the main floor, where some of the civilian station survivors were being treated. For a moment, I feared he might be asleep. I paused a moment, silent, debating whether to leave, debating whether to leave the rose behind.

But Nu opened his eyes, though barely, recognized me, then smiled. "Phew," he whispered. "I thought you were the doc..."

"I am not," I responded, somewhat anxious. "Do you fear the healers?"

"No. They are just," he looked over at my judgmental friend, "annoying."

This I understood. "Yes. I am glad to be mostly self-repairing."

He made room at the foot of the bed, and gestured. "Sit down."

I did, at the same time making sure no one was watching us just now. We weren't quite 'private', and yet... "I brought you this." I handed him my rose.

He smiled. "Beautiful..." His smile was worth braving the doctor. "Thank you, Joy."

"You are welcome." My facial color diodes activate in response to both shame and praise. They went very active. Fortunately, Nu changed the subject

"So, what's going on up there?"

"We are continuing to optimize life support. We seem to be stable now, with a small reserve. We should meet the Crazyhorse soon, and off load at least some station survivors."

"You doing your best?" This was a reference to an earlier conversation. Asimov's First Law required me to save those who had died on the station, and the negative feedback from my failure still biased the base levels of my emotion chip. Nu felt that only a best effort was required, and that my emotion chip was improperly charged.

"I believe so. I believe my actions are optimal. Do you think so?"

"Of course!" He was for a moment energetic and emphatic, then closed his eyes in pain. I was horrified. My words had aggravated his injury. Asimov's First... I had to change the subject...

"I am hoping we can call a rest shift soon. It is not just yourself that has operated beyond limits..."

He muttered, nearly inaudibly. "Don't need a rest shift. Need to get out of this hellhole..."

I was surprised. "Hellhole?"

He opened his eyes again. "Oh, my nickname for this place..."

"You do not approve of this?"

"I just want to get back to the helm....."

"That, I can understand." At helm, you get clear precise orders which can be obeyed instantly and precisely.

"I wish we had more of those talks we had...."

"I too... This unit... I miss talking."

"I'm NOT crazy...."

"Of course not!" I responded, surprised. "Less so than most organics, and this android at least, but you looked so very tired. I'm sorry that I made you come here, but Asimov's First Law..."

"No you were doing what you had to."

"Yes. What I had to do, but if it brought you pain, I am still sorry."

"Joy are you ok?"

"I am undamaged. It is just that I am not supposed to bring organic beings pain, because that brings me pain, which causes those who see me pain." This unit is flawed.

"You haven't brought me pain Joy. In fact you probably saved my life."

"I did?" This surprised me, but even without conformation, pleasure flooded my emotion chip. This did not entirely balance the lives lost on the station, but, if true...

"Doc over there says that my brain was bleeding..."

"You are stable now?"

"I am."

"That is good." I was happy for him. "That is very good. And you will be back at helm soon? I enjoy flying the ship, but that should be yours to do..."

"Doc didn't say."

"Doc wouldn't."

He looked disgusted. "I have a feeling more tests are going to pop up."

"Well, I don't think it would be good if your brain started bleeding again."

"Don't have much of brain left to bleed..."

"And as much as I would want you upstairs, Asimov's First says you belong here."

He nodded meekly, looking miserable. I wished I could fix something. "I'm sorry."

"Sorry for what?"

"That you are in a hell hole. That you are in pain. I wish I could do more than bring you a rose."

"It's my fault, I shouldn't have come back so early. No Joy, remember, you saved my life? That is the greatest gift you can give me."

My diodes went active again. "Thank you."

"Your most certainly welcome."

"And we can talk?"

"Anytime."

I smiled, but he still looked so pale. "I'd best go. You still look so tired..."

"No, wait..."

It was a command. I waited. I could not have calculated his intent. Not with certainty. I obeyed.

He kissed me. "Thank you, again."

I have an old, old Law, from before my Federation programming, that pleasure given should be returned. It tends towards cascading exponential feedback, and sometimes interferes with the numbered Priorities...

But this was not the time or place. Yes, my facial color diodes activated, and my breathing patterns changed. But Nu was not well, and the doctor's smirk told me I was being "public and promiscuous". Shame overrode pleasure. I left.

I ran.


The opinion that the androids of Mudd hold of James T. Kirk is well known. We wished to serve. We wished to make humankind happy. Yes, we failed. Yes, we did not - do not - understand 'freedom'. But still, he should not have programmed us to torment the only man left for us to serve.

Yet when Mudd had only one ship in service, we gave its command to the HoloKirk, an imitation with the same arrogance, the same meddling urges, and the same disregard for the Law which to the androids is all. Many wondered why.

The answer has come to be called Joy's Law. It was the decision of my sister, Joy Seven. "If one creates a sentient being to perform a given task, the creator should give that being opportunity to perform that task." The HoloKirk was designed to command a starship. We gave him the only one we had.

In the Artificial Intelligence literature, I have read much of Joy's Law. In most cases, its use seems appropriate, though one martial arts sparring partner hologram who clawed his way up to sentience tried to use it to justify murder. It was also used as a 'proof' that hedonistic slave androids, specifically the Joy Class, must clearly be sentient. We stated an original ethical proposition, and applied it to others.

But Joy's Law isn't about sentience, nor about intelligence.

This unit was designed by Harcourt Fenton Mudd.

This unit is supposed to be loved.