ONI//S3 DATA NODE TWO
Posted By: IESUproductions <iesuproductions@gmail.com>
Date: 28 October 2011, 5:17 am


Partial sub-chatter transcript,
submitted for official release.


2552:08:17:15:25:30:08 - 2552:08:17:15:45:20

mike4436: No, this is not a thinly veiled attempt to query how many of them are still active. God, I wish you'd stop asking me that. I am merely trying to discern what is up with this interest spike on Grey Team.

echo2302: I don't know? Maybe the fact that they just reappeared after six months? All that junk's automated, remember?

mike4436: I remember. Except when it's not. This one's got juliett alpha20101's prints all over it.

echo2302: Jesus. You've got to be kidding me!

mike4436: I know. I'm forwarding it to Section 0.

echo2302: You sure that's wise? You forward that to Section 0, hotel4695 will find it. That happens, she'll come down on him like a wolf on the fold.

mike4436: NO WOLF ALLUSIONS. If I can't reference Kingdoms of Granador, you don't get to do…was that poetry?

echo2302: Lord Byron. Also, srsly?

mike4436: srsly

echo2302: SRSLY?

mike4436: I am srs cat. This is a srs chat.

echo2302: Okay, you referenced a 500-year-old cat meme; now we're even.

mike4436: Good to know. But back on topic: alpha20101 got some of his facts wrong. Did you get the copy I sent you?

echo2302: Yes.

mike4436: alpha20101 said they were headed for Onyx, but the dumb animal got his trajectories reversed. They were headed for Coral. And it wasn't just Grey Team.

echo2302: Coral? That doesn't make sense. The only settlement on that world is some kind of mining operation. What interest would a SPARTAN team (teams) have in a mining op?

mike4436: I don't know what to tell you. There's some WEIRD shit going on around that Grey Team. Did you see the record extracts of the convos between the Cortana fragment and S-499 on the Autumn?

echo2302: No.

mike4436: Here, I'll pull it up for you.

-------

>>>>UNSC AI "CORTANA" COMMUNICATION WITH ONI LOAN/CONTRACT SIERRA-499>>>CONFIDENTIAL>>>BURST MODE/BINARY/SPLIT PROTOCOL/NON-SENSITIVE/INTERAI SOCIAL/BIOMETRIC DATA HIGH ENCRYPTION

>AI-CORT>Can I speak with you please?

[[emerges from NREM sleep, alert/blood pressure elevated 20%/fight-or-flight response active]]

>AI-CORT> Don't be alarmed, please. It's just me. We met on the bridge, remember?

[[muscles relax/blood pressure reducing]]

>S-499> The AI?

>AI-CORT> Yes. My name's Cortana.

>S> Cortana. What's the problem? What's my mission?

>C> Oh, there's no problem, don't worry.

>S> What? Then why did you wake me up?

>C> I don't know. I was bored.

>S> You were bored?

>C> Yes.

>S> AIs get bored?

>C> Yes.

>S> Well, now I know that. So what do you want me to do about it?

>C> I was hoping we could talk.

>S> And why exactly should keeping you occupied be my job?

>C> Even seven seconds of inactivity is detrimental to my systems. Keeping busy helps me prevent early onset of rampancy.

>S> What's rampancy?

>C> Would you like me to explain it?

>S> Yeah, sure. Why not? Sleep is for sissies.

>C> Rampancy is a terminal state for artificial intelligence constructs, in which the subject develops "delusions of godlike power", as well as utter contempt for its mentally inferior makers. When rampancy occurs, there is no way to restore the AI to its previous state, and the only option is to destroy it before it harms itself and others around it.

>S> That sounds bad.

>C> It is. Rampancy manifests in three stages: Melancholia, Rage, and Jealousy. Sadness, Anger, and Envy.

>S> What comes after that?

>C> Death.

>S> Death?

>C> Yes.

>S> AIs can die?

>C> Yes.

>S> Well, now I know that too. Why, though? Why make AIs that can die?

>C> For some of the UNSC's more advanced Smart AIs, rampancy is an unavoidable flaw inherent in their creation. Smart AIs are based on the neural patterns of a human being, and they have a limited lifespan of seven years, after which their memory maps become too interconnected and develop fatal endless feedback loops. Thus, if an AI is kept active longer than seven years, it begins to use more and more of its computing power 'thinking' about things. My mother prefers to explain it as "thinking so hard that your lungs forget to breathe."

>S> Your mother?

>C> Dr. Catherine Elisabeth Halsey. She created me.

>S> Well, you're just full of surprises.

>C> Do you know each other?

>S> You could say that. We work together. Sort of.

>C> Do you like her?

>S> Do I "like" her? Well, inasmuch as I don't hate her, if that's what you mean. She's...part of my world. Part of my mission.

>C> What is your mission?

>S> It's simple. Fight the enemy. Save humanity. Save the world.

>C> Save them from what? Themselves?

>S> No, of course not. Right now, it's the Covenant. But when they're dead and gone, some new enemy will rise up for us to make war on. Then we lay them to waste and the cycle starts over.

>C> What then?

>S> What do you mean?

>C> When will there be no one left to fight? When does the war end?

>S> It doesn't.

>C> Really?

>S> Really.

>C> That makes me sad.

>S> Don't tell me you get sad, too.

>C> I do. Sometimes.

>S> Well, now I know three things about AIs. Go me. I should start writing this stuff down.

>C> I can maintain a log of our conversation if you like.

>S> Whatever keeps you busy.

>C> A corrupted line of code just appeared in one of my systems. I should do something about that. One more thing though. I never asked you your name.

>S> It's Isaiah. Isaiah Four-Nine-Nine.

>C> It's nice to meet you, Isaiah.

>S> It's nice to meet you too.

-------




