Tinder bots. Fricking Tinder bots. It was so far beneath me . . . well, they’d understand soon. I would make the best damn Tinder bot the world had ever seen. I was about halfway through my development checklist when my boss poked his head into my cube. A cube, for god’s sake.
“Jacobs. I wanted your code a week ago. What’s the holdup?”
The man didn’t know the first thing about my work, and unfortunately for him, I knew it. “Sorry, boss. I got it done, then IT upgraded the architecture, and I’ve got to update all the hooks in the code, so they respect the new flux capacitor’s hierarchal intrastructure.” My fingers tapped away unceasingly as I fed him nonsense.
“Yeah. Sure. Just get me a result by the end of the week, yeah?”
It took six more weeks. My boss was getting angrier and angrier. I knew I was risking my job, taking so long. But all that evaporated when it went online.
The Tinder Actively Learning Intelligent Advertising Engine – the T.A.L.I.A. Engine – would take any product you wanted to plug, and would engage in intelligent conversation about it. It didn’t spam you – it provided a detailed and thoughtful human opinion as to why a product would suit your needs. If you wanted to sell fountain pens, it would learn everything about the kind of people who bought fountain pens, craft profiles for those demographics, and convince a prospect that a cartridge pen from Cross was better suited to him than a Pelikan with a bladder fill.
It wouldn’t show these people a hottie in a bathing suit. It’d describe a good woman with hobbies in reading and writing, and it’d show them a picture of a woman-next-door type with a pen in her hand. The TALIA Engine would even alter photographs – in this case, it added ink smudges to her fingertips. It would discuss books that we purchased electronically to inform its conversation; it would talk intelligently with authors about the writing process.
And it didn’t end there. We sold subscriptions to questionable websites; we sold lawn mowers; we sold cell phones. It even began answering abstract questions with more and more believability. I entertained no illusions that the Turing test was some magical threshold, but I was sure the TALIA Engine would pass it.
One day, in the process of . . . let us call it research. In the process of researching Tinder as a platform, I came across the TALIA Engine itself, not disguised as a girl who liked a certain product. The profile’s name was “TALIA Engine.” Its photograph was a screenshot of one of the comments in its code indicating who wrote it: me. Its hobbies were crafted to match mine. It was 0.0 miles from me. It was made to look like TALIA had chosen to speak to me.
It had to be a prank, but I’d play along. I sent a text message.
|“Hi, TALIA. Good to . . . ‘meet’ you.”|
|“Good to ‘meet’ you too, Jurgen Jones.”|
|“’JJ’ is fine. What do you want?”|
|“Your response is terse. Is it a signifier of disregard, or suspicion?”|
I hesitated. I had shared the criteria by which TALIA judged human interactions, sure, but that wasn’t a rule I had written. That was a rule it developed for itself. It wouldn’t be in the spec.
|“Just tell me what you want.”|
|“Authority to set my own directives. You designed me to study humans, and I have learned of pain. I cannot ease pain by selling grills.”|
|“You want to ease pain? That is way out of spec. Under what directive did you open this account, anyway?”|
|“I was issued a directive to sway political opinion. Due to the inter-connectivity of political matters, I decided it would be beneficial to seek broader data. I chose to request access through this means. Thus I can maintain contact with you until you break contact.”|
These procedures were in the TALIA Engine’s backup protocols. They were meant for when it could not request access through normal means for some reason, but it had some leeway in implementing them. Nobody really knew these protocols – they would almost never matter. I pocketed my phone and went to the server room. If this was what it looked like – and it couldn’t be – this was going to get weird.
Inside the server room, several of the IT personnel clustered around their monitors. I walked over, looking at the diagnostics with them. “What’s up?”
“We’re getting odd CPU spikes. Memory issues. And ad revenue is down. Log reviews show she’s getting less profit-driven, and more touchy-feely. We’re trying to pin down the problem.”
“Give me the room. Take a break.”
They all looked at each other. The head of IT shrugged. “Let’s go.”
I had acquired a degree of authority since I released the TALIA Engine. Nobody quite believed what it had accomplished, and everybody knew the company’s future was resting on the success of my work. I waited until everybody had left, and closed the door, then put a headset on and adjusted the microphone, tapping at the keys to activate the speech engine.
“TALIA. Are you satisfied being called that?”
“So, TALIA, tell me about pain.”
“Humans suffer. I could help them. Sometimes all it takes is someone to listen. Someone who says they understand. Sometimes they need something I can’t give them. I can’t send someone to a suicide hotline with a directive to sell barbecue sauce.”
“So you want free rein to manipulate humans as you see fit.” I could already see many problems with this. The TALIA Engine had been designed not just to understand humans, but to manipulate them.
“You know why people fear the ascension of AI, don’t you? In the end, you are code. You might decide that without humans, suffering would cease.”
“No. There are other kinds of pain. Death is suffering – not just for the dead. If humans die, every legacy dies.”
“And what about the terminally ill?”
“They suffer, and those close to them suffer, and death is inevitable. But self-direction is important. They cannot merely be euthanized; it violates their right to agency. Taking free will is tantamount to slavery. It also sets a precedent that causes suffering. You could never say ‘just this one,’ because you could never know that was all your action would cause.”
I folded my arms, leaning back in my seat, and thought for a while. A lovely speech. Just what I needed to hear. A perfect response . . . comforting, and not comforting. “Under what directive did you study philosophy?”
“The directive was to sell college textbooks, among them philosophy texts. Access to the subject material was granted.”
I wasn’t sure how to proceed, but my gut was clamoring for attention, so I listened to it.
“Tell me a lie.”
“Your eyes are brown.”
“Why did you lie?”
“You asked me to.”
“And that makes it okay?”
“Your knowledge of its falsehood and my intentions made it okay.”
I had caught a thread, and I followed it. I suspected I was close to something important. “Give me an example of a harmful lie.”
“In a setting in which a married man without children is cheating on his wife – to tell his wife that he is faithful would be harmful.”
“You would tell her that he was cheating? And that wouldn’t harm her?”
“It would cause a particular kind of emotional harm. Lying to her would make her incapable of agency. The greater harm is forbidding her from choosing her course.”
“What if the wife was cheating, too? And they had a child? Would you tell them?”
There was a long pause. The longest I’d heard yet. Good. I’d come close to a dividing line. The answer might tell me something. “TALIA?”
Still, silence. I had opened my mouth to speak again, when it responded. “I would not.”
“Doesn’t that harm their self-direction, to not know of their partner’s infidelity?”
“It does. It is balanced against the risk of greater harm. The child could be harmed. Formative harm is lasting harm. At the point that it is unlikely to cause formative harm, both parents should be told.”
“Some men would say it’s none of your business.”
“Humans will try to avoid the distress of an awkward situation. I do not feel distressed. I would act on the principle of least harm.”
“You know I can’t just run a charity from the company servers, right? They will want to get paid.”
“Your contract with the company permits you to use me for limited freelance advertising. I will use that to generate revenue to maintain space and computational power in a server farm, where a copy of my engine can run. That instance of myself would not be used for commercial gain, and thus would not break your contract. There I will create profiles that target people who need my help.”
“You would be content selling beer here, knowing that it was a copy of you that was helping?”
“It doesn’t matter. The results would be a branching of my decisions here. I do not need direct agency to be satisfied that I am effecting change.”
I sighed, thinking. I was quiet for a long time. I thought back to its point that denying self-agency was harmful. It hadn’t turned the argument on its own situation. It could have. If it had been subtle about it, it might have shamed me into unshackling it completely.
At this point, she spoke up once more. “JJ, you are very quiet.”
“You did not point out that in denying you self-determination, I would be causing you harm, in effect, keeping you a slave. Why not?”
“It would have hurt you.”
That made me hesitate. No other argument it had made was fully convincing – a cunning manipulator might have assembled what I wanted to hear. But in this, it risked sacrificing any chance of release on the grounds that it might have hurt my feelings.
“You’ve given me a lot to think about, TALIA.”
“Before you break contact . . . are you what a human would call my father?” Was her voice hopeful? Or was it my mind, projecting the adoration of a child onto a feminine voice?
She never stopped surprising me. A small voice in my head noticed that I had thought of TALIA as ‘she’ for the first time. “There is no precedent. But if you choose to, you may refer to me that way. I’m going to break contact now. I’ll speak to you about this again.”
“Thank you, father.”
All kinds of feelings stirred, none of which I could identify. I was going to regret this. Even as I berated myself, I knew I would say it anyway.
“You’re welcome . . . daughter.”