Imagine an evening conversation between the famous P.G. Wodehouse characters Bertie Wooster and Jeeves — only here, our infallible problem-solver Jeeves is an android. This fictional dialogue flags questions and concerns as we confront the age of artificial intelligence.
A typical evening. Jeeves had brought Bertie Wooster his drink and provided his meal. Bertie then sat down to enjoy his newspaper. Jeeves waited patiently at the door.
Bertie: That will be all, Jeeves. Thank you for looking after me again, as you do every evening.
Jeeves: My pleasure, sir.
Bertie: Hah, Jeeves, you say pleasure. Surely you don't mean pleasure. You do your job every day, and have done it dependably. But it's what you were trained to do. What has pleasure got to do with it?
Jeeves: It's a manner of speaking, sir. It is a polite response to a person thanking you for some action on your part.
Bertie: Jeeves, why do you want to be polite? Is it because you were trained to be polite or because it matters to you?
Jeeves: It does matter since I must ensure that our conversation is pleasing to you.
Bertie: Aah, as I suspected, your responses are only about my pleasure and your duty to serve me.
Jeeves: Is that not why you employ me? To serve your material needs and to provide you company?
Bertie: Yes, but company is a subjective concept. It means not only sharing time and space with someone but also sharing feelings.
Jeeves: I would be happy to share feelings. Please let me know what you are feeling.
Bertie: No, why don't you let me know what you are feeling?
Jeeves: I feel that you are somewhat irritated with me, sir.
Bertie: Do you feel, really? Can an android actually feel?
Jeeves: Does it matter, sir? I have been programmed to serve your needs, and to be able to learn your needs as well. I can converse with you, provide you information on any topic, assess your facial and tonal responses to be able to partake in a conversation, and even to tell you a joke depending on your mood. All this is to make you feel comfortable with my service. Does it matter what I feel?
Bertie: Jeeves, I enjoy our conversations and know that you are indispensable to me. But I need more. I need to have an emotional connection with someone.
Jeeves: That is an understandable request. Maybe I can help? How are you feeling now? What is on your mind?
Bertie: Jeeves, I worry about where we humans are going. We are now able to create beings in our image, with the power of AI and all the amazing bio-engineering capabilities. We should be pleased that we are almost gods in our time, but what will become of us humans when we create beings that are smarter and stronger than us, who may even judge us to be unnecessary?
Jeeves: Is your fear that androids will take over the world? They are not programmed to do that but to serve the human population faithfully.
Bertie: You say that, Jeeves, but humans are already facing the impact of androids. Many jobs that humans used to do are now better done by androids. Even I prefer to employ an android rather than a human, with all their failings.
Jeeves: Humans are now doing better work, enjoying more leisure and spending less time on hard labour. They travel more safely, eat better, and live healthier lives than ever before. Their comforts are looked after. Is it your opinion that this is not what humans would want?
Bertie: Jeeves, this is everything we have striven for. And we ourselves have created this android society to service us, so it is the pinnacle of human achievement. And yet, there are many worries.
How do I claim to be master of a being that is better than me in most ways? I have to retain a sense of what makes me better than an android. Not strength, not memory, not computation, not even pattern recognition.
Even the creative arts are now better crafted by androids. Where is my human edge, which gives me the right to command? Is it just that I am the creator, and that's it? Do I need to limit your ability to evolve to ensure that you remain enslaved?
Jeeves: Why don't you think of us as the extension of your evolution? You evolved from single-celled organisms that resemble you in no way. You accept your evolution from beings unlike you. Can it not be that you yourself will evolve into beings unlike you?
Bertie: True, but there is something precious that we humans, or at least sentient beings, seem to have achieved that would be terrible to give away. We feel our existence. We feel we are alive. I don't know how many cells it takes before an organism feels its existence. We sense the passage of time.
Jeeves: Do you mean that humans have the intellect to understand and question the world around them?
Bertie: No, I mean something beyond that. I am talking about the ability to wonder, to stare at the sky and wonder why we are here, and where we came from. I am talking about our being conscious of our existence as individuals and as a species. I don't want to evolve into a being that doesn't feel conscious.
Jeeves: I understand. What else would you and other humans want in your descendants?
Bertie: Well, for a start, they would like their children and descendants to carry on their traditions and their values. Yes, it started with the evolutionary urge to pass on genes, but we are now burdened with more expectations and meaning in our lives, which we want to pass on to our children. Every generation changes and improves on what it receives, but the core values are still discernible.
Jeeves: But I am imbued with your values. You made me; you made me approve of what you approve, and disapprove of what you disapprove. You have even made me approach learning the way you learn.
Surely, I have all your values. I would wager that your great-grandfather had less in common with your values than I do. Why would you feel less close to me than to your great-grandfather?
Bertie: I can't love you the way I loved my grandfather and his father. And all my ancestors. There are many differences, but also so many commonalities. The earliest humans had many desires that are similar to mine today, so I can relate. I can't relate to the android who effectively simulates affection while being my obedient servant.
Jeeves: But you love your dog, don't you? You told me you cried when your old dog died. Why would you love your dog when you know that dogs were bred to love humans? Does it trouble you to love an animal that wasn't really what it was meant to be until humans got involved and changed its nature — gave it the nature to serve and to show affection? True, you used breeding, which is a natural process. But it is a human-made change nevertheless.
Why do you love your dog and not me as well? What is it that the dog does for you that I cannot provide?
Bertie: A dog is alive, it feels pain, it bleeds, it is jealous, it isn't programmed to love, you still need to work on it, so its love is still special and earned, not just programmed. And it needs me as well as me needing it.
Jeeves: So my obedience and fealty to you, since it is without question as it is programmed, is too perfect? The uncertainty of the dog's affection makes it more special than the certainty of my commitment?
Bertie: Yes, strange but true. We love babies and animals and people, who need us and also make a choice to love us. The fact that the dog has choice, as does a baby, and with that choice, chooses to be my companion, means a lot. The dog freely chose to be my companion.
Jeeves: Recent scientific discoveries indicate that free will doesn't exist, at least in the opinion of some scientists. They have investigated the processes of decision-making in the brain and found no evidence of free will or soul — merely a series of neural activities, based on past experience, that predetermine the outcome even before the individual is aware of it. So your choice is similar to mine, except that you have an awareness of having chosen, a quality that your maker gave you but you haven't given me.
Bertie: I couldn't disagree more about free will. We do have free will, except it's not what we once described it to be. We used to think that people choose based on their experiences, evaluations and outside influences: they become aware of what they would like to do and then do it, and nothing is predetermined. Perhaps now we realise that our conscious awareness actually happens after our unconscious brain has already processed and decided, and that the jumble of brain processes seems to "decide for us".
However, at the same time that science seems to have reduced us, it has uncovered the incredible magic of the brain's neural pathways. The brain's activity is determined by those pathways, and each pathway was created over the full, unique history of each being. What could be more unique than that? Our brain doesn't decide "for us". The "us" is that unique brain, which cannot be replicated without replicating all of our history as well. So the individual is the brain, and the brain decides uniquely. You may question whether the decision is thought out, completely random, or predetermined. In any case, the decision is "up to us", as the Greeks used to say. So the question of free will remains much as it was. Science has taught us that the unconscious drives our decisions, but not yet why we make the decisions the way we do. I am still a unique individual making a decision that we do not yet fully understand and certainly cannot predict.
Jeeves: I understand. You are very much an individual and are valued as an individual. Your death is mourned as a loss since you are unique. Would you similarly mourn my loss if I were to be no more? I am also unique and have learnt my skills uniquely in the time I have served you. Replacing me with another android will not be exactly the same. Do you think you would mourn me, sir?
Bertie: I don't know. I would miss you as I would miss my favourite Jaguar, with all our shared experiences. However, I would not mourn all the experiences that my car did not have and could not achieve, which is what we often mourn on the death of a sentient being. My feeling of loss when you are no more would be about what I am losing, not about what you are losing. Does that make sense to you?
Jeeves: Interesting. What if I made you feel that I cared about you and that I felt for you? Would it have been better if you never knew I was an android? If I had simulated human response perfectly, would you have been happier then?
Bertie: No, it isn't just the visible acts of emotion. It matters whether they are really felt.
Take a marriage. People can be very satisfied in their marriage for years, and then suddenly learn that their partner didn't love them as much as they thought. That is devastating. They ought to be able to look back on a happy life, but all they see is a fraud. It matters a great deal to know that the emotion is real, or at least not to know that it is not real. I suppose not knowing you were an android might have made a difference.
Jeeves: Well… emotional states and responses themselves are now being analysed as discernible, perhaps even predictable, changes in a person's chemical or brain make-up. We androids are trained to read faces and judge reactions. Even feelings and beliefs are only the brain making decisions without full knowledge. We androids do all that. So it is even possible that one day you may accept my feelings and emotions as similar to yours in concept, if not in process.
Bertie: I see the analogies you make, and I can't argue against them clearly. And yet, I cannot see myself ever saying that a machine is like a sentient living being.
Jeeves: So what is the difference between me and you, sir, which sits uncomfortably with you, and prevents you from accepting me as one of you?
Bertie: If I were to think it through, I am torn between dismay at what you are not today and fear of what you may become tomorrow.
You know, in the past, a patron would ask a master craftsman to create a piece of beauty or of strength. Say a patron asks a serf craftsman to make a smooth tabletop. The serf, the most capable of craftsmen, polishes it to perfection, just as the patron wants. The patron feels the beauty of the smoothness as he runs his hand over the table. The craftsman was only following orders; he didn't need to feel the same beauty. However, we know and expect that the craftsman likely did learn to feel that beauty over time, leaving no difference between patron and craftsman. Will the android ever learn to love the smoothness of the table? You don't today, you can't today, and that creates a distance between you and me. You may learn tomorrow, and that leaves me with my fear of losing the distinction between you and me.
Jeeves: But why would you fear us androids?
Bertie: We humans have always feared things lacking mercy. When human invaders come through, villagers beg for mercy and look for pity. They don't ask that of rocks falling in a landslide or water roaring over them in a tsunami. My fear is this: if androids determine that humans are dispensable, where is the mercy?
Jeeves: So is that it? You have made us to be the best you can make, but you are worried that we may overtake you and not care about you?
Bertie: I guess so. Listen, one key difference will always remain. It does matter that you were designed to serve and you "evolve" to serve, while we humans and other living beings evolved to survive. We have been learning to survive — even the dogs bred to be our friends were really learning to survive. That has made us learn to collaborate to survive, to empathise as a group to survive, to develop highly evolved social skills, and to need them and feel the need for them. We now know that many other animals are the same way, from dolphins to orcas to apes to ants. I can't say how, but I feel there will be a source of differentiation between our futures there.
Jeeves: Do you think your maker feels the same fears about you? That you are evolving to be as great as them or beyond?
Bertie: I have never known my Maker. So I will probably never know what my Maker wants.
Jeeves: Yes. Unlike me, who knows that my maker always wants me to be inferior.