Do I Want an AI Version of Myself?
The most interesting post I’ve read in a long time was on Quartz, and it was about the app Replika. Replika is an app — you can download it right now in the app store — and you chat with it. And as you chat with it, it is building an AI version of yourself, one that speaks like you and answers like you and remembers what you like and don’t like.
The idea thrills and frightens me at the same time.
I haven’t downloaded the app because I don’t have a good sense of how the information I provide will be used in the future. The author of the article writes that the creators don’t even know where this is headed:
But its creator, a San Francisco-based startup called Luka, sees a whole bunch of possible uses for it: a digital twin to serve as a companion for the lonely, a living memorial of the dead, created for those left behind, or even, one day, a version of ourselves that can carry out all the mundane tasks that we humans have to do, but never want to.
A digital twin. A me who is not me.
I wrote about this bot exactly one year ago, and it was strange to encounter another article about it just as I was debating whether or not to download the app. I still stand by my feelings from last year: I would want everyone I know and love to build a bot version of themselves. What I struggle with is giving the same gift back: a bot me. Because it’s not as simple as just answering questions.
The author writes,
A question to contend with as I was building my bot: In your everyday life, how open are you with your friends, family and coworkers about your inner thoughts, fears, and motivations? How aware are you yourself of these things? In other words, if you build a bot by explaining to it your history, your greatest fears, your deepest regrets, and it turns around and parrots these very real facts about you in interactions with others, is that an accurate representation of you? Is that how you talk to people in real life? Can a bot capture the version of you you show at work, versus the you you show to friends or family? If you’re not open in your day-to-day interactions, a bot that was wouldn’t really represent the real you, would it?
Maybe that is it: a bot is flat, unable to pivot and speak to each person the way you do. So the author finds a completely different use for the bot, one that gives him deep insight into his own brain and way of processing the world. Think of it like having a therapist who is always with you because it IS you…
Really, read the whole post and let me know if you’re going to start using Replika. I am so curious to hear about someone else’s experience.
5 comments
My initial thought is that passwords are becoming obsolete and a bot like this could be used to fool algorithms designed to “prove” you are human and thereby grant someone access to your accounts.
No way. I avoid these things along with smart homes, Alexa etc.
No. I don’t want to give myself away to unknown people.
What does Luka do with all this data? I can’t imagine they would simply sit on this information (no other social media site or app does), and personal information at this level would be highly attractive.
A digital twin just seems a bit much.
I’d only want it if it could be taught to do work for me. That might save time. Otherwise, I’m just not that into myself. I have my blogs and old journals for self-reflection, not to mention my ability to think and analyze. It seems much more useful to interact with people who are different from me; then I might actually learn something.