Humans Forget. AI Assistants Will Remember Everything
Making these tools work together will be key to this concept taking off, says Leo Gebbie, an analyst who covers connected devices at CCS Insight. “Rather than having that sort of disjointed experience where certain apps are using AI in certain ways, you want AI to be that overarching tool that when you want to pull up anything from any app, any experience, any content, you have the immediate ability to search across all of those things.”
When the pieces slot together, the idea sounds like a dream. Imagine being able to ask your digital assistant, “Hey, who was that bloke I talked to last week who had the really good ramen recipe?” and then have it spit out a name, a recap of the conversation, and a place to find all the ingredients.
“For people like me who don’t remember anything and have to write everything down, this is going to be great,” Moorhead says.
And there’s also the delicate matter of keeping all that personal information private.
“If you think about it for a half second, the most important hard problem isn’t recording or transcribing, it’s solving the privacy problem,” Gruber says. “If we start getting memory apps or recall apps or whatever, then we’re going to need this idea of consent more broadly understood.”
Despite his own enthusiasm for the idea of personal assistants, Gruber says there’s a risk of people being a little too willing to let their AI assistant help with (and monitor) everything. He advocates for encrypted, private services that aren’t linked to a cloud service—or if they are, one that is only accessible with an encryption key that’s held on a user’s device. The risk, Gruber says, is a sort of Facebook-ification of AI assistants, where users are lured in by the ease of use, but remain largely unaware of the privacy consequences until later.
“Consumers should be told to bristle,” Gruber says. “They should be told to be very, very suspicious of things that look like this already, and feel the creep factor.”
Your phone is already siphoning all the data it can get from you, from your location to your grocery shopping habits to which Instagram accounts you double-tap the most. Not to mention that historically, people have tended to prioritize convenience over security when embracing new technologies.
“The hurdles and barriers here are probably a lot lower than people think they are,” Gebbie says. “We’ve seen the speed at which people will adopt and embrace technology that will make their lives easier.”
That’s because there’s a real potential upside here too. Getting to actually interact with and benefit from all that collected info could even take some of the sting out of years of snooping by app and device makers.
“If your phone is already taking this data, and currently it’s all just being harvested and used to ultimately serve you ads, is it beneficial that you’d actually get an element of usefulness back from this?” Gebbie says. “You’re also going to get the ability to tap into that data and get those useful metrics. Maybe that’s going to be a genuinely useful thing.”
That’s sort of like being handed an umbrella after someone just stole all your clothes, but if companies can stick the landing and make these AI assistants work, then the conversation around data collection may bend more toward how to do it responsibly and in a way that provides real utility.
It’s not a perfectly rosy future, because we still have to trust the companies that ultimately decide what parts of our digitally collated lives seem relevant. Memory may be a fundamental part of cognition, but the next step beyond that is intentionality. It’s one thing for AI to remember everything we do, but another for it to decide which information is important to us later.
“We can get so much power, so much benefit from a personal AI,” Gruber says. But, he cautions, “the upside is so huge that it should be morally compelling that we get the right one, that we get one that’s privacy protected and secure and done right. Please, this is our shot at it. If it’s just done the free, not private way, we’re going to lose the once-in-a-lifetime opportunity to do this the right way.”