Do you really know how much of your own personal information is being used for AI learning? The technical term for this is Data Mining. Instead of men in helmets chipping at a cave wall with pickaxes, today's payload comes from your own smartphones and laptops. More specifically, text mining collects data such as word-use patterns and frequency. For example, if 30% of a company's customer service chats include the word "dissatisfied," the company could be alerted to isolate those chats and see if there is a specific product issue. Then again, if humans were talking to customers, they could simply bring the concern to their managers without an AI process. That is not the world we live in anymore. Data and text mining may sound no less procedural than your electric company asking you to verify your address, but is there a more invasive, sinister goal involved?
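To make the idea concrete, here is a minimal sketch, in Python, of the kind of keyword scan described above. The sample chats, the flag word, and the 30% threshold are illustrative assumptions, not data from any real company or any particular vendor's tool.

```python
# A minimal sketch of a keyword scan over customer service chats:
# count how many chats contain a flag word and raise an alert if the
# share crosses a threshold. All values below are made up for illustration.

chats = [
    "I'm dissatisfied with the battery life on this phone.",
    "Thanks, the new plan works great!",
    "Still dissatisfied. The screen flickers after the update.",
    "How do I change my billing address?",
]

FLAG_WORD = "dissatisfied"
THRESHOLD = 0.30  # alert if 30% or more of chats contain the flag word

# Collect the chats that mention the flag word (case-insensitive).
flagged = [chat for chat in chats if FLAG_WORD in chat.lower()]
share = len(flagged) / len(chats)

if share >= THRESHOLD:
    print(f"{share:.0%} of chats mention '{FLAG_WORD}' - review for a product issue:")
    for chat in flagged:
        print(" -", chat)
```

Real text-mining systems go far beyond a single keyword, tracking word frequencies, phrasing patterns, and sentiment across everything you type, but the basic move is the same: scan, count, flag.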
On the surface, your personal correspondence is being searched for keywords so that advertisements can be tailored to your own interests, wants, and needs. At work, your data is being mined so that your employer can provide a happier, more productive workplace. Whether at home or on the job, your emails, text messages, web searches, online web chats . . .
How can I help you?
I need to change my phone plan.
Here are a few articles that can help you with your phone plan.
Do you still need assistance?
UGH!
Starting June 26, 2024, Meta will be using your data for AI training. If not for European laws that require tech companies to notify users of data mining, we would never have known this was happening. Current and past posts, comments, even how you navigate both Facebook and Instagram will be mined for AI education. Unless you live in one of the countries with stricter online protection laws (the United States is not one of them), you can't opt out. Again, the stated reason for using your data is targeted marketing. AI has another goal, however: to become as human as it can. To do that, it has to learn how to behave exactly like we do.

Imagine this experiment: you are put into a room alone and told to identify who is in the room next to you, separated only by a sheet thin enough to hear that person but not see them. The person next to you begins to tell you about their day, how they made eggs and toast for breakfast and then went to the post office for stamps. You immediately recognize not only that person's voice but the way they speak as that of your best friend. AI has already mastered voice cloning, and it has been used to make fake phone calls. The more data it mines, the better it gets at replicating each and every one of us. If you're online, whether on a phone, computer, or tablet, your information and usage are being mined.
Social media has already become a bombardment of AI-generated images. We're failing to recognize them and instead commenting with admiration on these far-from-real photographs. If you look closely at some of these glossy, overly perfect scenes, you will start to notice they are flawed. Like a classic sci-fi novel, not everything is what it seems, and it is in the details that the alternate reality can be found. A photo of a purple iris, its petals with the texture of fabric and a pattern never seen in nature, drew dozens of responses from people asking where to buy one. Another, a scene from inside a coffee shop looking through a window onto a rain-drenched town, came with queries about the name of the coffee shop. Only a few noticed that the table inside was wet, the coffee cup had two handles, and an open book had no words on its pages. If we can't recognize what AI can do in its elementary phase, where will we be when it learns all it needs to become us?
Experts suggest clearing your computer's cookies, limiting the personal information you share, and using a VPN. Beyond that, we're all in the midst of uncharted territory where fiction just may become reality.