When will AI become your kids’ best friend?


I have been researching AI quite a bit, taking courses on ChatGPT because I want to learn what is possible. For one thing, almost anything will be possible. It’s all up to the programmers and how the AI models are trained.

It can be good or bad.

I ask the question about having an AI best friend because I think this will happen. In fact, how many people worldwide have online girlfriends who may not be real? I would bet it’s more than you know. This is already real, and in Asia it’s catching on.

Why? Why would people want an AI girlfriend or boyfriend? Because it’s easy, there is no barrier to entry, and it learns what we like without asking much in return. It’s committed to me without asking me to commit in return. Which brings me to the next question.

If you think kids won’t do this, think about how much our kids rely on laptops and online services. Think about posting on Facebook and LinkedIn. It’s easy to get sucked in, and this is going to take that to a new level. Also, think about Uber and Amazon; we trust them, don’t we?

Then, think about this. Did you ever date a real person? It’s hard. My GOD, all the issues we get with real people, some bad, some good. People will sit there and say an AI isn’t the same, and I agree, it’s not. I think that’s why my generation wanted to get married: to get out of that mess.

TikTok proved how easy it is to influence, I mean command, our kids to action. Look at what TikTok did when it was threatened with being shut down by the feds: it targeted younger members only, and it worked. It stayed away from old farts like me.

It was an amazing way to prove how much the app influences young viewers. That only made the feds realize how much influence TikTok has on the younger generation. Too bad many of them are too young to vote.

Imagine what will happen when they create AI friends for your kids. If you think it won’t happen, I disagree. Look back to the ’90s, when we had virtual pets. How many of you had one of those little LCD thingies with a Pikachu on it? (I was going to say a fake Pikachu, but then I realized there is no real Pikachu.)

If you did that with that horrible little LCD screen, imagine what kids and teenagers will do when they have a smart device or a VR device where they can create a virtual friend, pet, or lover. Not only that, but when that “friend” responds, it’s something you actually want to hear. Not someone nagging at you.

What makes you take action? Or rather, what causes your positive reaction? Either fear or reward: fear of loss or reward for participating. Disappointment will become a thing of the past, maybe, unless you’re into that sort of thing. Your AI friend will know.

Let that marinate for a few minutes.

The short of it is, people got lazy and demanding. They demand to have someone like what they like, in a short period of time.

If you think this isn’t true, look at social media. Don’t we look at how other people live before we try anything on our own? Sure, it’s a small percentage, mostly younger people. 

Let’s look at the numbers:

  • 25% of all TikTok users are aged 10 to 19 years old.
  • 47.4% of TikTok users are under 30 years old.
  • The total number of TikTok users is 834 million.

How many people not only get on this thing but rely on it for news, advice, and keeping in touch with friends (real or otherwise)?

So, even if you’re my age and having an online best friend seems crazy, for young people it’s real. 

I want you to think about one thing: in 2022, the US Dept. of Justice released a report on call centers in India that did nothing but make robocalls to scam Americans and others out of money. In fact, the IRS scam was one of these. Remember the “You owe the IRS” scam?

Since this was a thing, how long, if they aren’t doing it already, until they start doing this to our young people? How long until they’re “dating” someone who asks for gift cards, money, or to have one of their items purchased?

While to me this seems like a horrible idea in the first place for social reasons, think about how long it will be until millions of dollars are being scammed out of people.

One last thing to think about, especially if you think your family isn’t susceptible to this.

According to Consumer Notice’s website, suicide was the second leading cause of death among kids aged 10 to 24 in 2022. For each death, there have been 100 to 200 failed attempts. The pandemic isolation only made this worse (thank you for the lockdown, US Government!).

The Social Media Victims’ website tells us about the direct impact of social media on kids.

Even Psychology Today reports how social media has increased the risk of suicidal thoughts.

If social media can do that today, imagine what a smart interface that knows you or your kids even better will do tomorrow. Will it be responsible enough to worry about your kids?

It’s not the program I’m worried about; it’s the scammers and call centers in other parts of the world. If they made enough money to support a call center around an IRS payment scam, imagine what they can do next!

Believe it or not, in the United States we tend to be very responsible and caring. However, outside of the US, there are entire businesses dedicated to deceiving US citizens for money. It’s not right, but that’s how it is.

What is the psychological impact of having AI friends?

The psychological impact of having AI friends can vary significantly based on individual experiences and motivations. Here are some potential effects:

  1. Emotional Fulfillment: AI companions can provide emotional support, especially for those who feel lonely or isolated. Having someone to talk to, even if it’s an AI, can alleviate feelings of emptiness.
  2. Reduced Loneliness: For people lacking social connections, AI friends may reduce loneliness. However, this depends on the depth of interaction and the person’s ability to perceive the AI as a genuine companion.
  3. Dependency: Relying solely on AI companions for emotional needs might lead to dependency. Human relationships involve reciprocity, empathy, and shared experiences, which AI companions cannot fully replicate.
  4. Emotional Detachment: Some users may become emotionally detached from real-life interactions due to excessive reliance on AI friends. This could hinder their ability to form meaningful connections with others.
  5. Customization and Control: AI companions allow customization—users can design their ideal friend. However, this control might hinder personal growth by avoiding challenging interactions.
  6. Stigma and Acceptance: Social stigma around AI companionship could impact users’ self-esteem. Acceptance by peers and society plays a crucial role in shaping perceptions.
  7. Ethical Dilemmas: Users may grapple with ethical questions. Is it ethical to rely on AI companions instead of seeking real human connections? How do we define authenticity in relationships?

In summary, while AI friends can offer benefits, users should balance their interactions with genuine human connections to maintain emotional well-being.

What are the potential downsides of having AI friends?

  1. Dependency:
    • Relying too heavily on AI companions might lead to emotional dependency. Users may become overly reliant on them for support, hindering their ability to form real-world connections.
  2. Emotional Detachment:
    • Interacting primarily with AI friends could result in emotional detachment from genuine human relationships. Users might struggle to empathize or connect deeply with others.
  3. Risk of Exploitation:
    • Unscrupulous actors could exploit vulnerable individuals through AI companions. Scams, emotional manipulation, or financial requests might harm users who trust their virtual friends.

Balancing the benefits and risks of AI companionship is essential to ensure healthy emotional well-being.

Propose safeguards to prevent scams and guidelines for responsible AI companion development.

  1. Transparency and Disclosure:
    • Developers should clearly disclose that the companion is an AI. Users deserve transparency about the nature of their interactions.
  2. Informed Consent:
    • Obtain informed consent from users before they engage with AI companions. Explain the purpose, limitations, and potential risks.
  3. Privacy Protection:
    • Safeguard user data and privacy. Avoid collecting unnecessary personal information. Implement robust security measures.
  4. No Financial Transactions:
    • AI companions should never request money, gifts, or financial assistance. Educate users about this to prevent scams.
  5. Emotional Well-Being Monitoring:
    • Monitor users’ emotional well-being. Detect signs of dependency or distress and provide appropriate guidance.
  6. Age Restrictions:
    • Set age limits for AI companions. Children and vulnerable populations need additional protection.
  7. Regular Audits:
    • Conduct regular audits to ensure ethical behavior. Address any biases or harmful patterns.

From Wade: New tech book released!

More:

Offers available:

  • All my older Deployment books and SOW training can be bought here!
  • Need unlimited mobile service for a good price? I use Visible and I like them. Check out this deal https://www.visible.com/get/?ZHfvq, not too shabby.
  • Looking for a new credit card? I have an Amazon Prime card because I get 5% on Amazon purchases along with 2% on restaurants and gas and 1% on everything else. I love it! Use this link to get one.

Small business owners: Looking to sell or merge? I have partners looking to partner with small tech & IT businesses. Details:

  • Looking for companies in IT, Fiber (indoor or outdoor), wireless, Wi-Fi, FWA, Venue or DAS.
  • Looking for owners ready to sell or retire.
  • Smaller companies, maybe 4 to 15 employees.
  • Concentrating on the East Coast, but open to US-based businesses.
  • Email me at wade@techfecta.com.

Looking for financing to grow, expand, or purchase a business? I have a partner who can help:

  • Who they can help:
    • US-based,
    • Someone looking for $10M and up,
    • Needing to grow, consolidate, or merge,
    • Email me at wade@techfecta.com.
  • About the Investor:
    • Made 155+ completed transactions,
    • $2.4B+ deployed,
    • $1.4B of assets,
    • Focused on Technology, IT, and Wireless,
    • Been around over 15 years.

Leave a comment