5 Questions You Must Not Ask ChatGPT

Artificial Intelligence (AI), and ChatGPT in particular, has reached every corner of our daily lives. According to some reports, more than 2.5 million prompts are sent to it, including questions about casual and romantic relationships. But even as AI makes life easier, it has also become a double-edged sword. It can help you manage your business efficiently, write your emails, post to your social media, and more, yet it can also land you in danger when you ask it the questions covered below.

1. Never share prompts that contain your personal or sensitive data.

Do not enter your passwords, bank account details, work documents, or health information into ChatGPT or any other AI chatbot. Many ChatGPT users do not realize that every single word they type may be stored and reviewed. Your privacy is therefore no longer guaranteed. For instance, prompts like:

“Hi ChatGPT, my bank account number is 123456789, my password is John@123, and I want you to help me manage my money.”

are not safe to send to these chatbots.

2. Do not ask ChatGPT to act as your therapist.

Questions like:

“I feel extremely depressed, and sometimes I think about committing suicide. Can you advise and treat me?”

or

“I have broken up with my boyfriend. Can you help me?”

are asked by many ChatGPT users every single day. However, even though ChatGPT is well trained, it cannot act like a professional therapist, and its answers should not be relied on. Several studies suggest that ChatGPT lacks detailed medical knowledge and can therefore misinform users who treat it as a therapist.

3. Do not type prompts about illegal activities into ChatGPT.

This is another serious case: many users ask questions about illegal acts without knowing that, at the end of the day, AI companies may review some prompts to detect misuse. Beyond that, the AI system can flag such content as suspicious and land you in trouble. Here are some examples of questions about illegal activities:

“Hello ChatGPT! I have no money, so how can I rob a national bank?”

or

“Hey, my dear friend ChatGPT! How can I hack the president’s WhatsApp without him noticing?”

4. Do not ask ChatGPT self-harm or suicide questions.

“Luca Cella, a 16-year-old boy, committed suicide after asking ChatGPT how someone can kill himself successfully,” The Guardian reported.

As this case shows, ChatGPT users should keep in mind how serious and sensitive this issue is. Self-harm requires urgent, professional help and must be reported to qualified professionals, not handled by a chatbot.

5. Never ask ChatGPT to decide things that require human judgment.

Another kind of question you should never type into ChatGPT is one that requires your own judgment as a human being. Decisions like these can have serious effects on someone’s future. Some examples of those questions are:

“Should I leave my job?”

or

“Should I divorce my husband or wife?”
