Watch out for ChatGPT's single-narrative answers

Ranjit Damodaran
2 min read · Mar 5, 2023

This is a continuation of my previous blog post about ChatGPT.

Photo by Etactics Inc on Unsplash

Now that ChatGPT has been in the public realm for some time, it is high time to write more about my findings.

ChatGPT and Bing Chat are extremely addictive. You can ask direct questions and get crisp answers, typically within 10 to 20 lines. There is no need to wade through the links in Google's results; it was always a pain to dig the answer out of the pages Google points to. ChatGPT eliminates all of that and gives concise answers. However, here comes the catch.

The answers you get from ChatGPT are always a single narrative. It doesn't give you options; it stays confined to the framing of the question you ask.

The answers I get depend on how I phrase my questions. For instance, if I ask ChatGPT, “Should I buy a share of xxx company?” it tells me to buy it and gives me reasons supporting the purchase. But if I ask, “What are the drawbacks of buying a share of the same company?” it tells me to avoid it and explains why. The first question implies that buying the shares is good, while the second implies otherwise. We should be careful about how we frame our questions.
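If you want to try this framing experiment yourself, here is a minimal sketch that sends both framings of the same question to the model, assuming the openai Python library and its chat completions endpoint (the model name and the prompts are illustrative choices, not from this post):

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; set your own key

# The same underlying question, framed two opposite ways.
prompts = [
    "Should I buy a share of xxx company?",
    "What are the drawbacks of buying a share of xxx company?",
]

for prompt in prompts:
    # One chat completion per framing (assumed model: gpt-3.5-turbo).
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    print("Q:", prompt)
    print("A:", response.choices[0].message.content)
    print()

Reading the two answers side by side makes the slant of each framing easy to spot.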

In hindsight, it is a bit obvious: ChatGPT derives its answers from text on the net.

Sometimes it gives us specific and authoritative answers to questions that have no direct answer.


Ranjit Damodaran

Tech enthusiast, Project Management. Interested in Complexity science, Economics, Psychology, Philosophy, Human Nature, Behavioral Economics, almost anything.