This is such a silly take.
Not trusting ChatGPT, which is known to hallucinate false information, as your primary search method is a silly take? AI was telling people to put glue on pizza to keep the cheese from falling off. If you can see that the source of that information is a Reddit shitpost, you are way more likely to make a good judgment call about its veracity.
If you want searches without sponsored results, use SearXNG or an equivalent that strips out the ads.
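For anyone curious, here's a rough sketch of hitting one programmatically. It assumes a self-hosted SearXNG instance at localhost:8080 with JSON output enabled in settings.yml (the json format is off by default), so treat the URL and setup as placeholders for your own instance:

```python
# Minimal sketch: query a self-hosted SearXNG instance and print results.
# Assumes the instance at http://localhost:8080 has "json" enabled under
# search -> formats in settings.yml; adjust the URL for your own setup.
import requests

def searx_search(query, instance="http://localhost:8080"):
    """Return (title, url) pairs from a SearXNG instance, no sponsored results."""
    resp = requests.get(
        f"{instance}/search",
        params={"q": query, "format": "json"},
        timeout=10,
    )
    resp.raise_for_status()
    return [(r["title"], r["url"]) for r in resp.json().get("results", [])]

if __name__ == "__main__":
    for title, url in searx_search("toilet flapper replacement")[:5]:
        print(f"{title}\n  {url}")
```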
You can actually ask for its sources now and fact-check yourself. But just like anything you read online, use common sense. I'd see those same results in a Google search too.
Do you ask for the sources every time?
If it's something serious, yes, like fixing something. I also use it as an idea generator. I needed to figure out why my toilet wasn't flushing, and it suggested the flapper. Once it pointed me in a direction, I went to YouTube and looked up a video on how to replace it.