You sound like an old man yelling about the TV. LLMs are NOT unhelpful. You’d know this if you actually used them.
I’ve used them and have yet to get a fully correct result on anything I’ve asked beyond the absolute basics. I always have to go in and correct some aspect of whatever it shits out. Scraping every bit of data they can get their hands on is only making the problem worse.
To say you’ve never gotten a fully correct result on anything has to be hyperbole. These things are tested. We know their hallucination rate, and it’s not 100%.
Please read the entire comment. Of course it can answer simple stuff. So can a Google search. It’s overkill for simple shit.
In all of your replies, however, you fail to provide a single example. Are they writing code for you, or creating shitty art for you?
I have used them in a large variety of ways, from general knowledge seeking to specific knowledge seeking, writing code, generating audio, images, and video. I use them most days, if not every day. What examples would you like me to provide? Tell me and I will provide them.