Sure, ChatGPT isn’t actually intelligent, but it’s a good approximation. You can ask ChatGPT a technical question, give it a ton of context for that question, and it’ll “understand” all the information you’ve given it and answer you. That’s much more akin to asking an expert human who takes in the info, understands it, and answers, versus trying to find the answer via a search engine.
For me and other people in my life, ChatGPT has been intensely helpful job-wise. I do double-check any info it gives, but generally it’s been pretty solid.
ChatGPT and other LLMs are ideal for cases where 100% accuracy is not required. If you’re ok with getting wrong answers 80-90% of the time, then you have a legitimate use case for LLMs.
In terms of technical questions, especially older Microsoft-related stuff, it does very well. My experience hasn’t been anywhere near 80-90% wrong answers. It all depends on the topics you’re asking about, I suppose.
Just like humans.