I find myself going to ChatGPT for this stuff now.
“I’m trying to do something like [concept]. What is that called and can you give me an example”
Usually I get my results faster and easier than Google.
Be careful using it as your only source of truth, even more so when you don't know exactly what you're searching for.
If it spits out the wrong syntax my compiler will tell me immediately.
While I've never had it happen, it could give you wrong command-line switches that do damage. For example, when I asked how I could list the volumes attached to an AWS instance, it gave me a "modify-volume" command instead of a "describe-volumes" command. Thankfully, I caught that before I copy-pasted it.
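For reference, the read-only version of that lookup with boto3 looks roughly like this (just a sketch, and the instance ID is a placeholder):

    import boto3

    # Describe volumes attached to a given EC2 instance. This is a read-only
    # call, unlike modify-volume, which actually changes the volume.
    ec2 = boto3.client("ec2")

    response = ec2.describe_volumes(
        Filters=[{"Name": "attachment.instance-id", "Values": ["i-0123456789abcdef0"]}]
    )

    for volume in response["Volumes"]:
        print(volume["VolumeId"], volume["Size"], volume["State"])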
had a similar problem searching for gcloud commands
Oh yes. With that sort of thing you'd better double-check each time.
It’s bad enough at programming that you can often see the problems without the help of the compiler
The last thing I asked it for still had undeclared variables and calls to imaginary libraries after the fourth draft (which, if they existed, would be great).
It was good for coming up with a nice structure for a small program
You can say the same for Google
You can ask it for sources now with browser integration. Previously the browsing feature was a separate model running GPT-3.5, which was pretty bad; now it's just integrated into GPT-4. It works a million times better, and it's great that it doesn't break the flow of the conversation.
I had been emailed a question that I didn't really know what to do with, so I asked Copilot to answer the email factually. I sent that email with a note about its AI origin, but it was close enough and got us onto the right track.