Using ChatGPT for anything more than repetitive or random generation tasks is a bad idea, and its usefulness becomes even more limited when you're working with proprietary code that you can't directly send to ChatGPT. Even with personal projects, I still try to avoid ChatGPT as much as possible for the simple reason that I'll be forced to pay for it if it becomes an essential part of my workflow when it leaves this free beta testing phase.
Exactly this! I hate hearing politicians and rulemakers discuss how ChatGPT and LLMs are going to be relevant everywhere and how ChatGPT should already be incorporated into education. They literally call it a "research preview"; you can only assume that once they've gathered enough data, they'll shut it down, or at least sharply reduce its capacity.
With that said, I really enjoy using it. Mainly for brainstorming topics or new projects, and what technologies to use in them. Sometimes I also find a use for it as a therapist, for social topics where I don't really know who to ask, and where I'd expect a generic reply anyway.
On the politicians / rulemakers side of things, that may or may not be a good thing tbh. Technology moves so fast, and traditionally the aforementioned groups are glacial and can't keep up, sometimes to the benefit of a small group, often to the detriment of the majority. Having this on their radar relatively soon is potentially a useful change.
While it's nice that politicians are enthusiastic about new technologies, I think ChatGPT is one example where they shouldn't force mass adoption. ChatGPT is a proprietary model owned by a private corporation, and it's made very clear that interaction data with ChatGPT will be collected and used by OpenAI for its business. It's horrible for data security and it helps to strengthen OpenAI's monopoly. Honestly, governments recommending privately owned software and technologies should be considered advertising.
That side of it I wholeheartedly agree with. Perhaps I'm just deluding myself into thinking technology awareness early on makes for better legal infrastructure to handle its effect on society. I really would like that to be the case.
But yeah agree, "ChatGPT" being synonymous with "groundbreaking AI" to the vast majority of the public (I suspect) is not great from a monopoly perspective.
governments recommending privately owned software and technologies should be considered advertising.
Is this not also true if the software is open-source? It's still advertising, but it's somehow ok because a corporation doesn't benefit? It's not that I don't agree with you - regulatory capture and vendor lock-in are much less of a concern for free and/or open-source software, but that doesn't mean it's not still advertising.
That's true
One thing I used ChatGPT for recently was generating test data.
Hey ChatGPT, I use SQL Server and here is my table structure, please generate an insert query with 10 rows of fake test data.
It wasn't perfect, but honestly nor is the test data I would have written. It was a great starting point and saved me a lot of time since this is a legacy app with some wide tables (30+ columns).
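For a sense of what that prompt replaces, here's a minimal Python sketch that builds the same kind of multi-row SQL Server INSERT by hand; the table name (`dbo.Customers`) and columns are made up for illustration, since the real legacy schema isn't shown here:

```python
import random
import string

# Hypothetical table layout -- the real 30+ column legacy schema isn't shown above.
COLUMNS = ["FirstName", "LastName", "Email", "Age"]

def random_name(length=8):
    """Return a random lowercase string to stand in for a name."""
    return "".join(random.choice(string.ascii_lowercase) for _ in range(length))

def fake_row():
    """Build one row of fake values matching COLUMNS."""
    first, last = random_name(), random_name()
    return [first, last, f"{first}.{last}@example.com", random.randint(18, 90)]

def insert_statement(table, rows):
    """Render rows as a single multi-row INSERT statement."""
    def render(value):
        # Integers go in bare; everything else gets quoted.
        return str(value) if isinstance(value, int) else f"'{value}'"
    values = ",\n".join(
        "(" + ", ".join(render(v) for v in row) + ")" for row in rows
    )
    return f"INSERT INTO {table} ({', '.join(COLUMNS)})\nVALUES\n{values};"

sql = insert_statement("dbo.Customers", [fake_row() for _ in range(10)])
print(sql)
```

Even this toy version shows why handing the chore off is attractive: with 30+ columns, the boilerplate grows fast while the logic stays trivial.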
Me too! I used it recently to generate some fairly specific test data that would have taken me probably 30 minutes of massaging instead of the 30 seconds of creating the right prompt. So helpful!
I've been using ChatGPT at work quite a bit now. Some of the things I've used it for are:
- Writing a shell script that scrapes some information about code modules and displays it neatly
- Minor automation scripts that set up and simplify my day-to-day Docker workflow
- Writing random regex, sql, lua pattern matching functions
- It turned out to be surprisingly good at creating code examples for certain undocumented APIs (kong.cache, kong.worker_events, kong.cluster_events) in Kong API Gateway.
- Copy-pasting a rough Python automation script, converting it to Go, and adding it to the application itself.
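The first and third bullets above are the sort of small utility these tools draft well. As a hedged Python sketch of the "scrape module info and show it neatly" idea, with a made-up listing format standing in for whatever the real modules look like:

```python
import re

# Hypothetical module listing -- the actual format from the bullet above isn't shown.
listing = """\
auth-service  v2.3.1  maintained
billing       v0.9.0  deprecated
gateway       v1.12.4 maintained
"""

# One regex captures the module name, version, and status on each line.
MODULE_RE = re.compile(
    r"^(?P<name>[\w-]+)\s+v(?P<version>[\d.]+)\s+(?P<status>\w+)$",
    re.MULTILINE,
)

modules = [m.groupdict() for m in MODULE_RE.finditer(listing)]

# Show the results in aligned columns.
for mod in modules:
    print(f"{mod['name']:<15} {mod['version']:<8} {mod['status']}")
```

The value isn't the regex itself so much as not having to fiddle with named groups and `MULTILINE` flags by hand for a throwaway script.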
I still don't feel comfortable using it for anything big.
I haven't found a good use case for it yet personally, but I am excited to hopefully get it to start helping with the boring boilerplate and similar things.
I am a little afraid that it'll be the first bit of common tech I am too old to really grok and I'll just be that old dude in the corner that insists on doing it how everyone did "back in my day."
I actually use it a lot, especially for stuff like "I need to change this code to something like this". It's usually pretty spot on and has saved me a lot of typing. Complex code not so much, at least not yet. I think it's capable of it to some degree but I've found I much prefer to offload the "mundane or tedious" stuff to it.
Our understanding is we can't use it at work, and I do have a personal project going at the moment, but maybe I'll give it a shot for the next one.
I tried to use it, but it has some big reliability issues, because at the end of the day, despite the dataset it's trained on, it's still something I'd describe as "language interpolation."
It sometimes makes TERRIBLE recommendations for which tools/libraries I should explore, because it assumes those libraries support what I need. They never do, and I wasted weeks because of it. (It doesn't help that both the code and the project are undocumented.)
So after that experience, I've demoted ChatGPT's usefulness to just "cleaning up pre-written documentation so it sounds better." That's it.
I personally don't trust ChatGPT, so whenever I use it I study the code it generates until I understand what it does. The truth is, it often takes me more time to understand that code than it would have taken to write it myself.
It does wonders for repetitive tasks or data generation, though.
I believe in any place where security is critical, developers should not use AI-generated code irresponsibly.
I usually use it more to help me write documentation and add comments to functions. It's good at explaining what a function does.
To write code, I usually just use it for simple functions or a template to start from. I avoid using it with external libraries because, in my experience, it likes to "invent" functions and methods that aren't implemented.
I primarily use ChatGPT to ask about how to use X, Y, Z software libraries together, and other general programming questions. Unfortunately, the project that I work on is proprietary, so I'm not allowed to just feed ChatGPT my code and tell it to correct/refactor it. But that's fine, because ChatGPT still provides a lot of value to me for free.
The link is paywalled, but at least private/incognito windows still work on Medium.