this post was submitted on 19 Jun 2023
23 points (96.0% liked)

[–] [email protected] 3 points 1 year ago (2 children)

Leaking industry secrets is a much bigger concern than boosting productivity a little bit.

We're talking about very specialized engineering work; it's not something you can fully rely on a bot to do, though it might help sometimes. It's entirely understandable for specialized companies to want to ban GPT internally until there's a way for them to host a fully internal one.

[–] [email protected] 2 points 1 year ago

> We're talking about very specialized engineering work,

We're not, though. This isn't a policy preventing them from talking about specific company IP (which is almost certainly covered by existing NDAs already). It prevents them from using the tool internally at all.

I use ChatGPT at work all the time, usually for getting very specific information about products I have to integrate with, quickly parsing new API documentation, and learning about unfamiliar processes at a conceptual level before I have to dive deeper for a project. It's more the context around which I'll be building the specialized IP. It's the sort of stuff I could learn via Googling (or sometimes Stack Exchange), but I can learn it faster and in a more targeted manner by asking the chatbot detailed questions.

[–] [email protected] 0 points 1 year ago (1 children)

On this I agree entirely. The potential for corporate espionage because of unwitting employees using an LLM through unofficial means is huge.

At the very least, the corporation itself, not the individual employee, would have to be the customer, so that watertight terms could be negotiated.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

I don't think being a customer would work either. Language models are still being trained, and no one knows exactly how user queries are used; that's a big no-no for any company that has to protect its secrets.

A self-hosted instance is a much better solution, if not the only "safe" one from that point of view. We'll get there.
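To illustrate the self-hosting idea: a minimal sketch, assuming a locally hosted model server (such as Ollama) exposing an OpenAI-compatible endpoint on localhost. The endpoint URL and model name here are assumptions for illustration, not a vendor recommendation; the point is that the prompt never leaves the company network.

```python
import json
import urllib.request

# Assumed local endpoint (Ollama's default OpenAI-compatible path);
# adjust for whatever self-hosted server the company actually runs.
LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"

def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a POST request for an OpenAI-compatible local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# The request targets localhost only; nothing is sent to a third party.
req = build_request("Summarize this internal API doc ...")
```

Sending the request with `urllib.request.urlopen(req)` would return the completion from the local model, with query logging and retention entirely under the company's control.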