
You are being brain-drained.

How AI companies quietly change the policies that govern your work.

Slack joined a growing group of companies that feed YOUR private data to AI models.
They did it quietly — almost in the middle of the night — by just changing their terms and conditions.

Your work and activity are being parsed on a massive scale, often by the apps you use every day, like Figma.

In the case of Slack, that means all the NDA-protected material in your private workspaces, your client work, company secrets, financial reports, and more, is being sent to an AI to “learn” from.


But hey, there’s good news!

You CAN opt out of this!

Of course, the right way would be to make this opt-in instead of opt-out, and features like that should never be enabled by default. But at least they’re letting you just click a button in the settings and …
Of course not. That would be too easy.

The way to opt out is by SENDING AN EMAIL to feedback@slack.com. The email needs to be sent from the main admin email account and must contain “Global model opt-out request” in the subject line.

Then just paste your workspace name or URL into the body and click send.
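
For reference, an opt-out email following those instructions might look something like this (the workspace URL below is a placeholder, swap in your own):

    To: feedback@slack.com
    Subject: Global model opt-out request

    Please opt our workspace out of global model training.
    Workspace: https://your-workspace.slack.com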
That’s convenient! They could’ve asked you to print the request, go to the post office, and mail it to them, like many companies now do with refund requests.

(yes, that practice is illegal!)


Figma and OpenAI

Figma, a popular design tool used by about 80% of the industry, has a deal with OpenAI to use their models for what they call “AI features and product improvements”.

The description of what exactly they do is deliberately vague, since vagueness lets them broaden the scope later with just a few tweaks to the policies.

They do, however, state that your files, designs, and other work are NOT used to train AI models. They want to emphasize this so much that they repeat it twice in a single short paragraph.

Figma’s agreement with OpenAI provides that data is not to be used for model training. Data inputted into AI features is sent to OpenAI for processing and generating AI output. Data is temporarily retained in OpenAI’s environment to provide the services, however it is not used for model training.

But notice the wording: data is NOT to be used. That is also vague, and it carefully avoids the phrase “will never be used”. It leaves a small window open for a “but…”: rare occasions where data “can” be used to train models.

OpenAI is also famous for pushing back against oversight and regulation (growth at all costs), which recently led many of its senior people to quit.


Vague terms leave a window

Now don’t get me wrong.

I’m not suggesting Figma is currently training AI models on your work so they can launch an automatic design generator and replace humans.

If they did and it came out, it would be a PR disaster that could destroy the company. They probably aren’t doing anything like that.

But why the vague descriptions? Why not be completely open and clear about it?

If your data is only used to replace the lorem ipsum in your text fields, that should be clearly communicated as such.

And most of all, why do you have to OPT OUT MANUALLY?
Things like this should always be an OPT-IN feature. It doesn’t matter what the data is used for.

Also: if you’re on the free plan, using drafts to learn design or to work with your first freelance clients, you cannot opt out at all.


AI companies are going crazy

While chasing growth and fighting fierce competition, AI companies often cross the line. Sure, we could simply trust that they’re all good people telling the truth when they say they’re not really training models on your data.

But remember how Adobe praised their ethical approach of training Firefly AI only on free, open-source images?
Well, it turned out they were actually training it on Midjourney images, and Midjourney itself is frequently accused of unethical practices and copyright violations.

And yet Adobe … promised.


So, just to be safe, always opt out of AI features. Just in case.

And especially when your work is NDA-protected, because there have been leaks of private data from chatbots before. If someone hacks the AI service that stores your files, you can end up paying hefty penalties to your clients.


Apple shows the way

Apple has taken a different approach to privacy and made app tracking OPT IN instead. That doesn’t suddenly mean they’re all good and ethical, as most big companies do some evil things to win, but at least this is a step in the right direction.

This exact thing should also happen with AI tools.

I’ll repeat it one more time: check what the apps you use do with your data, and whenever you can, opt out of AI parsing of ANY kind. Even if the data is supposedly not used to train models.


