What worries me about AI tooling

on 2022-07-17

I was listening to Linux Downtime episode 51, and while it was nice to hear that Copilot has been helpful for open source developers, it still worries me.

First off, I still don't like that they're selling the tool. A proprietary tool.

Like, I guess I'm glad my awful code could benefit some system that helps solve problems. But I am not glad that some big billion-dollar software house took it and sold it. It's unethical to me, but corporations have never cared about ethics. So I'm basically yelling at a wall.

Anyhow, that's not my main worry.

The actual worry is that suddenly everything has its own AI-assisted service, and that you'll soon be expected to use them to live "normally." We already live in this subscription bs, and now AI gets slapped on top of it as another layer.

Now, the current subscription bs is not necessary for you to live, but AI can easily become something everyone is expected to have to help them, where anyone who doesn't use it is considered slow. Because in the end, it's a productivity helper.

And that, of course, kicks out anyone who is too poor to have one or simply doesn't want AI-assisted tooling.

This will keep happening if the AI stuff is kept proprietary. It may be the next "internet" style of thing, it may not. But if we want to make AI tooling more widespread, we really need some kind of open standards for it, and all AI tooling should be open and free.

I am actually not against proprietary software in the first place, or even proprietary systems. But tools are different to me, precisely because they're tools. Anyone should be able to build their own tools for free. It gives everyone a chance to start.

Another thing is that if all AI tooling is kept closed, who knows what they're doing behind the scenes. GitHub Copilot, for example, could very easily be gathering more material for its training database from places it has no legal access to, like GitLab repositories.

Now imagine dozens of these AI tools competing against each other... I just don't think it's going to be sustainable, and I am quite sure code quality will suffer from it.

I'll keep avoiding AI tooling for now

But yeah, as always, corporations don't care about the ethics of things. And that's why I don't want to use Copilot.

Besides, I feel like my ability to learn would suffer a lot if I could just press the tab key and have everything magically appear on my screen. Learning is one of the main reasons I like programming, and I don't want to take that away from myself.

Honestly, the only way I am going to trust any kind of AI tooling is if I can peer behind the scenes, even if I don't understand everything that's going on.

And why don't I trust it? Well, would you trust a random passerby suddenly telling you how to write a program? They may be correct, but would you trust them?

But that could be because I'm Finnish. :P