Hello everyone,
I apologize if this debate has already taken place. If so, please delete the post and kindly point me to where I should send my message.
We all know that in technology, whether hardware or software, there are always points where one has to place trust in the developer(s). Some of that is unavoidable to change in the short term, so that's not where I'm focusing my point.
But something bothers me about "open-source" applications. I don't know how to compile, and I'm not willing to dedicate that many hours of my life to learning it. So, in addition to trusting reputable companies, I now choose to trust a reputable person or group, who likely receives audits of their open-source code. However, those audits are based on the source code, not on what actually gets compiled and shipped to me as the end user. In the end, each project is a bucket of trust unless I know how to compile it myself. And even then, something might still slip past us, but I understand it would at least reduce the risk. I read that F-Droid did this: they didn't trust the app creator, but instead compiled their own version from the open-source code. That seemed fantastic to me, but the problem was always the delay.
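What F-Droid is doing here is essentially a reproducible build: if compiling the published source yields a bit-for-bit identical binary, anyone can check that the distributed artifact matches the audited code. A minimal sketch of that final verification step, assuming a deterministic build (the file paths are hypothetical):

```python
import hashlib


def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def builds_match(local_build: str, distributed: str) -> bool:
    """True if a locally compiled binary is bit-for-bit identical
    to the binary the developer (or app store) distributes."""
    return sha256_of(local_build) == sha256_of(distributed)
```

Of course, this comparison is only meaningful when the build itself is deterministic (embedded timestamps, build paths, and the like are normalized), which is exactly what reproducible-builds efforts aim for.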
The question is: couldn't an AI-powered program be created to compile any GitHub repository directly? It would eliminate the need to trust even the developer themselves; we would only have to trust their code, as we already do today, and audits would finally have real value. We would know that what we receive is that code.
I would also love for Flatpak to work like this: the developer doesn't sign the binary, only the code, and Flathub (or some other backend) builds everything automatically. And if there are doubts about Flathub or any other backend, users could build it locally. That would be a bit more tedious, but its value for privacy would be enormous.
By the way, if any of this already works this way and I am confused, please enlighten me.
Thank you very much, everyone!
In theory, AI might be able to analyse the project files and figure out what kind of compiler and configuration are needed, which could then be run automatically. Is that what you're describing: some kind of AI-powered, user-friendly interface that lets you compile the project on your own machine? Because what you wrote sounds like you want the AI to actually read the source code and produce machine code from it. Also, if you use e.g. Arch Linux, there is an entire user repository consisting of build scripts which often let you compile a package with a single command. This seems similar to what you're describing in regards to Flatpak. However, since these scripts are typically written by a third party, that adds another layer of distrust.
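For what it's worth, the "figure out what kind of compiler and configuration are needed" step doesn't strictly need AI for common cases; a lot of it is plain file inspection. A toy sketch of that idea (the build-file names are the usual conventions; the suggested commands are only illustrative):

```python
from pathlib import Path
from typing import Optional

# Map well-known build files to the command that typically drives them.
# Checked in order, so more specific markers come first.
BUILD_HINTS = {
    "CMakeLists.txt": "cmake -B build && cmake --build build",
    "meson.build": "meson setup build && ninja -C build",
    "Cargo.toml": "cargo build --release",
    "pyproject.toml": "pip install .",
    "Makefile": "make",
}


def suggest_build_command(repo_dir: str) -> Optional[str]:
    """Look at the top level of a checked-out repo and guess how to build it."""
    for build_file, command in BUILD_HINTS.items():
        if (Path(repo_dir) / build_file).is_file():
            return command
    return None  # unknown layout; a human (or an LLM) has to read the docs
```

An AI layer would only really be needed for the long tail: projects with unusual layouts, custom scripts, or build instructions buried in a README.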
You're correct: I'm suggesting a user-friendly AI interface to assist with compilation, not AI that produces machine code directly. The idea is to increase transparency and trust, especially for non-technical users. The Arch Linux scripts you mentioned are indeed similar to my idea, but as you noted, the third-party involvement may raise trust issues. Hence, AI might add an extra layer of verification, making the process more understandable. It's a complex issue worth exploring as technology continues to evolve.