This series was in response to another thread [1] which wanted to make rust mandatory in an upcoming release.
The author's proposal was instead to take the middle ground: use Rust as an optional dependency until a later point in time, when it becomes mandatory.
That later point was tied to when Rust support lands in GCC, which would make the transition smoother, since platforms that support GCC would also be covered.
GCC has been hit and miss, though. Nobody uses gcj, for example. I rather doubt they'll be able to implement a good compiler for a language that doesn't even have a standard without that implementation going wildly out of date, just like what happened with Java.
There are two different routes by which Rust support can come to GCC: adding a Rust frontend to GCC (gccrs) and adding a GCC backend to the Rust compiler (rustc_codegen_gcc). The latter approach would not be as susceptible to implementation divergence as an independent frontend.
Maybe I'm just old and moany, and I need to step aside for bigger and better things such as Rust.
But.
Rather than needing to understand just C to work on Git or the kernel, you now also need to know Rust. The toolchain complexity is increasing, and the mix of these languages raises the barrier to entry.
I'm highly invested into Git, having learned the tooling and having a significant number of projects constructed within it. I've written my own Git clients and have built a web server around Git repositories. I don't want to lose the hack-ability of Git.
> I'm just old and moany, and I need to step aside for bigger and better things such as Rust.
You are. This is firm "I don't want to have to learn new things" territory, which isn't a viable attitude in this industry.
In any case Rust is usually easier than C (excluding buggy C which is very easy to write), and certainly easier than actually learning the Git or Linux codebases.
We might also have different priorities. I do not care too much that Google and Apple want to lock down their smartphone spyware and sales platforms. The supply chain risks and maintenance burden imposed on me by the Rust ecosystem are much more of a concern.
I don't know what this has to do with locking down phones, but I do appreciate not getting compromised just for cloning a repo or opening my laptop at a coffee shop.
This is not what I said, but memory safety is certainly not a high priority for my own security. I still think memory safety is important, and I also think Rust is an interesting language, but... the hype is exaggerated and driven by certain industry interests.
Rust isn't popular just because of memory safety though. I think the memory safety message is maybe a little too loud.
It's also a modern language with fantastic tooling, very high quality library ecosystem and a strong type system that reduces the chance of all kinds of bugs.
It's obviously not perfect: compile time is ... ok, there aren't any mature GUI toolkits (though that's true of many languages), async Rust has way too many footguns. But it's still waaaaay better than C or C++. In a different league.
While I really really want devices I can own, I don't want to compromise security to do it. We need to do two things:
1. Lobby politicians to write laws that allow us to actually own the devices we bought.
2. Stop the FUD that a device that can be jailbroken is insecure. I heard this from our frigging CSO, of all people, and it's patently false, just FUD by Apple and Google who want you to be afraid of owning your device.
I want a device that's as secure as possible, but that I can own. I don't want to hack my own self just to get what I paid for.
It is a sad thing, but I do root against secure boot initiatives, because they almost entirely work to limit users' freedom instead of improving their security.
> You are. This is firm "I don't want to have to learn new things" territory, which isn't a viable attitude in this industry.
It's viable, but limiting. Sometimes you have to do things you don't want to, which is why it's called work. But if you can choose what platforms you work on, you can orient towards things where things change less, and then you don't need to learn new things as often.
Chances are, if you get into the weeds in a lot of C programs, Rust is in your future, but it's viable to not want that, and to moan about it while doing it when you need to.
It's not "having to learn something new", but "having to be good at two things, both of which are full languages with their own specifics, problems and ways to solve them, two sets of compilers, and some duct tape to hold them together."
It's like putting steak on a pizza... pizza is good, steak is good, steak on a pizza might be good too, but to actually do that in production, you now need two prep stations and you can't mess up either one.
The idea that "new is automatically better and everyone not keeping up is lazy" is incredibly flawed and stupid.
Get this: Introducing a secondary language into the toolchain is a tradeoff in complexity vs the new language's proposed "benefits".
It's far better to introduce an AI code tool that fixes the unsafe code rather than increasing complexity with a second language. Open source C devs, get working on it. edit: wait, don't we have static code analyzers?!
Rust is over 10 years old now. It has a track record of delivering what it promises, and a very satisfied growing userbase.
OTOH static analyzers for C have been around for longer than Rust, and we're still waiting for them to disprove Rice's theorem.
AI tools so far are famous for generating low-quality code, and generating bogus vulnerability reports. They may eventually get better and end up being used to make C code secure - see DARPA's TRACTOR program.
The applicability of Rice's theorem with respect to static analysis or abstract interpretation is more complex than you implied. First, static analysis tools are largely pattern-oriented. Pattern matching is how they sidestep undecidability. These tools have their place, but they aren't trying to be the tooling you or the parent claim. Instead, they are more useful to enforce coding style. This can be used to help with secure software development practices, but only by enforcing idiomatic style.
Bounded model checkers, on the other hand, are this tooling. They don't have to disprove Rice's theorem to work. In fact, they work directly with this theorem. They transform code into state equations that are run through an SMT solver. They are looking for logic errors, use-after-free, buffer overruns, etc. But they also fail code for non-terminating execution within the constraints of the simulation. If abstract interpretation through SMT states does not complete in a certain number of steps, then this is also considered a failure. The function or subset of the program only passes if the SMT solver can't find a satisfiable state that triggers one of these issues, through any possible input or external state.
These model checkers also provide the ability for user-defined assertions, making it possible to build and verify function contracts. This allows proof engineers to tie in proofs about higher level properties of code without having to build constructive proofs of all of this code.
Rust has its own issues. For instance, its core library is unsafe, because it has to use unsafe operations to interface with the OS, or to build containers or memory management models that simply can't be described with the borrow checker. This has led to its own CVEs. To strengthen the core library, core Rust developers have started using Kani -- a bounded model checker like those available for C or other languages.
Bounded model checking works. This tooling can be used to make either C or Rust safer. It can be used to augment proofs of theorems built in a proof assistant to extend this to implementation. The overhead of model checking is about that of unit testing, once you understand how to use it.
It is significantly less expensive to teach C developers how to model check their software using CBMC than it is to teach them Rust and then have them port code to Rust. Using CBMC properly, one can get better security guarantees than using vanilla Rust. Overall, an Ada + Spark, CBMC + C, Kani + Rust strategy coupled with constructive theory and proofs regarding overall architectural guarantees will yield equivalent safety and security. I'd trust such pairings of process and tooling -- regardless of language choice -- over any LLM derived solutions.
In my experience current AI is still far from reasoning about the kind of hard-to-spot bugs in C that lead to the worst exploits. Rust solves most of these by design. It isn't about adding a second language - it is about slowly phasing out a language that is being misused in areas it shouldn't be in.
C will at some point be relegated to being an educational language, incredibly valuable due to few but good abstractions over assembly. It will continue to exist for decades in most systems, but hopefully it won't be used outside of the maintenance of legacy systems.
Perl, Tcl and Python are all written in C, as are many shells, so despite their interdependency, the whole chain can be satisfied with a C11 compiler.
I did check this out. The shell, perl and python are likely for scripting and not used during runtime. TCL is likely some form of dynamic scripting.
I think we also have to be honest about what the project here is too: it's not to have both C and Rust together, but to replace all C with Rust. In which case, it probably makes sense to just clone the repo and work on a fork like they did with SSH.
> The shell, perl and python are likely for scripting and not used during runtime.
Some git subcommands are implemented in these. git filter-branch is a shell script, git cvsimport is a Perl script, and git p4 (perforce interop) is a Python script. There are not too many left these days (git add -p/-i also used to call a Perl script), but they exist.
I'm sure you're aware why, reading between the lines of what you said, but for others who aren't familiar with the history of git: it was originally about 50% C and 50% Perl. The performance-critical parts were written in C, and various git commands were written in Perl. Over time almost all the Perl was removed because there were fewer Perl monks than C devs.
Now it would seem the logic is reversed; even though there are fewer Rust devs than C devs, Rust is going to replace C. Maybe now that git is large enough and entrenched enough, such a move can be forced through.
> it was originally about 50% C and 50% Perl, the performance critical parts were written in C and then various git commands were written in Perl.
IIRC, it was mostly shell, not Perl, and looking at the proportion is misleading: the low-level commands (the "plumbing") like git-cat-file or git-commit-tree were all in C, while the more user-friendly commands (the "porcelain") like git-log or git-commit were all shell scripts calling the low-level commands. Yes, even things we consider fundamental today like "git commit" were shell scripts.
I believe gitk and git-gui are written in tcl. Those are definitely things that get shipped to the user, so (at least for those parts) you wouldn't need to have a toolchain on the build server.
A number of the git commands were implemented in perl and shell. Now I see only git-svn is perl here for me and there's still a few shell scripts in /usr/libexec/git.
Agreed. And if someone is interested in contributing to the Linux kernel, a new programming language is far from the hardest thing that they need to learn...
Except now these software engineers have to code switch between languages.
Could you software engineers stop making things harder for yourselves and playing this meaningless flex of a status game, and you know, focus on something tangible, meaningful, instead of adding more bureaucracy?
I'm guessing you aren't a software engineer based on this comment, but the difference between programming languages is tangible and meaningful. It isn't like human languages, which are basically the same and achieve the same things.
And code switching between languages is not hard at all.
It's hilarious that you can assume such a thing from just a couple of words on the internet. Or maybe I'm not a 'software engineer' by your standards because, unlike your closed group of SWEs, I'm a lot less focused on resume padding and a lot more on keeping my codebase sane and not exploding in complexity.
I should specify - it's hard in that it's troublesome to have to code switch and do a bunch of recall before working on the thing.
Say you've not worked in this secondary language for a long time, which absolutely happens, and have to spend hours of effort recalling it. That's time you shouldn't need to spend, but it's how your memory works.
I didn’t make the assumption but it sounded like a reasonable assumption based on the pronouns you used. You said “could you software engineers stop making things harder for yourselves.” A reasonable interpretation of this is that you aren’t a software engineer.
Reinforced softly by the rest of your comment not being technically sound. Adding a second language that is meaningfully different in its strengths and weaknesses isn’t “bureaucracy”. Bureaucracy is more like “sign a CLA before you can contribute”.
> I should specify - it's hard in that it's troublesome to have to code switch and do a bunch of recall before working on the thing.
You don't sound like you have much experience working on software projects. I can tell you it's not hard to switch between programming languages. If anything, the difficulty lies in onboarding onto projects you are not familiar with; the programming language in use is far from a relevant factor if you already know it.
Be prepared to argue semantics all day when it comes to rust community.
And be prepared to read that rust is "obviously" not hard despite there being literature stating the contrary. Including projects that migrated from rust to other languages. And projects where rust was considered and discarded.
They will write entire books worth of comments around the definition of simple. And why compiling rust isn't "ackchually" slow.
I'm in the same boat :) But no worries. You can always build and use older git without Rust. Of course, it will only work for a while, until those kids change the proto for the "better". And being old and grumpy also means you can slowly care less and less about all of that :)
Kids: now downvote it into oblivion :) Like I give a shit...
I've also sent some patches git's way and I can't say I'm thrilled about being forced to (finally) learn Rust if I want to contribute again in the future. I guess I'm outdated...
They're proposing porting over one small piece that has no dependencies and exposing it to the rest of git via a C interface. Yes, they'll presumably port more over in the future if it goes well, but it's a gross exaggeration to characterize this as somehow making it impossible to contribute without knowing Rust.
I know that it is a "slippery slope" argument, but in the future, it will become more difficult to contribute without knowing Rust. That's the entire point of introducing it.
I understand that it's a minor change in its current state. However, it is a fact that the long term goal is to port everything to rust. Once that goal is accomplished, rust will be required. So it is not at all a gross exaggeration. It's a prediction of the future.
I don't even disagree with that goal, I think it's desirable that things be written in rust, it's a really good language that provides a lot of benefits. I think I've just been infected with the C virus too long. I can't even tolerate C++.
I suggest waiting till the GCC side matures, with at minimum a working GCC frontend, before making Rust a non-optional dependency. Optional dependencies built with rustc_codegen_gcc might be okay. Git is pretty core to a lot of projects, and this change is risky; it's on a fairly short time frame to make it a core dep (6 months?).
I am curious, what is the reason behind introducing Rust in Git?
I am not familiar with Git development, I am just a user. But my impression is that it is already a complete tool that won't require much new code to be written. Fixes and improvements here and there, sure, but that does not seem like a good reason to start using a new language. In contrast, I understand why adding it to e.g. Linux development makes sense, since new drivers will always need to be written.
Git is constantly gaining features, even if for the most part it seems like the core functionality is unchanged.
If you'd like to review the changelog, the Git repo has RelNotes but I've found GitHub's blog's Git category to be a more digestible resource on the matter: https://github.blog/open-source/git/
https://lore.kernel.org/git/ZZ9K1CVBKdij4tG0@tapette.crustyt... has a couple dozen replies and would be a useful place to start reading about it; beyond that, search that list for Rust. (Note, I’m only responding to the opening question, not evaluating the arguments pro/con here or on the list; in any case, someone else surely will.)
not changing working code to prevent issues is unsafe.
we can go in circles all day with blanket statements that are all true. but we have ample evidence that even if we think some real-world C code is safe, it is often not because humans are extremely bad at writing safe C.
sometimes it's worth preventing that more strongly, sometimes it's not, evidently they think that software that a truly gigantic amount of humans and machines use is an area where it's worth the cost.
A test will never catch every bug (otherwise it would be a proof), and any change has some probability of introducing a new bug, regardless of how careful you are. Thus, changing correct code will eventually result in incorrect code.
I honestly can't tell if this is meant as serious reply to my question (in that case: let's say I agree that Rust is 100% better than C; my question still stands) or as a way to mock Rust people's eagerness to rewrite everything in Rust (in that case: are you sure this is the reason behind this? They are not rewriting Git from scratch...)
Everyone on hackernews is well aware that C makes it relatively easy to create buffer overflows, and what buffer overflows are. You're still not responding to GP question.
I'm not involved in the initiative, so I can't answer the question definitively. I provided one of the major reasons that projects get switched from C. I think it's likely to be a major part of the motivation.
Right, I never mentioned that I am a decently experienced C developer, so of course I got my fair share of buffer overflows and race conditions :)
I have also learned some Rust recently, I find a nice language and quite pleasant to work with. I understand its benefits.
But still, Git is already a mature tool (one might say "finished"). Lots of bugs have been found and fixed. And if more are found, surely it will be easier to fix them in the C code than to rewrite in Rust? Unless the end goal is to rewrite the whole thing in Rust piece by piece, solving hidden memory bugs along the way.
This doesn't matter much for programs like Git. Any non-freestanding program running on a modern OS on modern hardware that tries to access memory it's not supposed to will be killed by the OS. That seems a more reasonable security boundary than relying on the language implementation not to emit code that does illegal things.
Yeah, sure, memory safety is nice for debuggability and for being more confident in the program's correctness, but it is not more than that. It is neither security nor proven correctness.
Not quite the best example, since Git usually has unrestricted file access and network access through HTTP/SSH, any kind of RCE would be disastrous if used for data exfiltration, for instance.
If you want a better example, take distributed database software: behind DMZ, and the interesting code paths require auth.
Git already runs "foreign" code e.g. in filters. The ability to write code that reacts unexpectedly on crafted user input isn't restricted to languages providing unchecked array/pointer access.
There's at least one proprietary platform that supports Git, built via a vendor-provided C compiler, but for which no public documentation exists and therefore no LLVM support is possible.
Shouldn't these platforms work on getting Rust to support it rather than have our tools limited by what they can consume? https://github.com/Rust-GCC/gccrs
A maintainer for that specific platform was more into the line of thinking that Git should bend over backwards to support them because "loss of support could have societal impact [...] Leaving debit or credit card authorizers without a supported git would be, let's say, "bad"."
To me it looks like big corps enjoying the idea of having free service so they can avoid maintaining their own stuff, and trying the "too big to fail" fiddle on open source maintainers, with little effect.
It's additionally ridiculous because git is a code management tool. Maybe they are using it for something much more wild than that (why?) but I assume this is mostly just a complaint that they can't do `git pull` from their wonky architecture that they are building on. They could literally have a network mount and externally manage the git if they still need it.
It's not like older versions of git won't work perfectly fine. Git has great backwards compatibility. And if there is a break, seems like a good opportunity for them to fork and fix the break.
And let's be perfectly clear: these are very often systems built on top of a mountain of open source software. These companies will even have custom-patched tools like gcc that they aren't willing to upstream because some manager decided they couldn't just give away the code they paid an engineer to write. I may feel bad for the situation it puts the engineers in, but I feel absolutely no remorse for the companies, because their greed put them in these situations in the first place.
Yes. It benefits them to have ubiquitous tools supported on their system. The vendors should put in the work to make that possible.
I don’t maintain any tools as popular as git or you’d know me by name, but darned if I’m going to put in more than about 2 minutes per year supporting non-Unix.
(This said as someone who was once paid to improve Ansible’s AIX support for an employer. Life’s too short to do that nonsense for free.)
As you're someone very familiar with Ansible, what are your thoughts on it in regards to IBM's imminent complete absorption of RedHat? I can't imagine Ansible, or any other RedHat product, doing well with that.
I wouldn’t say I’m very familiar. I don’t use it extensively anymore, and not at all at work. But in general, I can’t imagine a way in which IBM’s own corporate culture could contribute positively to any FOSS projects if they removed the RedHat veneer. Not saying it’s impossible, just that my imagination is more limited than the idea requires.
IBM has been, and still is, a big contributor to a bunch of Eclipse projects, as their own tools build on those.
The people there were really skilled, friendly and professional.
Different divisions and departments can have huge cultural differences and priorities, obviously, but “IBM” doesn’t automatically mean bad for OSS projects.
Yeah! There shouldn't even be portable languages with multiple implemetations. Everything should use one compiler, one run-time and one package manager.
The shitheads who insist on using alternative compilers and platforms don't deserve tools.
> Everything should use one compiler, one run-time and one package manager.
If you think that calling out closed C compilers is somehow an argument for a single toolchain for all things, I doubt there's anything I can do to help educate you about why this isn't the case. If you do understand and are choosing to purposely misinterpret what I said, there are a lot of much stronger arguments you could make to support your point than that.
Even ignoring all of that, there's a much larger point that you've kind of glossed over here by:
> The shitheads who insist on using alternative compilers and platforms don't deserve tools
There's frequent discussion around the expectations between open source project maintainers and users. In the same way that users are under no obligation to compensate the projects they use, projects don't have any obligation to provide support indefinitely for any arbitrary set of circumstances, even if they happen to for a while. Maintainers sometimes weigh the tradeoff between supporting a minority of users and making a technical change they feel will help them maintain the project long-term differently than those users would. It's totally valid to criticize those decisions on technical grounds, but it's worth recognizing that these types of choices are inevitable, and there's nothing specific about C or Rust that will change that in the long run. Even with a single programming language on a single platform, the choice of which features to implement could make or break whether a tool works for someone's specific use case. At the end of the day, there's a finite amount of effort people can spend on a given project, and someone has to decide what to spend it on.
> There's at least one proprietary platform that supports Git built by via a vendor-provided C compiler, but for which no public documentation exists and therefore no LLVM support is possible.
That's fine. The only impact is that they won't be able to use the latest and greatest release of Git.
Once those platforms work on their support for Rust they will be able to jump back to the latest and greatest.
It's sad to see people be so nonchalant about potentially killing off smaller platforms like this. As more barriers to entry are added, competition is going to decrease, and the software ecosystem is going to keep getting worse. First you need a libc, now you need libc and Rust, ...
But no doubt it's a great way for the big companies funding Rust development to undermine smaller players...
It's kind of funny to see f-ing HPE with 60k employees somehow being labeled as the poor underdog that should be supported by the open-source community for free and can't be expected to take care of software running on their premium hardware for banks etc by themselves.
I think you misread my comment because I didn't say anything like that.
In any case HPE may have 60k employees but they're still working to create a smaller platform.
It actually demonstrates the point I was making. If a company with 60k employees can't keep up then what chance do startups and smaller companies have?
HP made nearly $60b last year. They can fund the development of the tools they need for their 50 year old system that apparently powers lots of financial institutions. It's absurd to blame volunteer developers for not wanting to bend over backwards, just to ensure these institutions have the absolute latest git release, which they certainly do not need.
> It's sad to see people be so nonchalant about potentially killing off smaller platforms like this.
Your comment is needlessly dramatic. The only hypothetical impact this has is that whoever uses these platforms won't have upgrades until they do something about it, and the latest and greatest releases will only run if the companies behind these platforms invest in their maintenance.
This is not a good enough reason to prevent the whole world from benefiting from better tooling. This is not a lowest-common-denominator thing. Those platforms went out of their way to lag in interoperability, and this is the natural consequence of those decisions.
Rust has an experimental C backend of its own as part of rustc_codegen_clr https://github.com/FractalFir/rustc_codegen_clr . Would probably work better than trying to transpile C from general LLVM IR.
Given that the maintainer previously said they had tried to pay to get GCC and LLVM ported multiple times, all of which failed, money doesn’t seem to have helped.
I am curious, does anyone know what is the use case that mandates the use of git on NonStop? Do people actually commit code from this platform? Seems wild.
Seriously, I guess they just have to live without git if they're not willing to take on support for its tool chain. Nobody cares about NonStop but the very small number of people who use it... who are, by the way, very well capable of paying for it.
I strongly agree. I read some of the counterarguments, like this will make it too hard for NonStop devs to use git, and maybe make them not use it at all. Those don't resonate with me at all. So what? What value does their using git provide to the git developers? I couldn't care less whether NonStop devs can use my own software at all. And since they're exclusively at giant, well-financed corporations, they can crack open that wallet and pay someone to do the hard work if it means that much to them.
"You have to backport security fixes for your own tiny platform because your build environment doesn't support our codebase or make your build environment support our codebase" seems like a 100% reasonable stance to me
> your build environment doesn't support our codebase
If that is due to the build environment deviating from the standard, then I agree with you. However, when it's due to the codebase deviating from the standard, then why blame the build environment developers for expecting codebases to adhere to standards? That's the whole point of standards.
Is there a standard that all software must be developed in ANSI C that I missed, or something? The git developers are saying - we want to use Rust because we think it will save us development effort. NonStop people are saying we can't run this on our platform. It seems to me someone at git made the calculus: the amount that NonStop is contributing is less than what we save going to Rust. Unless NonStop has a support contract with git developers that they would be violating, it seems to me the NonStop people want to have their cake and eat it too.
They enjoy being portable and like things to stay that way so when they introduce a new toolchain dependency which will make it harder for some people to compile git, they point it out in their change log?
because the rust compiler just doesn't support some platforms (os / architecture combination)?
RESF members tend to say it the other way around as in the platform doesn't support rust, but the reality is that it's the compiler that needs to support a platform, not the other way around.
Rust can't support a platform when that platform's vendor provides only a proprietary C compiler and nothing else (no LLVM, no GCC). Perhaps someone could reverse-engineer it, but a platform with zero support from any FOSS toolchain is unlikely to get Rust support anytime soon.
Furthermore, how could it, without donations of hardware, licenses, and so forth? This is a problem entirely of the proprietary platform's making, and it should be their customers' problem for having made a poor decision.
Thanks for the specifics, really fascinating list! I'm sure I'm being a bit flippant, but it's pretty funny that a list including the Playstation 1, N64, and Apple Watches is in the same conversation as systems that need to compile git from source.
Anyone know of anything on that list with more than a thousand SWE-coded users? Presumably there's at least one or two for those in the know?
What I like about seeing a project support a long list of totally irrelevant old obscure platforms (like Free Pascal does, and probably GCC) is that it gives some hope that they will support some future obscure platform that I may care about. It shows a sign of good engineering culture. If a project supports only 64-bit arm+x86 on the three currently most popular operating systems that is a red flag for future compatibility risks.
I don't think the concern is whether a user can compile git from source on said platform, but rather whether the rust standard lib is well supported on said platform, which is required for cross compiling.
Rust doesn't run on all of their platforms so this is a good example of where git may not be viable for OpenBSD long-term (if they were to switch from CVS one day, which is a big IF)
You’re chasing after the meaning of “impossible.” Easy. There are two categories of developers:
> I like programming
> I program to make money
If you belong to the second category - I’m going to be super charitable, it sounds like I’m not going to be charitable and I am, so keep reading - such as by being paid by a giant bank to make applications on Nonstop, there might be some policy that’s like
“You have to vet all open source code that runs on the computer.”
So in order to have Rust, on Nonstop, to build git, which this guy likes, he’d need to port llvm, which isn’t impossible. What’s impossible is to get llvm code reviewed by legal, or whatever, which they’re not going to do, they’re going to say “No. No llvm. HP who makes Nonstop can do it, and it can be their legal problem.”
I’m not saying it’s impossible. The other guy is saying it’s impossible, and I’m trying to show how, in a Rube Goldberg way, it looks impossible to him.
You and I like programming, and I’m sure we’re both gainfully employed, though probably not making as much money as that guy, but he doesn’t like programming. You are allowed to mock someone’s sincerity if they’re part of a system that’s sort of nakedly about making lots of money. But if you just like programming, you’d never work for a bank, it’s really fucking boring, so basically nobody who likes programming would ever say porting Rust or whatever is impossible. Do you see?
It’s tough because, the Jane Street people and the Two Sigma people, they’re literally kids, they’re nice people, and they haven’t been there for very long, they still like programming! They feel like they need to mook for the bank, when they could just say that living in New York and having cocktails every night is fun and sincere. So this forum has the same problem as the mailing list, where it sounds like it’s about one thing - being able to use fucking hashmaps in git - and it’s really about another - bankers. Everywhere they turn, the bankers run into people who make their lifestyle possible, whether it’s the git developers who volunteer their time or the parents of the baristas at the bars they’re going to paying the baristas’ rent - and the bankers keep hating on these people. And then they go and say, well everyone is the problem but me. They don’t get it yet.
Is this a bit of chickens coming home to roost as far as developer culture forgetting how to work with cross-compiling toolchains? When I started my career, it was common understanding that the developer may be manipulating source code on a different system and/or platform than where it will be executed.
Our source control, editing, compilation, and execution was understood to happen in different computational spaces, with possible copy/staging steps in between. You were doing something very naive if you assumed you could execute the built program on the same system where the source code files existed and the editor/IDE was running.
This was a significant fraction of the build rules we used to manage. E.g. configuration steps had to understand that the target platform being measured/characterized is not the same as the platform executing the build tools. And to actually execute a built object may require remote file copies and remote program invocation.
Actually, the Rust toolchain makes cross-compiling way easier than any other fully-compiled language I've ever used. There are like 100 different platforms you can target by just setting the `--target` flag, and they all pretty much just work on any host platform.
Sounds like the real issue is that some Git developers have ancient, rigid requirements for their own development machines.
The way Zig solves this problem "better" than Rust is by claiming the target libraries as part of its distribution and building those on demand. It makes for a really excellent experience cross-building.
Rust might have a harder time if it wanted a corresponding feature because it doesn't natively build C like Zig does (using libclang). Either it would have to start using libclang or ship with rust re-implementations of the C library. AFAIK it's impossible to write the C++ library in Rust though.
That has not been my experience. I develop on Windows and need to compile for Linux. After spending several hours trying to get cross-compilation working, I gave up and do it via WSL now.
I switched from Go and I feel like Go was much better at this than Rust.
(I tried “cross” but it was very slow and I found it faster to rsync the files inside the container and then run the build scripts)
But, my point is you shouldn't even have to cross-compile Git to a platform like NonStop in order to develop NonStop apps. So the portability of Rust shouldn't even matter here. The app developer should be able to run their Git commands on a supported platform and cross-compile their own app to NonStop.
I haven't double checked, but my recollection of that story was that they were using Git as part of the operations at runtime, not (just) as a development dependency.
I suspect the majority of developers never even learnt this. Cross-compilation is almost always a second-class citizen and I never expect it to work correctly on an external project. Linux distros have given up, with Fedora even insisting on running compilation on the real target hardware for platforms like the Raspberry Pi, which is kind of insane, and as a result basically no-one puts in the effort to make it work.
> Is this a bit of chickens coming home to roost as far as developer culture forgetting how to work with cross-compiling toolchains?
I don't understand your comment. Completely ignoring Rust, the modern state of cross-compilation is an unmitigated disaster.
Linux is especially bad because glibc is a badly architected pile of garbage stuck in the 80s. It should be trivially possible to target any minimum glibc version for any possible Linux hardware environment. But glibc and Linux distros don't even attempt to make this possible. Linux toolchains make it nearly impossible not to use the default system libraries, which is the opposite of correct for cross-compiling.
Zig moves mountains to make cross-compiling possible. But almost no projects actually attempt to support cross-compile.
Does anyone with insight into Git development know if we should care about this? Is this just a proposal out of nowhere from some rando or is this an idea that a good portion of Git contributors have wanted?
You can perhaps learn more about their involvement in the community from this year’s summit panel interview: https://youtu.be/vKsOFHNSb4Q
In a brief search, they’re engineering manager for GitLab, appear to be a frequent contributor of high-difficulty patches to Git in general, and are listed as a possible mentor for new contributors.
Given the recent summit, it seems likely that this plan was discussed there; I hadn’t dug into that possibility further but you could if desired.
Looking at the comment thread, at least one person I recognize as a core maintainer seems to be acting as if this is an official plan that they've already agreed on the outline of, if not the exact timing. And they seem to acknowledge that this breaks some of the more obscure platforms out there.
Interesting! I'd certainly say that's worth something. Definitely didn't expect it though given how poorly some people have reacted to Rust being introduced as an optional part of the Linux kernel.
It's a lot more understandable for developer tooling like Git to more quickly adopt newer system requirements. Something like the Linux kernel needs to be conservative because it's part of many people's bootstrapping process.
rustc_codegen_gcc is close to becoming stable, and conversely the Linux kernel is dropping more esoteric architectures. Once the supported sets of architectures fully overlap, and once the Linux kernel no longer needs unstable (nightly-only) Rust features, it'd be more reasonable for Linux to depend on Rust for more than just optional drivers.
I would also say that it’s a lot easier to learn to write rust when you’re writing something that runs sequentially on a single core in userspace as opposed to something like the Linux kernel. Having dipped my toes in rust that seems very approachable. When you start doing async concurrency is when the learning curve becomes steep.
Those footguns still exist in C; they're just invisible bugs in your code. The Rust compiler is correct to point them out as bad architecture, even if it's annoying to keep fighting the compiler.
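A tiny, hypothetical illustration (not from the Git codebase): mutating a container while iterating over it is a classic invisible bug in C, while in Rust the direct form is a borrow-check error and the accepted form makes the mutation a single explicit pass:

```rust
fn main() {
    let mut refs = vec!["main", "dev", "tmp/1"];

    // The C-style form -- deleting entries while walking the array --
    // compiles silently in C and corrupts the iteration. The Rust
    // equivalent (mutably borrowing `refs` inside `for r in &refs`)
    // is rejected at compile time.

    // The accepted form: one explicit pass, no aliasing.
    refs.retain(|r| !r.starts_with("tmp"));
    assert_eq!(refs, ["main", "dev"]);
}
```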
And the reason this is a problem is because of the me-first attitude of language developers these days. It feels like every language nowadays feels the need to implement its own package manager. These package managers then encourage pinning dependencies, which encourages library authors to be less careful about API stability (though obviously this varies from library to library) and makes it hard on distro maintainers to make all the packages work together. It also encourages program authors to use more libraries, as we see in the Javascript world with NPM, but also in the Rust world.
Now, Rust in Git and Linux probably won't head in these directions, so Debian might actually be able to support these two in particular, but the general attitude of Rustaceans toward libraries is really off-putting to me.
IMHO the reason is that these languages are industry-funded efforts. And they are not funded to help the free software community. Step-by-step this reshapes the open-source world to serve other interests.
Semantic versioning is culturally widespread in Rust, so the problem of library authors being "less careful about API stability" rarely happens in practice. If pinned packages were the problem, I'd imagine they would have been called out as such in the Debian page linked by parent.
Rust is generally a much better tool for building software than C. When your software is built with better tools, you will most likely get better software (at least eventually / long term, sometimes a transition period can be temporarily worse or at least not better).
I'm not sure exactly what you mean but of course people are facing implementation deficiencies in Git. Last I checked submodules were still "experimental" and extremely buggy, and don't work at all with worktrees. (And yeah submodules suck but sometimes I don't have a choice.)
The developers of git will continue to be motivated to contribute to it. (This isn’t specific to Rust, but rather the technical choices of OSS probably aren’t generally putting the user at the top of the priority list.)
Well it would probably take at least 5 years to rewrite all of Git in Rust (gitoxide is 5 years old and far from finished). Then another few years to see novel features, then a year or two to actually get the release.
Btw 10 lines of code per day is a typical velocity for full time work, given it's volunteers 1 line per day might not be as crazy as you think.
While we are on Hacker News, this is still an enormously obtuse way to communicate.
Are you saying that as users of git we will be negatively affected by deps being added and build times going up? Do you have evidence of that from past projects adding rust?
What's the point of trying to introduce Rust everywhere? Git is a mature piece of software and I doubt a lot of new code needs to be written. Also, Rust is very complex relative to C. If you really need classes, templates, etc, you can stick to C++ 98 and get something that is still clean and understandable relative to recent C++ standards and Rust.
It's not a "test balloon" if you have a plan to mandate it and will be announcing that. Unless, I suppose, enough backlash will cause you to cancel the plan.
It's literally a test of how people will react, so yes, finding out if people will react negatively would be exactly the point of doing the test in the first place. Would you prefer that they don't publicize what their follow-up plans would be to try to make it harder to criticize the plans? If you're against the plan, I'm pretty sure that's the exact type of feedback they're looking for, so it would make more sense to tell them that directly if it actually affects you rather than making a passive-aggressive comment they'll likely never read on an unrelated forum.
If they’re running the project with a Linus-type approach, they won’t consider backlash to be interesting or relevant, unless it is accompanied by specific statements of impact. Generic examples for any language to explain why:
> How dare you! I’m going to boycott git!!
Self-identified as irrelevant (objector will not be using git); no reply necessary, expect a permaban.
> I don’t want to install language X to build and run git.
Most users do not build git from source. Since no case is made why this is relevant beyond personal preference, it will likely be ignored.
> Adopting language X might inhibit community participation.
This argument has almost certainly already been considered. Without a specific reason beyond the possibility, such unsupported objections will not lead to new considerations, especially if raised by someone who is not a regular contributor.
> Language X isn’t fully-featured on platform Y.
Response will depend on whether the Git project decides to support platform Y or not, whether the missing features are likely to affect Git users, etc. Since no case is provided about platform Y’s usage, it’ll be up to the Git team to investigate (or not) before deciding.
> Language X will prevent Git from being deployed on platform Z, which affects W installations based on telemetry and recent package downloads, due to incompatibility Y.
This would be guaranteed to be evaluated, but the outcome could be anywhere from “X will be dropped” to “Y will be patched” to “Z will not be supported”.
Rust suffers from the same problems that functional programming languages suffer from: a steep learning curve and high complexity. The high complexity is intended to push more runtime errors back to compile time, but boy does the language pay for it. Rust is a tire fire of complexity.
For these reasons I believe it is not a good idea. The kernel also sort of rejected Rust. The kernel is complex enough without adding a Haskell-style type system and a Lisp-level macro system capable of obfuscating what code calls what code. serde code is so hard to spelunk for this reason. Contrast this with Go's Unmarshal, which is much easier to follow.
I personally find functional programming languages, including Rust, much clearer than C or Go, in particular because you can offload much information onto the compiler. The example of Serde feels a bit weird, because I don't think I've ever encountered issues with Serde code, while almost 100% of the times I've used Go in production, I've needed to debug through Go's Unmarshal and its... interesting implementation.
Also, last time I checked, the kernel didn't reject Rust. There was a conflict between two specific developers on the best place to store some headers, which is slightly different.
I actually think Rust is pretty easy to pick up for anyone that’s written Typescript and can use their linter to understand references and unwrapping a Result and catching an error.
I was going to roll my eyes at "Rust is a tire fire of complexity". Because it's not. Especially compared to C++. But then you just go on to outright lie in your second paragraph.
Dear Rust haters, lying about Rust in the Linux kernel is not effective for your cause, and in fact just makes it further look like you're throwing a tantrum. Downvoting me doesn't change the fact that more and more Rust is merged into the kernel, new, serious drivers are being written in Rust. It also doesn't change Firefox, Chrome, Microsoft, the US Government and others are recommending and writing new code in Rust. It's over, qq. It's absurd.
I really wish I could find the Lobsters comment the other day from someone that broke down the incredible list of nuanced, spec-level detail you needed to know about C++ to actually use it at scale in large projects. It's laughably, absurdly complex compared to Rust in huge code bases.
Given that Rust only recently started working on e.g. Cygwin (and still does not build many crates: I tried to compile Jujutsu and failed), this is a big blow to portability IMHO. While I try to like Rust, I think making it mandatory for builds of essential tools like Git is really too early.
As said before, I wasn't complaining about Windows, but rather about not-so-common POSIX layers like Cygwin [0]. Most POSIX-compliant C code compiles in my experience.
Right, but Rust makes it so you don't have to use Cygwin. It's one of the great portability advantages of Rust that you can write real Windows programs with it.
I am not really sure if I can follow here. How could a Rust-compiled program like git honor my Cygwin-emulated mount points in paths, which I need when working with other POSIX-compliant software?
I want it to be cygwin native, i.e. passing calls through the cygwin posix layer and not use the windows binary. Sure I can use the windows binary, but that is a different thing.
You could read "Rust will become mandatory" as "all contributors will need to be able to code Rust" or even "all new code has to be written in Rust" or similar variations
I see. No, I understood it the way it is, as introducing it as a new hard dependency in git 3. I suppose it is a pilot for making it mandatory for contributions / incrementally replacing the existing code in the future, though.
Git is pretty modular, and it already includes multiple languages. I guess that significant parts of it will remain in C for a long time, including incremental improvements to those parts. Though it wouldn't surprise me if some parts of git did become all-Rust over time.
My last company used Jenkins, so our build infrastructure depended on Java. We used zero code outside of supporting Jenkins. So Java was required to build our stuff, but not to write or run it.
Edit: nope, I’m wrong. On reading the link, they’re setting up the build infrastructure to support Rust in the Git code itself.
Ironically, its original use was in political parlance.
From wiki it's "information sent out to the media in order to observe the reaction of an audience. It is used by companies sending out press releases to judge customer reaction, and by politicians who deliberately leak information on a policy change."
Yup I have no doubt that there's a Rust 'evangelist' group somewhere aiming for inorganic growth of the language.
One argument from the git devs is that it’s very hard to implement smarter algorithms in C, though. For example, it uses arrays in places where a higher level language would use a hash, because the C version of that is harder to write, maintain, and debug. It’s also much easier to write correct threaded code in Rust than C. Between those 2 alone, using a more robust language could make it straightforward to add performance gains that benefit everyone.
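As a sketch of the first point (the names here are invented for illustration, not Git's actual code): a frequency count that would need a hand-rolled hash table or an O(n²) array scan in C is a few lines with Rust's standard `HashMap`:

```rust
use std::collections::HashMap;

// Count how often each ref name appears. The C version of this
// typically means hand-rolling hashing, resizing, and cleanup.
fn count_refs<'a>(refs: &[&'a str]) -> HashMap<&'a str, u32> {
    let mut counts = HashMap::new();
    for r in refs {
        *counts.entry(*r).or_insert(0) += 1;
    }
    counts
}

fn main() {
    let counts = count_refs(&["main", "dev", "main"]);
    assert_eq!(counts["main"], 2);
    assert_eq!(counts["dev"], 1);
}
```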
That's a one time gain though. There's no reason for every platform to check the validity of some hash table implementation when that implementation is identical on all of them.
In my opinion, the verification of the implementation should be separate from the task of translating that implementation to bytecode. This leaves you with a simple compiler that is easy to implement but still with a strong verifier that is harder to implement, but optional.
And who’s volunteering for that verification using the existing toolchain? I don’t think that’s been overlooked just because the git devs are too dumb or lazy or unmotivated.
That came across more harshly than I meant, but I stand by the gist of it: this stuff is too hard to do in C or someone would’ve done it. It can be done, clearly, but there’s not the return on investment in this specific use case. But with better tooling, and more ergonomic languages, those are achievable goals by a larger pool of devs — if not today, because Rust isn’t as common as C yet, then soon.
As a practical example, the latest Git version can be compiled by an extremely simple (8K lines of C) C compiler[1] without modification and pass the entire test suite. Gonna miss the ability to make this claim.
In theory you should be able to use TCC to build git currently [1] [2]. If you have a lightweight system or you're building something experimental, it's a lot easier to get TCC up and running over GCC. I note that it supports arm, arm64, i386, riscv64 and x86_64.
The nature of considering the future is that our actions _now_ affect the answer _then_. If we tie our foundational tools to LLVM, then it's very unlikely a new platform can exist without support for it. If we don't tie ourselves to it, then it's more likely we can exist without it. It's not a matter of whether LLVM will be supported; we ensure that by making it impossible not to be the case. It's a self-fulfilling prophecy.
I prefer to ask another question: "Is this useful?" Would it be useful, if we were to spin up a different platform in the future, to be able to do so without LLVM? I think the answer to that is a resounding yes.
That doesn't leave Rust stranded. A _useful_ path for Rust to pursue would be to define a minimal subset of the compiler that you'd need to implement to compile all valid programs. The type checker, borrow checker, unused variable tracker, and all other safety features should be optional extensions to a core of a minimal portable compiler. This way, the Rust compiler could feasibly be as simple as the simplest C compiler while still supporting all the complicated validation on platforms with deep support.
rustc is only loosely tied to LLVM. Other code generation backends exist in various states of production-readiness. There are also two other compilers, mrustc and GCC-rs.
mrustc is a bootstrap Rust compiler that doesn't implement a borrow checker but can compile valid programs, so it's similar to your proposed subset. Rust minus verification is still a very large and complex language though, just like C++ is large and complex.
A core language that's as simple to implement as C would have to be very different and many people (I suspect most) would like it less than the Rust that exists.
This series was in response to another thread [1] which wanted to make rust mandatory in an upcoming release.
The author's proposal was to instead take the middle ground and use Rust as an optional dependency until a later point in time when it becomes mandatory.
The later point in time was chosen based on when Rust support lands in GCC, which would make things smoother, since platforms which support GCC would also be included.
[1]: https://lore.kernel.org/git/pull.1980.git.git.1752784344.git...
The GNU Compiler Collection has been hit and miss though. Nobody uses GCJ, for example. I sort of doubt that they'll be able to implement a good compiler for a language that doesn't even have a standard without that implementation going wildly out of date in the future, just like what happened with Java.
There are two different methods by which Rust support can be added to GCC: adding a Rust frontend to GCC, and adding a GCC backend to the Rust compiler (rustc_codegen_gcc). The latter approach would not be as susceptible to implementation divergence as an independent frontend.
yep, if git is content with rustc_codegen_gcc, then it's very doable they can require rust in the next few years
Since OpenJDK was released there isn't much point maintaining GCJ.
The title is a bit of a misnomer. Rust will become mandatory in the build system, not mandatory for future patches.
What does that mean? Is it mandatory for building the build system or also for building the application?
Thanks! I've added that to the title above. If it's somehow inaccurate, we can change it again.
Maybe I'm just old and moany, and I need to step aside for bigger and better things such as Rust.
But.
Now rather than needing to understand just C to work on Git/kernel, you now need to also know Rust. The toolchain complexity is increasing, and the mix of these languages increases the barrier to entry.
I'm highly invested into Git, having learned the tooling and having a significant number of projects constructed within it. I've written my own Git clients and have built a web server around Git repositories. I don't want to lose the hack-ability of Git.
> I'm just old and moany, and I need to step aside for bigger and better things such as Rust.
You are. This is firm "I don't want to have to learn new things" territory, which isn't a viable attitude in this industry.
In any case Rust is usually easier than C (excluding buggy C which is very easy to write), and certainly easier than actually learning the Git or Linux codebases.
I think it is often underappreciated by people who haven't worked in security how hard high-quality C is in practice.
We might also have different priorities. I do not care too much that Google and Apple want to lock down their smartphone spyware and sales platforms. The supply chain risks and maintenance burden imposed onto me by the Rust ecosystem are much more of a concern.
I don't know what this has to do with locking down phones, but I do appreciate not getting compromised just for cloning a repo or opening my laptop at a coffee shop.
(There is a persistent idea that the lack of memory safety in C is good because it allows people to jailbreak their phones.)
This is not what I said, but memory safety is certainly not anything which is a high priority for my own security. I still think memory safety is important and I also think Rust is an interesting language, but... the hype is exaggerated and driven by certain industry interests.
Rust isn't popular just because of memory safety though. I think the memory safety message is maybe a little too loud.
It's also a modern language with fantastic tooling, very high quality library ecosystem and a strong type system that reduces the chance of all kinds of bugs.
It's obviously not perfect: compile time is ... ok, there aren't any mature GUI toolkits (though that's true of many languages), async Rust has way too many footguns. But it's still waaaaay better than C or C++. In a different league.
While I really really want devices I can own, I don't want to compromise security to do it. We need to do two things:
1. Lobby politicians to write laws that allow us to actually own the devices we bought.
2. Stop the FUD that a device that can be jailbroken is insecure. I heard this from our frigging CSO, of all people, and it's patently false, just FUD by Apple and Google who want you to be afraid of owning your device.
I want a device that's as secure as possible, but that I can own. I don't want to hack my own self just to get what I paid for.
It is a sad thing but I do root against secure boot initiatives because they almost entirely work to limit user's freedom instead of improving their security.
Who says you do not? :)
> You are. This is firm "I don't want to have to learn new things" territory, which isn't a viable attitude in this industry.
It's viable, but limiting. Sometimes you have to do things you don't want to, which is why it's called work. But if you can choose what platforms you work on, you can orient towards those where things change less, and then you don't need to learn new things as often.
Chances are, if you get into the weeds in a lot of C programs, Rust is in your future, but it's viable to not want that, and to moan about it while doing it when you need to.
No one’s laying off COBOL programmers. Specialization has its upsides once the market isn’t saturated!
Well only because 99% of the world's COBOL developers were laid off decades ago (or switched to another language).
The more things change,
It's not "having to learn something new", but "having to be good at two things", both of which are full languages with their own specifics, problems and ways to solve them, plus two sets of compilers and some duct tape to hold them together.
It's like putting steak on a pizza... pizza is good, steak is good, pizza on a steak might be good too, but to actually do that in production, you now need two prep stations and you can't mess up either one.
The idea that "new is automatically better and everyone not keeping up is lazy" is incredibly flawed and stupid.
Get this: Introducing a secondary language into the toolchain is a tradeoff in complexity vs the new language's proposed "benefits".
It's far better to introduce an AI code tool that fixes the unsafe code rather than increasing complexity with a second language. Open source C devs, get working on it. edit: wait, don't we have static code analyzers?!
Rust is over 10 years old now. It has a track record of delivering what it promises, and a very satisfied growing userbase.
OTOH static analyzers for C have been around for longer than Rust, and we're still waiting for them to disprove Rice's theorem.
AI tools so far are famous for generating low-quality code, and generating bogus vulnerability reports. They may eventually get better and end up being used to make C code secure - see DARPA's TRACTOR program.
The applicability of Rice's theorem with respect to static analysis or abstract interpretation is more complex than you implied. First, static analysis tools are largely pattern-oriented. Pattern matching is how they sidestep undecidability. These tools have their place, but they aren't trying to be the tooling you or the parent claim. Instead, they are more useful to enforce coding style. This can be used to help with secure software development practices, but only by enforcing idiomatic style.
Bounded model checkers, on the other hand, are this tooling. They don't have to disprove Rice's theorem to work. In fact, they work directly with this theorem. They transform code into state equations that are run through an SMT solver. They are looking for logic errors, use-after-free, buffer overruns, etc. But, they also fail code for unterminated execution within the constraints of the simulation. If abstract interpretation through SMT states does not complete in a certain number of steps, then this is also considered a failure. The function or subset of the program only passes if the SMT solver can't find a satisfactory state that triggers one of these issues, through any possible input or external state.
These model checkers also provide the ability for user-defined assertions, making it possible to build and verify function contracts. This allows proof engineers to tie in proofs about higher level properties of code without having to build constructive proofs of all of this code.
Rust has its own issues. For instance, its core library is unsafe, because it has to use unsafe operations to interface with the OS, or to build containers or memory management models that simply can't be described with the borrow checker. This has led to its own CVEs. To strengthen the core library, core Rust developers have started using Kani -- a bounded model checker like those available for C or other languages.
Bounded model checking works. This tooling can be used to make either C or Rust safer. It can be used to augment proofs of theorems built in a proof assistant to extend this to implementation. The overhead of model checking is about that of unit testing, once you understand how to use it.
It is significantly less expensive to teach C developers how to model check their software using CBMC than it is to teach them Rust and then have them port code to Rust. Using CBMC properly, one can get better security guarantees than using vanilla Rust. Overall, an Ada + Spark, CBMC + C, Kani + Rust strategy coupled with constructive theory and proofs regarding overall architectural guarantees will yield equivalent safety and security. I'd trust such pairings of process and tooling -- regardless of language choice -- over any LLM derived solutions.
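To give a flavor of the Kani pairing mentioned above (the function and the property checked here are invented for illustration; the harness assumes the Kani toolchain and is run via `cargo kani`, while plain rustc just skips the cfg-gated part):

```rust
// Overflow-free midpoint; the naive (a + b) / 2 can wrap around.
fn midpoint(a: u32, b: u32) -> u32 {
    a / 2 + b / 2 + (a % 2 + b % 2) / 2
}

// Kani proof harness: the solver checks the contract for *all* u32
// pairs symbolically, not just sampled inputs. Only compiled under
// `cargo kani`; plain rustc strips it via the cfg gate.
#[cfg(kani)]
#[kani::proof]
fn midpoint_in_range() {
    let a: u32 = kani::any();
    let b: u32 = kani::any();
    let m = midpoint(a, b);
    assert!(a.min(b) <= m && m <= a.max(b));
}

fn main() {
    // Plain-Rust spot checks of the same contract.
    assert_eq!(midpoint(u32::MAX, u32::MAX), u32::MAX);
    assert_eq!(midpoint(2, 4), 3);
}
```

The same shape of harness, with `__CPROVER_assert` instead of `assert!`, is what a CBMC + C workflow looks like.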
In my experience current AI is still far from reasoning about the kind of hard-to-spot bugs in C that lead to the worst exploits. Rust solves most of these by design. It isn't about adding a second language - it is about slowly phasing out a language that is being misused in areas it shouldn't be in.
C will at some point be relegated to being an educational language, incredibly valuable due to its few but good abstractions over assembly. It will continue to exist for decades in most systems, but hopefully it won't be used outside the maintenance of legacy systems.
> I've written my own Git clients and have built a web server around Git repositories. I don't want to lose the hack-ability of Git.
And they will keep working because the repository format isn't affected by the language git is written in.
AFAIK git already uses multiple languages; GitHub says it's 50% C, 38% shell, 4% Perl, 4% Tcl, and 1% Python.
So "another language" here probably does not weigh as much, especially considering perl/TCL are the weirder one there.
But for big projects like Linux and git, this could actually be a consolidation step: you've spent decades growing, hacking things on top of each other.
You have mostly figured out what the project is and where it is going; it's time to think about safety and performance, and to remove old hacks.
Rust feels like a good fit, imho.
Perl, Tcl and Python are all implemented in C, as are many shells, so despite the interdependencies, the whole stack can still be built with just a C11 compiler.
I did check this out. The shell, Perl and Python are likely for scripting and not used at runtime. Tcl is likely some form of dynamic scripting.
I think we also have to be honest about what the project here is too: it's not to have both C and Rust together, but to replace all C with Rust. In which case, it probably makes sense to just clone the repo and work on a fork, like they did with SSH.
> The shell, perl and python are likely for scripting and not used during runtime.
Some git subcommands are implemented in these. git filter-branch is a shell script, git cvsimport is a Perl script, and git p4 (perforce interop) is a Python script. There are not too many left these days (git add -p/-i also used to call a Perl script), but they exist.
I'm sure you're aware of why, reading between the lines of what you said, but for others who aren't aware of the history of git: it was originally about 50% C and 50% Perl; the performance-critical parts were written in C, and then various git commands were written in Perl. Over time almost all the Perl was removed because there were fewer Perl monks than C devs.
Now the logic would seem to be reversed; even though there are fewer Rust devs than C devs, Rust is going to replace C. Maybe now that git is large enough and entrenched enough, such a move can be forced through.
> it was originally about 50% C and 50% Perl, the performance critical parts were written in C and then various git commands were written in Perl.
IIRC, it was mostly shell, not Perl, and looking at the proportion is misleading: the low-level commands (the "plumbing") like git-cat-file or git-commit-tree were all in C, while the more user-friendly commands (the "porcelain") like git-log or git-commit were all shell scripts calling the low-level commands. Yes, even things we consider fundamental today like "git commit" were shell scripts.
I believe gitk and git-gui are written in tcl. Those are definitely things that get shipped to the user, so (at least for those parts) you wouldn't need to have a toolchain on the build server.
A number of the git commands were implemented in perl and shell. Now I see only git-svn is perl here for me and there's still a few shell scripts in /usr/libexec/git.
> Now rather than needing to understand just C to work on Git/kernel, you now need to also know Rust.
I have yet to meet a single software engineer who isn't well versed in multiple programming languages. This is not a problem.
Agreed. And if someone is interested in contributing to the Linux kernel, a new programming language is far from the hardest thing that they need to learn...
Except now these software engineers have to code switch between languages.
Could you software engineers stop making things harder for yourselves and playing this meaningless flex of a status game, and you know, focus on something tangible, meaningful, instead of adding more bureaucracy?
I'm guessing you aren't a software engineer based on this comment, but the difference between programming languages is tangible and meaningful. It isn't like human languages, where they're mostly the same and achieve the same thing.
And code switching between languages is not hard at all.
It's hilarious that you can assume such a thing from a couple of words on the internet. Or maybe I'm not a 'software engineer' by your standards because, unlike your closed group of SWEs, I'm a lot less focused on resume padding and a lot more on keeping my codebase sane and not exploding in complexity.
I should specify: it's hard in that it's troublesome to have to code-switch and do a bunch of recall before working on the thing.
Say you've not worked in this secondary language for a long time, which absolutely happens, and you have to spend hours of effort to recall it. That's time you wouldn't otherwise have to spend; it's just how memory works.
I didn’t make the assumption but it sounded like a reasonable assumption based on the pronouns you used. You said “could you software engineers stop making things harder for yourselves.” A reasonable interpretation of this is that you aren’t a software engineer.
Reinforced softly by the rest of your comment not being technically sound. Adding a second language that is meaningfully different in its strengths and weaknesses isn’t “bureaucracy”. Bureaucracy is more like “sign a CLA before you can contribute”.
> I should specify - it's hard in that it's troublesome to have to code switch and do a bunch of recall before working on the thing.
You don't sound like you have any experience working on software projects. I can tell you it's not hard to switch between programming languages. If anything, the difficulty lies in onboarding onto projects you are not familiar with, but the programming language in use is far from a relevant factor if you are already familiar with it.
Dude you said "Could you software engineers stop..."
In normal English that means you aren't a software engineer.
What, I can't be a software engineer and also address a batch of SWEs that way? What kind of 'normal English' is that?
By the way, you have something that's actually substantial or are we going to argue irrelevant semantics all day?
Be prepared to argue semantics all day when it comes to rust community.
And be prepared to read that rust is "obviously" not hard despite there being literature stating the contrary. Including projects that migrated from rust to other languages. And projects where rust was considered and discarded.
They will write entire books worth of comments around the definition of simple. And why compiling rust isn't "ackchually" slow.
Removing Perl and adding Rust instead is probably reducing complexity rather than increasing it.
I'm in the same boat :) But no worries. You can always build and use older git without Rust. Of course, it will work for a while, until the kids change the protocol for the "better". And being old and grumpy also means you can slowly care less and less about all this moot stuff :)
Kids: now downvote it into oblivion :) Like I give a shit...
I've also sent some patches git's way and I can't say I'm thrilled about being forced to (finally) learn Rust if I want to contribute again in the future. I guess I'm outdated...
They're proposing porting over one small piece that has no dependencies and exposing it to the rest of git via a C interface. Yes, they'll presumably port more over in the future if it goes well, but it's a gross exaggeration to characterize this as somehow making it impossible to contribute without knowing Rust.
I know that it is a "slippery slope" argument, but in the future, it will become more difficult to contribute without knowing Rust. That's the entire point of introducing it.
And also, a lot of people who hate C, or who never learned it well, will be able to contribute to more and more areas of the Linux kernel.
I understand that it's a minor change in its current state. However, it is a fact that the long term goal is to port everything to rust. Once that goal is accomplished, rust will be required. So it is not at all a gross exaggeration. It's a prediction of the future.
I don't even disagree with that goal, I think it's desirable that things be written in rust, it's a really good language that provides a lot of benefits. I think I've just been infected with the C virus too long. I can't even tolerate C++.
I feel the same way about C code though. I don't think C gets the right to be the one true programming language that everyone must know forever.
I suggest waiting till the GCC side matures, with at minimum a working GCC frontend before Rust becomes a non-optional dependency. Optional dependencies via rustc_codegen_gcc might be okay. Git is pretty core to a lot of projects, and this change is risky; it's on a fairly short time frame to make Rust a core dependency (6 months?).
I am curious, what is the reason behind introducing Rust in Git?
I am not familiar with Git development, I am just a user. But my impression is that it is already a complete tool that won't require much new code to be written. Fixes and improvements here and there, sure, but that does not seem like a good reason to start using a new language. In contrast, I understand why adding it to e.g. Linux development makes sense, since new drivers will always need to be written.
Can anyone explain what I might be missing?
Git is constantly gaining features, even if for the most part it seems like the core functionality is unchanged.
If you'd like to review the changelog, the Git repo has RelNotes but I've found GitHub's blog's Git category to be a more digestible resource on the matter: https://github.blog/open-source/git/
https://lore.kernel.org/git/ZZ9K1CVBKdij4tG0@tapette.crustyt... has a couple dozen replies and would be a useful place to start reading about it; beyond that, search that list for Rust. (Note, I’m only responding the opening question, not evaluating the arguments pro/con here or on the list; in any case, someone else surely will.)
git feels complete until you use a tool like jj or git-branchless (latter of which has things like in-memory merges in rust)
C is unsafe.
https://github.com/Speykious/cve-rs
Changing well-tested code is unsafe.
not changing working code to prevent issues is unsafe.
we can go in circles all day with blanket statements that are all true. but we have ample evidence that even if we think some real-world C code is safe, it is often not because humans are extremely bad at writing safe C.
sometimes it's worth preventing that more strongly, sometimes it's not, evidently they think that software that a truly gigantic amount of humans and machines use is an area where it's worth the cost.
If the code is brittle to change, it must not have been particularly safe in the first place, right?
And if it's well-tested, maybe that condition is achieved by the use of a test suite which could verify the changes are safe too?
A test will never catch every bug (otherwise it would be a proof), and any change has some probability of introducing a new bug, regardless of how careful you are. Thus, changing correct code will eventually result in incorrect code.
I'm not sure if that's how probability works.
I mean if you want Git to never change you're free to stick with the current version forever. I'm sure that will work well.
I obviously don’t think that is wise, but Git is literally designed with this in mind: https://git-scm.com/docs/repository-version/2.39.0
Just like SQLite has an explicit compatibility guarantee through 2050. You literally do not have to update if you do not want to.
And it’s still a choice you can make regardless of Git moving to Rust or not, so what’s the problem?
I honestly can't tell if this is meant as serious reply to my question (in that case: let's say I agree that Rust is 100% better than C; my question still stands) or as a way to mock Rust people's eagerness to rewrite everything in Rust (in that case: are you sure this is the reason behind this? They are not rewriting Git from scratch...)
As a user, you may not be aware that C makes it relatively easy to create https://en.m.wikipedia.org/wiki/Buffer_overflow which are a major source of security vulnerabilities.
This is one of the best reasons to rewrite software in Rust or any other more safe by default language.
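To illustrate the contrast being claimed here (a minimal sketch, not Git code): where an out-of-bounds access in C silently reads or corrupts whatever sits past the buffer, the equivalent safe-Rust indexing is checked at runtime and either panics or returns a failure value.

```rust
use std::hint::black_box;
use std::panic;

fn main() {
    let buf = [0u8; 4];
    // black_box hides the index from the compiler so the bounds check
    // happens at runtime instead of being rejected at compile time.
    let idx = black_box(4usize);

    // Safe indexing is bounds-checked: buf[idx] panics rather than
    // reading whatever happens to sit past the end of the array.
    let result = panic::catch_unwind(|| buf[idx]);
    assert!(result.is_err());

    // The non-panicking form turns the failure into a value.
    assert_eq!(buf.get(idx), None);
    println!("out-of-bounds access was caught");
}
```

The same access pattern in C compiles and runs without complaint, which is exactly how many of the linked vulnerabilities arise.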
Everyone on Hacker News is well aware that C makes it relatively easy to create buffer overflows, and what buffer overflows are. You're still not responding to the GP's question.
I'm not involved in the initiative so I can't answer the question definitively? I provided one of the major reasons that projects get switched from C. I think it's likely to be a major part of the motivation.
I didn't know that C makes it easy.
Right, I never mentioned that I am a decently experienced C developer, so of course I got my fair share of buffer overflows and race conditions :)
I have also learned some Rust recently, I find a nice language and quite pleasant to work with. I understand its benefits.
But still, Git is already a mature tool (one might say "finished"). Lots of bugs have been found and fixed. And if more are found, surely it will be easier to fix them in the C code than to rewrite in Rust? Unless the end goal is to rewrite the whole thing in Rust piece by piece, solving hidden memory bugs along the way.
https://access.redhat.com/articles/2201201 and https://github.com/git/git/security/advisories/GHSA-4v56-3xv... are interesting examples to consider (though I'm curious whether Rust's integer overflow behavior in release builds would have definitely fared better?).
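On the parenthetical about overflow: by default Rust panics on integer overflow in debug builds and wraps in release builds, but the standard integer API lets code pick a behavior explicitly per call site, which is what an overflow-prone size calculation would ideally use (an illustrative sketch):

```rust
fn main() {
    let len: u32 = u32::MAX;

    // checked_* returns None on overflow instead of wrapping or panicking,
    // forcing the caller to handle the failure case.
    assert_eq!(len.checked_add(1), None);

    // wrapping_* makes two's-complement wraparound explicit and intentional.
    assert_eq!(len.wrapping_add(1), 0);

    // saturating_* clamps at the type's bounds.
    assert_eq!(len.saturating_add(1), u32::MAX);

    println!("overflow handled explicitly");
}
```

So whether a given CVE would have fared better depends on whether the ported code used plain `+` (which wraps in release mode) or one of these explicit forms.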
> Unless the end goal is to rewrite the whole thing in Rust piece by piece, solving hidden memory bugs along the way.
I would assume that's the case.
This doesn't matter at all for programs like Git. Any non-freestanding program running on a modern OS on modern hardware that tries to access memory it's not supposed to will be killed by the OS. That seems like a more reasonable security boundary than relying on the language implementation to simply not emit code that does illegal things.
Yeah sure, memory safety is nice for debuggability and for being more confident in the program's correctness, but it is not more than that. It is neither security nor proven correctness.
Not quite the best example, since Git usually has unrestricted file access and network access through HTTP/SSH, any kind of RCE would be disastrous if used for data exfiltration, for instance.
If you want a better example, take distributed database software: behind DMZ, and the interesting code paths require auth.
Git already runs "foreign" code e.g. in filters. The ability to write code that reacts unexpectedly on crafted user input isn't restricted to languages providing unchecked array/pointer access.
Unintentional bugs that cause data destruction would also be disastrous for a tool like git.
> Any non-free standing program running on a modern OS on modern hardware trying to access memory its not supposed to will be killed by the OS.
This seems like a rather strong statement to me. Do you mind elaborating further?
> Introducing Rust is impossible for some platforms and hard for others.
Please could someone elaborate on this.
There's at least one proprietary platform on which Git is built via a vendor-provided C compiler, but for which no public documentation exists, and therefore no LLVM support is possible.
Ctrl+F for "NonStop" in https://lwn.net/Articles/998115/
Shouldn't these platforms work on getting Rust to support it rather than have our tools limited by what they can consume? https://github.com/Rust-GCC/gccrs
A maintainer for that specific platform was more into the line of thinking that Git should bend over backwards to support them because "loss of support could have societal impact [...] Leaving debit or credit card authorizers without a supported git would be, let's say, "bad"."
To me it looks like big corps enjoying the idea of having free service so they can avoid maintaining their own stuff, and trying the "too big to fail" fiddle on open source maintainers, with little effect.
> Leaving debit or credit card authorizers without a supported git would be, let's say, "bad".
Oh no, if only these massive companies that print money could do something as unthinkable as pay for a support contract!
It's additionally ridiculous because git is a code management tool. Maybe they are using it for something much more wild than that (why?) but I assume this is mostly just a complaint that they can't do `git pull` from their wonky architecture that they are building on. They could literally have a network mount and externally manage the git if they still need it.
It's not like older versions of git won't work perfectly fine. Git has great backwards compatibility. And if there is a break, seems like a good opportunity for them to fork and fix the break.
And let's be perfectly clear: these are very often systems built on top of a mountain of open source software. These companies will even have custom-patched tools like gcc that they aren't willing to upstream, because some manager decided they couldn't just give away code they paid an engineer to write. I may feel bad for the situation it puts the engineers in, but I feel absolutely no remorse for the companies, because their greed put them in these situations in the first place.
Yes. It benefits them to have ubiquitous tools supported on their system. The vendors should put in the work to make that possible.
I don’t maintain any tools as popular as git or you’d know me by name, but darned if I’m going to put in more than about 2 minutes per year supporting non-Unix.
(This said as someone who was once paid to improve Ansible’s AIX support for an employer. Life’s too short to do that nonsense for free.)
As you're someone very familiar with Ansible, what are your thoughts on it in regards to IBM's imminent complete absorption of RedHat? I can't imagine Ansible, or any other RedHat product, doing well with that.
I wouldn’t say I’m very familiar. I don’t use it extensively anymore, and not at all at work. But in general, I can’t imagine a way in which IBM’s own corporate culture could contribute positively to any FOSS projects if they removed the RedHat veneer. Not saying it’s impossible, just that my imagination is more limited than the idea requires.
IBM has been, and still is, a big contributor to a bunch of Eclipse projects, as their own tools build on those. The people there were both really skilled, friendly and professional. Different divisions and departments can have huge cultural differences and priorities, obviously, but “IBM” doesn’t automatically mean bad for OSS projects.
I'm sure some of RedHat stuff will end up in the Apache Foundation once IBM realizes it has no interest in them.
There isn't even a Nonstop port of GCC yet. Today, Nonstop is big-endian x86-64, so tacking this onto the existing backend is going to be interesting.
That platform doesn’t support GCC either.
Isn't that what's happening? The post says they're moving forward.
why, everyone on the fucking planet should drop what they are doing and start mucking around with Rust.
At this point maybe it's time to let them solve the problem they've created for themselves by insisting on a closed C compiler in 2025.
Yeah! There shouldn't even be portable languages with multiple implementations. Everything should use one compiler, one run-time and one package manager.
The shitheads who insist on using alternative compilers and platforms don't deserve tools.
Rust will fix all that.
>> insisting on a closed C compiler in 2025.
> Everything should use one compiler, one run-time and one package manager.
If you think that calling out closed C compilers is somehow an argument for a single toolchain for all things, I doubt there's anything I can do to help educate you about why this isn't the case. If you do understand and are choosing to purposely misinterpret what I said, there are a lot of much stronger arguments you could make to support your point than that.
Even ignoring all of that, there's a much larger point that you've kind of glossed over here by:
> The shitheads who insist on using alternative compilers and platforms don't deserve tools
There's frequently discussion around the expectations between open source project maintainers and users, and in the same way that users are under no obligation to provide compensation for projects they use, projects don't have any obligation to provide support indefinitely for any arbitrary set of circumstances, even if they happen to for a while. Maintainers will sometimes weigh the tradeoff between supporting a minority of users and making a technical change they feel will help them maintain the project better in the long term differently than those users would. It's totally valid to criticize those decisions on technical grounds, but it's worth recognizing that these types of choices are inevitable, and there's nothing specific about C or Rust that will change that in the long run. Even with a single programming language on a single platform, the choice of which features to implement or not implement could make or break whether a tool works for someone's specific use case. At the end of the day, there's a finite amount of work people can spend on a given project, and there needs to be a decision about what to spend it on.
Weighted by user count for a developer tool like Git, Rust is a more portable language than the combination of C and bash currently in use.
Why should free software projects bend over backwards to support obscure proprietary platforms? Sounds absurd to me
> There's at least one proprietary platform that supports Git built by via a vendor-provided C compiler, but for which no public documentation exists and therefore no LLVM support is possible.
That's fine. The only impact is that they won't be able to use the latest and greatest release of Git.
Once those platforms work on their support for Rust they will be able to jump back to the latest and greatest.
It's sad to see people be so nonchalant about potentially killing off smaller platforms like this. As more barriers to entry are added, competition is going to decrease, and the software ecosystem is going to keep getting worse. First you need a libc, now you need a libc and Rust, ...
But no doubt it's a great way for the big companies funding Rust development to undermine smaller players...
It's kind of funny to see f-ing HPE with 60k employees somehow being labeled as the poor underdog that should be supported by the open-source community for free and can't be expected to take care of software running on their premium hardware for banks etc by themselves.
I think you misread my comment because I didn't say anything like that.
In any case HPE may have 60k employees but they're still working to create a smaller platform.
It actually demonstrates the point I was making. If a company with 60k employees can't keep up then what chance do startups and smaller companies have?
> If a company with 60k employees can't keep up then what chance do startups and smaller companies have?
They build on open source infrastructure like LLVM, which a smaller company will probably be doing anyway.
Sure, but let's not pretend that doesn't kill diversity and entrench a few big players.
The alternative is killing diversity of programming languages, so it's hard to win either way.
HP made nearly $60b last year. They can fund the development of the tools they need for their 50 year old system that apparently powers lots of financial institutions. It's absurd to blame volunteer developers for not wanting to bend over backwards, just to ensure these institutions have the absolute latest git release, which they certainly do not need.
> It's sad to see people be so nonchalant about potentially killing off smaller platforms like this.
Your comment is needlessly dramatic. The only hypothetical impact this has is that whoever uses these platforms won't have upgrades until they do something about it, and the latest and greatest releases will only run if the companies behind these platforms invests in their maintenance.
This is not a good enough reason to prevent the whole world from benefiting from better tooling. This is not a lowest-common-denominator thing. Those platforms went out of their way to lag in interoperability, and this is the natural consequence of those decisions.
Maybe they can resurrect the C backend for LLVM and run that through their proprietary compilers?
It's probably not straightforward but the users of NonStop hardware have a lot of money so I'm sure they could find a way.
Rust has an experimental C backend of its own as part of rustc_codegen_clr https://github.com/FractalFir/rustc_codegen_clr . Would probably work better than trying to transpile C from general LLVM IR.
Some people have demonstrated portability using the WASM target, translating that to C89 via w2c2, and then compiling _that_ for the final target.
Given that the maintainer previously said they had tried to pay to get GCC and LLVM ported multiple times, all of which failed, money doesn’t seem to have helped.
I mean at one point I had LLVM targeting Xbox 360, PS3, and Wii so I'm sure it's possible, it just needs some imagination and elbow grease :)
Surely the question is how much they tried to pay? Clearly the answer is "not enough".
Won't someome think of the financial sector
Reminds me of a conversation about TLS and how a certain bank wanted to insert a backdoor into all of TLS for their convenience.
I am curious, does anyone know what is the use case that mandates the use of git on NonStop? Do people actually commit code from this platform? Seems wild.
Sucks to be that platform?
Seriously, I guess they just have to live without git if they're not willing to take on support for its tool chain. Nobody cares about NonStop but the very small number of people who use it... who are, by the way, very well capable of paying for it.
I strongly agree. I read some of the counterarguments, like that this will make it too hard for NonStop devs to use git, and maybe make them not use it at all. Those don't resonate with me at all. So what? What value does their using git provide to the git developers? I couldn't care less whether NonStop devs can use my own software at all. And since they're exclusively at giant, well-financed corporations, they can crack open that wallet and pay someone to do the hard work if it means that much to them.
"You have to backport security fixes for your own tiny platform because your build environment doesn't support our codebase or make your build environment support our codebase" seems like a 100% reasonable stance to me
> your build environment doesn't support our codebase
If that is due to the build environment deviating from the standard, then I agree with you. However, when it's due to the codebase deviating from the standard, why blame the build environment developers for expecting codebases to adhere to standards? That's the whole point of standards.
Is there a standard that all software must be developed in ANSI C that I missed, or something? The git developers are saying - we want to use Rust because we think it will save us development effort. NonStop people are saying we can't run this on our platform. It seems to me someone at git made the calculus: the amount that NonStop is contributing is less than what we save going to Rust. Unless NonStop has a support contract with git developers that they would be violating, it seems to me the NonStop people want to have their cake and eat it too.
According to git docs they seem to try to make a best effort to stick to POSIX but without any strong guarantees, which this change seems to be entirely in line with: https://github.com/git/git/blob/master/Documentation/CodingG...
Had you been under the impression that any of these niche platforms conform to any common standard other than their own?
Because they don’t. For instance, if they were fully POSIX compliant, they’d probably already have LLVM.
I’m sold.
Nonstop is still supported? :o
How is this git's concern?
They enjoy being portable and like things to stay that way, so when they introduce a new toolchain dependency that will make it harder for some people to compile git, they point it out in their changelog?
Git's main concern should, of course, be getting Rust in, in some shape or form.
because the Rust compiler just doesn't support some platforms (OS/architecture combinations)?
RESF members tend to say it the other way around, as in "the platform doesn't support Rust", but the reality is that it's the compiler that needs to support a platform, not the other way around.
Rust can't support a platform when that platform's vendors just provide a proprietary C compiler and nothing else (no LLVM, no GCC). Perhaps someone could reverse-engineer it, but ultimately a platform with zero support from any FOSS toolchain is unlikely to get Rust support anytime soon.
Furthermore, how could it, without the donation of hardware, licenses and so forth? This is a problem entirely of the proprietary platform's making, and it should be its customers' problem for having made a poor decision.
Reverse that: "C can't support a platform when that platform's vendors just provide a proprietary Rust compiler and nothing else".
Seems to me that that is equally true and doesn't remove any validity from the argument.
My understanding: As Rust is built on LLVM and not GCC, it is also limited to operating systems supporting LLVM.
GCC simply supports more platforms.
Rust has a GCC backend as well, rustc_codegen_gcc. However, the NonStop platform just has a proprietary C compiler.
MSVC is also proprietary. However LLVM is supported by Microsoft. The developer of Nonstop is apparently not doing that.
See this page [1], particularly the 'Tier 3' platforms.
[1] https://doc.rust-lang.org/beta/rustc/platform-support.html
More precise link
https://doc.rust-lang.org/beta/rustc/platform-support.html#t...
Thanks for the specifics, really fascinating list! I'm sure I'm being a bit flippant, but it's pretty funny that a list including the Playstation 1, N64, and Apple Watches is in the same conversation as systems that need to compile git from source.
Anyone know of anything on that list with more than a thousand SWE-coded users? Presumably there's at least one or two for those in the know?
What I like about seeing a project support a long list of totally irrelevant old obscure platforms (like Free Pascal does, and probably GCC) is that it gives some hope that they will support some future obscure platform that I may care about. It shows a sign of good engineering culture. If a project supports only 64-bit arm+x86 on the three currently most popular operating systems that is a red flag for future compatibility risks.
I don't think the concern is whether a user can compile git from source on said platform, but rather whether the rust standard lib is well supported on said platform, which is required for cross compiling.
In practice, the only systems any significant number of people care about running Git on are arm64 and x86-64, and those are very well supported.
Rust doesn't support as many CPU architectures as C does (SH4 for example, though there's likely many more better examples.)
This might make a much more interesting case for GOT than before https://www.gameoftrees.org/
got is a waste of time, imo.
they could just port the multiprocess pledge stuff to git (and benefit linux too with namespaces)
then all the user-facing changes (i.e. working on a bare git repo instead of a working copy) are things I've been doing for the last decade with a couple of lines in my gitconfig file.
Rust doesn't run on all of their platforms so this is a good example of where git may not be viable for OpenBSD long-term (if they were to switch from CVS one day, which is a big IF)
You’re chasing after the meaning of “impossible.” Easy. There are two categories of developers:
> I like programming
> I program to make money
If you belong to the second category - I’m going to be super charitable, it sounds like I’m not going to be charitable and I am, so keep reading - such as by being paid by a giant bank to make applications on Nonstop, there might be some policy that’s like
“You have to vet all open source code that runs on the computer.”
So in order to have Rust, on Nonstop, to build git, which this guy likes, he’d need to port llvm, which isn’t impossible. What’s impossible is to get llvm code reviewed by legal, or whatever, which they’re not going to do, they’re going to say “No. No llvm. HP who makes Nonstop can do it, and it can be their legal problem.”
I’m not saying it’s impossible. The other guy is saying it’s impossible, and I’m trying to show how, in a Rube Goldberg way, it looks impossible to him.
You and I like programming, and I’m sure we’re both gainfully employed, though probably not making as much money as that guy, but he doesn’t like programming. You are allowed to mock someone’s sincerity if they’re part of a system that’s sort of nakedly about making lots of money. But if you just like programming, you’d never work for a bank, it’s really fucking boring, so basically nobody who likes programming would ever say porting Rust or whatever is impossible. Do you see?
It’s tough because, the Jane Street people and the Two Sigma people, they’re literally kids, they’re nice people, and they haven’t been there for very long, they still like programming! They feel like they need to mook for the bank, when they could just say that living in New York and having cocktails every night is fun and sincere. So this forum has the same problem as the mailing list, where it sounds like it’s about one thing - being able to use fucking hashmaps in git - and it’s really about another - bankers. Everywhere they turn, the bankers run into people who make their lifestyle possible, whether it’s the git developers who volunteer their time or the parents of the baristas at the bars they’re going to paying the baristas’ rent - and the bankers keep hating on these people. And then they go and say, well everyone is the problem but me. They don’t get it yet.
What are you on about?
Is this a bit of chickens coming home to roost as far as developer culture forgetting how to work with cross-compiling toolchains? When I started my career, it was common understanding that the developer may be manipulating sourcecode on a different system and/or platform than where it will be executed.
Our source control, editing, compilation, and execution was understood to happen in different computational spaces, with possible copy/staging steps in between. You were doing something very naive if you assumed you could execute the built program on the same system where the sourcecode files existed and the editor/IDE was running.
This was a significant fraction of the build rules we used to manage. E.g. configuration steps had to understand that the target platform being measured/characterized is not the same as the platform executing the build tools. And to actually execute a built object may require remote file copies and remote program invocation.
Actually, the Rust toolchain makes cross-compiling way easier than any other fully-compiled language I've ever used. There are like 100 different platforms you can target by just setting the `--target` flag, and they all pretty much just work on any host platform.
Sounds like the real issue is that some Git developers have ancient, rigid requirements for their own development machines.
> Actually, the Rust toolchain makes cross-compiling way easier than any other fully-compiled language I've ever used
Zig takes the crown on that one, to the point that some people use Zig to cross-compile Go projects with CGo dependencies.
The way Zig solves this problem "better" than Rust is by shipping the target libraries as part of its distribution and building them on demand. It makes for a really excellent experience cross-building.
Rust might have a harder time if it wanted a corresponding feature because it doesn't natively build C like Zig does (using libclang). Either it would have to start using libclang or ship with rust re-implementations of the C library. AFAIK it's impossible to write the C++ library in Rust though.
That has not been my experience. I develop on Windows and need to compile for Linux. After spending several hours trying to get cross-compilation working, I gave up and do it via WSL now.
I switched from Go and I feel like Go was much better at this than Rust.
(I tried “cross” but it was very slow and I found it faster to rsync the files inside the container and then run the build scripts)
Others have said Rust does not support NonStop.
But, my point is you shouldn't even have to cross-compile Git to a platform like NonStop in order to develop NonStop apps. So the portability of Rust shouldn't even matter here. The app developer should be able to run their Git commands on a supported platform and cross-compile their own app to NonStop.
I haven't double checked, but my recollection of that story was that they were using Git as part of the operations at runtime, not (just) as a development dependency.
A good example of it is how easy it is to do WASM from rust. WASM is even one of the harder platforms to target with rust.
I suspect the majority of developers never even learned this. Cross-compilation is almost always a second-class citizen and I never expect it to work correctly on an external project. Linux distros have given up, with Fedora even insisting on running compilation on the real target hardware for platforms like the Raspberry Pi, which is kind of insane, and as a result basically no-one puts in the effort to make it work.
> Is this a bit of chickens coming home to roost as far as developer culture forgetting how to work with cross-compiling toolchains?
I don't understand your comment. Completely ignoring Rust, the modern state of cross-compilation is an unmitigated disaster.
Linux is especially bad because glibc is a badly architected pile of garbage stuck in the 80s. It should be trivially possible to target any minimum glibc version for any possible Linux hardware environment. But glibc and Linux distros don't even attempt to make this possible. Linux toolchains make it nearly impossible not to use the default system libraries, which is the opposite of correct for cross-compiling.
Zig moves mountains to make cross-compiling possible. But almost no projects actually attempt to support cross-compile.
I'm wondering what's on the horizon with git 3.0?
From my (very limited) perspective, I just kind of thought git had settled in to 2.x and there wasn't any reason to break compatibility.
See https://git-scm.com/docs/BreakingChanges#_git_3_0
SHA-256 will become the default hash.
Does anyone with insight into Git development know if we should care about this? Is this just a proposal out of nowhere from some rando or is this an idea that a good portion of Git contributors have wanted?
You can perhaps learn more about their involvement in the community from this year’s summit panel interview: https://youtu.be/vKsOFHNSb4Q
In a brief search, they’re engineering manager for GitLab, appear to be a frequent contributor of high-difficulty patches to Git in general, and are listed as a possible mentor for new contributors.
Given the recent summit, it seems likely that this plan was discussed there; I hadn’t dug into that possibility further but you could if desired.
For whatever it might be worth...
Looking at the comment thread, at least one person I recognize as a core maintainer seems to be acting as if this is an official plan that they've already agreed on the outline of, if not the exact timing. And they seem to acknowledge that this breaks some of the more obscure platforms out there.
Interesting! I'd certainly say that's worth something. Definitely didn't expect it though given how poorly some people have reacted to Rust being introduced as an optional part of the Linux kernel.
It's a lot more understandable for developer tooling like Git to more quickly adopt newer system requirements. Something like the Linux kernel needs to be conservative because it's part of many people's bootstrapping process.
rustc_codegen_gcc is close to becoming stable, and conversely the Linux kernel is dropping more esoteric architectures. Once the supported sets of architectures fully overlap, and once the Linux kernel no longer needs unstable (nightly-only) Rust features, it'd be more reasonable for Linux to depend on Rust for more than just optional drivers.
I would also say that it’s a lot easier to learn to write rust when you’re writing something that runs sequentially on a single core in userspace as opposed to something like the Linux kernel. Having dipped my toes in rust that seems very approachable. When you start doing async concurrency is when the learning curve becomes steep.
Those footguns still exist in C, they’re just invisible bugs in your code. The Rust compiler is correct to point them out as bad architecture, even if it’s annoying to keep fighting the compiler.
I've found that when you're doing concurrency, Rust makes things easier, and it becomes simpler to get right.
However, adapting the conventions and mechanisms of a large complex C system like the Linux kernel to Rust is taking time.
How does this help me as a user of git?
By not getting timely security updates:
https://www.debian.org/releases/trixie/release-notes/issues....
And the reason this is a problem is because of the me-first attitude of language developers these days. It seems like every language nowadays feels the need to implement its own package manager. These package managers then encourage pinning dependencies, which encourages library authors to be less careful about API stability (though obviously this varies from library to library) and makes it hard on distro maintainers to make all the packages work together. It also encourages program authors to use more libraries, as we see in the Javascript world with NPM, but also in the Rust world.
Now, Rust in Git and Linux probably won't head in these directions, so Debian might actually be able to support these two in particular, but the general attitude of Rustaceans toward libraries is really off-putting to me.
IMHO the reason is that these languages are industry-funded efforts. And they are not funded to help the free software community. Step-by-step this reshapes the open-source world to serve other interests.
Semantic versioning is culturally widespread in Rust, so the problem of library authors being "less careful about API stability" rarely happens in practice. If pinned packages were the problem, I'd imagine they would have been called out as such in the Debian page linked by parent.
Rust is generally a much better tool for building software than C. When your software is built with better tools, you will most likely get better software (at least eventually / long term, sometimes a transition period can be temporarily worse or at least not better).
That would be a stronger argument if people were facing implementation deficiencies in git
I'm not sure exactly what you mean but of course people are facing implementation deficiencies in Git. Last I checked submodules were still "experimental" and extremely buggy, and don't work at all with worktrees. (And yeah submodules suck but sometimes I don't have a choice.)
The developers of git will continue to be motivated to contribute to it. (This isn’t specific to Rust, but rather the technical choices of OSS probably aren’t generally putting the user at the top of the priority list.)
I am pretty sure that developers motivated to contribute code benefits end users plenty.
In future it might be more reliable and faster, maybe with more features.
But we probably won't see any effect for 10 years or so.
Except there are far fewer Rust developers than C developers, so contributions will start to drop as Rust usage expands in git.
10 years? are they going to contribute 1 line of a code a day or something?
Well it would probably take at least 5 years to rewrite all of Git in Rust (gitoxide is 5 years old and far from finished). Then another few years to see novel features, then a year or two to actually get the release.
Btw 10 lines of code per day is a typical velocity for full time work, given it's volunteers 1 line per day might not be as crazy as you think.
My guess is that we (I am also a user of git) won't even notice.
I will leave this here for the future:
I did not measure but it does not take long on my old hardware to compile git from scratch either, for now.
Ok, I'll bite.
While we are on Hacker News, this is still an enormously obtuse way to communicate.
Are you saying that as users of git we will be negatively affected by deps being added and build times going up? Do you have evidence of that from past projects adding rust?
Why not just say that??
Git is already an uncomfortably large binary for embedded applications. Rust binaries tend to be even more bloated.
Why would you want to run a VCS in an embedded application? Any halfway usable development platform (even VIM) will be much bigger anyways.
It is sure convenient to be able to use git (and vim!) on embedded Linux. You can get by without them of course...
No need to bite. :P
We will see!
It doesn't, it hurts you by limiting the number of platforms Git is available on.
If it works on mac & linux I've got nothing to worry about
What's the point of trying to introduce Rust everywhere? Git is a mature piece of software and I doubt a lot of new code needs to be written. Also, Rust is very complex relative to C. If you really need classes, templates, etc, you can stick to C++ 98 and get something that is still clean and understandable relative to recent C++ standards and Rust.
It's not a "test balloon" if you have a plan to mandate it and will be announcing that. Unless, I suppose, enough backlash will cause you to cancel the plan.
It's literally a test of how people will react, so yes, finding out if people will react negatively would be exactly the point of doing the test in the first place. Would you prefer that they don't publicize what their follow-up plans would be to try to make it harder to criticize the plans? If you're against the plan, I'm pretty sure that's the exact type of feedback they're looking for, so it would make more sense to tell them that directly if it actually affects you rather than making a passive-aggressive comment they'll likely never read on an unrelated forum.
If they’re running the project with a Linus-type approach, they won’t consider backlash to be interesting or relevant, unless it is accompanied by specific statements of impact. Generic examples for any language to explain why:
> How dare you! I’m going to boycott git!!
Self-identified as irrelevant (objector will not be using git); no reply necessary, expect a permaban.
> I don’t want to install language X to build and run git.
Most users do not build git from source. Since no case is made why this is relevant beyond personal preference, it will likely be ignored.
> Adopting language X might inhibit community participation.
This argument has almost certainly already been considered. Without a specific reason beyond the possibility, such unsupported objections will not lead to new considerations, especially if raised by someone who is not a regular contributor.
> Language X isn’t fully-featured on platform Y.
Response will depend on whether the Git project decides to support platform Y or not, whether the missing features are likely to affect Git users, etc. Since no case is provided about platform Y’s usage, it’ll be up to the Git team to investigate (or not) before deciding.
> Language X will prevent Git from being deployed on platform Z, which affects W installations based on telemetry and recent package downloads, due to incompatibility Y.
This would be guaranteed to be evaluated, but the outcome could be anywhere from “X will be dropped” to “Y will be patched” to “Z will not be supported”.
Rust suffers from the same problems that functional programming languages suffer from: a steep learning curve and high complexity. The high complexity is intended to push more runtime errors back to compile time, but boy does the language pay for it. Rust is a tire fire of complexity.
For these reasons I believe it is not a good idea. The kernel also sort of rejected Rust. The kernel is complex enough without adding a Haskell type system and a lisp-level macro system capable of obfuscating what code calls what code. serde code is so hard to spelunk for this reason. Contrast this with Go's Unmarshal, which is much easier to follow.
That's... an interesting point of view.
I personally find functional programming languages, including Rust, much clearer than C or Go, in particular because you can offload much information onto the compiler. The example of Serde feels a bit weird, because I don't think I've ever encountered issues with Serde code, while almost 100% of the times I've used Go in production, I've needed to debug through Go's Unmarshal and its... interesting implementation.
Also, last time I checked, the kernel didn't reject Rust. There was a conflict between two specific developers on the best place to store some headers, which is slightly different.
> Rust is a tire fire of complexity.
And C isn't?
> The high complexity is intended to push more runtime errors back to compile time
I would almost say that the ergonomics enabling this are as important as the borrow checker!
I actually think Rust is pretty easy to pick up for anyone that’s written Typescript and can use their linter to understand references and unwrapping a Result and catching an error.
Beyond that, Rust has pretty forgiving syntax.
No Linux did not reject Rust from the kernel.
I was going to roll my eyes at "Rust is a tire fire of complexity". Because it's not. Especially compared to C++. But then you just go on to outright lie in your second paragraph.
Dear Rust haters, lying about Rust in the Linux kernel is not effective for your cause, and in fact just makes it further look like you're throwing a tantrum. Downvoting me doesn't change the fact that more and more Rust is merged into the kernel, and new, serious drivers are being written in Rust. It also doesn't change the fact that Firefox, Chrome, Microsoft, the US Government and others are recommending and writing new code in Rust. It's over, qq. It's absurd.
I really wish I could find the Lobsters comment the other day from someone that broke down the incredible list of nuanced, spec-level detail you needed to know about C++ to actually use it at scale in large projects. It's laughably, absurdly complex compared to Rust in huge code bases.
Given that Rust only recently started working on e.g. Cygwin (and still does not build many crates: I tried to compile Jujutsu and failed), this is a big blow to portability IMHO. While I try to like Rust, I think making it mandatory for builds of essential tools like git is really too early.
?? I build Jujutsu and many other Rust programs from source on Windows.
Rust has a much better Windows story than C and bash do, due to its heritage as a language built by Mozilla for Firefox.
As said before, I wasn't complaining about Windows, but rather about less common POSIX layers like Cygwin [0]. Most POSIX-compliant C code compiles there in my experience.
[0] https://github.com/rust-lang/rust/issues/137819
Right, but Rust makes it so you don't have to use Cygwin. It's one of the great portability advantages of Rust that you can write real Windows programs with it.
I am not really sure I can follow here. How could a Rust-compiled program like git honor my Cygwin-emulated mount points in paths, which I need when working with other POSIX-compliant software?
Git works only on cygwin too?
No, it doesn't. OP meant that the Rust support on Cygwin is bad; it is better with the native Windows API.
I don't quite understand. Why use a janky, lossy Linux emulation layer when you can just target Windows natively?
Cygwin is an ugly hack anyway.
jj has MSVC builds and is still a tier 1 target; maybe it's something particular about your configuration?
I want it to be cygwin native, i.e. passing calls through the cygwin posix layer and not use the windows binary. Sure I can use the windows binary, but that is a different thing.
Curious what this means for libgit2.
Ideally upstream git would become better as a library as part of being rewritten in Rust.
As long as binary sizes don't explode...
"Announce that Git 3.0 will make Rust a mandatory part of our build infrastructure."
Sounds like it will be mandatory to use Rust to build all of Git. The title implies Rust itself will be mandatory.
how is that not the same thing?
You could read "Rust will become mandatory" as "all contributors will need to be able to code Rust" or even "all new code has to be written in Rust" or similar variations
One phrasing implies contributions will have to be in Rust, the other doesn’t.
I was confused in the same way after reading the submission title. Mandating Rust would be a far more radical change.
I see. No, I understood it the way it is, as introducing it as a new hard dependency in git 3. I suppose it is a pilot for making it mandatory for contributions / incrementally replacing the existing code in the future, though.
Git is pretty modular, and it already includes multiple languages. I guess that significant parts of it will remain in C for a long time, including incremental improvements to those parts. Though it wouldn't surprise me if some parts of git did become all-Rust over time.
My last company used Jenkins, so our build infrastructure depended on Java. We used zero code outside of supporting Jenkins. So Java was required to build our stuff, but not to write or run it.
Edit: nope, I’m wrong. On reading the link, they’re setting up the build infrastructure to support Rust in the Git code itself.
Maybe rewrite or create a new SCM called `grit`, etc
Feel like there’s a ton of interesting things ahead for SCM — want to see more of those proposals.
For example…had to build my own tool to extend git blame and track the AI generated code in our repository and save prompts:
https://github.com/acunniffe/git-ai
what's a 'test balloon'?
Ironically, its original use was in political parlance.
From wiki it's "information sent out to the media in order to observe the reaction of an audience. It is used by companies sending out press releases to judge customer reaction, and by politicians who deliberately leak information on a policy change."
Yup I have no doubt that there's a Rust 'evangelist' group somewhere aiming for inorganic growth of the language.
> Yup I have no doubt that there's a Rust 'evangelist' group somewhere aiming for inorganic growth of the language.
So anything using Rust now must be the ‘evangelists’ work right?
A viral commit
mandatorty: best new word of 2025
mandatorty (adj.): Simultaneously required and a civil offense.
I need you to fill out this TPS report. Unfortunately it's mandatorty to fudge section 15A.
It seems unwise, to me, to tie the life of a project as fundamental, and conceptually simple, as git to a compiler and runtime as complicated as rust.
The beauty of the unsafety of C is partially that it's pretty easy to spin up a compiler on a new platform. The same cannot be said of Rust.
One argument from the git devs is that it’s very hard to implement smarter algorithms in C, though. For example, it uses arrays in places where a higher level language would use a hash, because the C version of that is harder to write, maintain, and debug. It’s also much easier to write correct threaded code in Rust than C. Between those 2 alone, using a more robust language could make it straightforward to add performance gains that benefit everyone.
That's a one time gain though. There's no reason for every platform to check the validity of some hash table implementation when that implementation is identical on all of them.
In my opinion, the verification of the implementation should be separate from the task of translating that implementation to bytecode. This leaves you with a simple compiler that is easy to implement but still with a strong verifier that is harder to implement, but optional.
C is 50 years old or something like that, and it still doesn't have a standard hash map.
Sure its not impossible for C to get that, but at the same time, they are trying to write git not fix C.
* My point is that hash maps and data structures like that are clearly not the priority of C, or they would **exist by now.
** By exist I mean either in the C standard, or at least a community consensus about which one you pick, unless you need something specific.
And who’s volunteering for that verification using the existing toolchain? I don’t think that’s been overlooked just because the git devs are too dumb or lazy or unmotivated.
> just because the git devs are too dumb or lazy or unmotivated.
That's a very unkind assumption of my argument.
I ask that you read https://news.ycombinator.com/item?id=45314707 to hopefully better understand my actual argument. It doesn't involve calling anybody stupid or lazy.
That came across more harshly than I meant, but I stand by the gist of it: this stuff is too hard to do in C or someone would’ve done it. It can be done, clearly, but there’s not the return on investment in this specific use case. But with better tooling, and more ergonomic languages, those are achievable goals by a larger pool of devs — if not today, because Rust isn’t as common as C yet, then soon.
As a practical example, the latest Git version can be compiled by an extremely simple (8K lines of C) C compiler[1] without modification and pass the entire test suite. Gonna miss the ability to make this claim.
[1] https://github.com/fuhsnn/widcc
Do you think any new, Git-relevant platform is going to gain C compiler support via anything other than Clang/LLVM?
In theory you should be able to use TCC to build git currently [1] [2]. If you have a lightweight system or you're building something experimental, it's a lot easier to get TCC up and running over GCC. I note that it supports arm, arm64, i386, riscv64 and x86_64.
[1] https://bellard.org/tcc/
[2] https://github.com/TinyCC/tinycc
> I note that it supports arm, arm64, i386, riscv64 and x86_64.
But like, so does LLVM.
Code doesn't need to "gain C compiler support", that's the point of having a language standard.
Someone has to write the platform-specific backend. A language standard doesn't help you if nothing implements it for your new platform.
Which Rust still does not have. If serious projects like Git and Linux are adopting Rust, the Rust team might want to consider writing a spec.
The nature of considering the future is that our actions _now_ affect the answer _then_. If we tie our foundational tools to LLVM, then it's very unlikely a new platform can exist without support for it. If we don't tie ourselves to it, then it's more likely we can exist without it. It's not a matter of whether LLVM will be supported; we ensure that by making it impossible not to be the case. It's a self-fulfilling prophecy.
I prefer to ask another question: "Is this useful?" Would it be useful, if we were to spin up a different platform in the future, to be able to do so without LLVM? I think the answer to that is a resounding yes.
That doesn't leave Rust stranded. A _useful_ path for Rust to pursue would be to define a minimal subset of the compiler that you'd need to implement to compile all valid programs. The type checker, borrow checker, unused variable tracker, and all other safety features should be optional extensions to a core of a minimal portable compiler. This way, the Rust compiler could feasibly be as simple as the simplest C compiler while still supporting all the complicated validation on platforms with deep support.
rustc is only loosely tied to LLVM. Other code generation backends exist in various states of production-readiness. There are also two other compilers, mrustc and GCC-rs.
mrustc is a bootstrap Rust compiler that doesn't implement a borrow checker but can compile valid programs, so it's similar to your proposed subset. Rust minus verification is still a very large and complex language though, just like C++ is large and complex.
A core language that's as simple to implement as C would have to be very different and many people (I suspect most) would like it less than the Rust that exists.
bruh what is that goofy ass captcha protection??
Will they introduce Ada and announce that it will become mandatory