For years, the best argument for centralizing on GitHub was that it was where the developers were. It's where pull requests could be managed quickly and easily between developers and teams that otherwise weren't related, and getting random PRs from the community had very little friction. Most of the other features were `git`-specific (branches, merges, post-commit hooks, etc.), but pull requests, code review, and CI actions were very much GitHub-specific.
However, with Copilot et al. being pushed ever harder through GitHub (and the now-reverted Actions pricing changes), having so much code in one place might not be enough of a benefit anymore. There is nothing about Git repositories that inherently requires GitHub, so it will be interesting to see how Gentoo fares.
I don't know if it's a one-off or not. Gentoo has always been happy to do their own thing, so it might just be them, but it's a trend I'm hearing talked about more frequently.
I'm watching this pretty closely. I've been mirroring my GitHub repos to my own Forgejo instance for a few weeks, but am waiting for more federation support before I reverse the mirrors.
I'll also plug this tool for configuring mirrors: https://github.com/PatNei/GITHUB2FORGEJO
Note that Forgejo's API has a bug right now and you need to manually re-configure the mirror credentials for the mirrors to continue to receive updates.
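For anyone setting up the same thing by hand, a pull mirror can be created through the Forgejo migrate endpoint (inherited from Gitea); a rough sketch, where the hostnames, repo names, and tokens are placeholders rather than anything from the tool above:

```sh
# Rough sketch: create a pull mirror of a GitHub repo on a Forgejo instance
# via the migrate endpoint. All names and tokens below are placeholders.
curl -X POST "https://forgejo.example.org/api/v1/repos/migrate" \
  -H "Authorization: token $FORGEJO_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "clone_addr": "https://github.com/example-user/example-repo.git",
        "repo_name": "example-repo",
        "mirror": true,
        "mirror_interval": "8h0m0s",
        "auth_token": "'"$GITHUB_TOKEN"'"
      }'
```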
I wonder if federation will also bring more diversity into the actual process. Maybe there will be hosts that let you use that Phabricator model.
I also wonder how this all gets paid for. Does it take pockets as deep as Microsoft's to keep npm/GitHub afloat? Will there be a free, open-source commons on other forges?
You can push any ref, not necessarily HEAD. So as long as you send the commits in order from a rebase on main it should be OK, unless I got something wrong from the docs?
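For example (the SHA and branch name are just placeholders):

```sh
# Push a specific commit, not HEAD, creating or updating the remote branch
git push origin a1b2c3d4:refs/heads/my-feature
```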
Can't you branch off from their head and cherry-pick your commits?
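Something like this, I'd think (the remote, branch, and range names are placeholders):

```sh
# Start from their current head and replay your commits on top of it
git fetch upstream
git switch -c rebased-change upstream/main
git cherry-pick my-branch~3..my-branch   # the last three commits from your branch
```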
Once the protocols are in place, one hopes that other forges could participate as well, though the history of the internet is littered with instances where federation APIs just became spam firehoses (see especially pingback/trackback on blog platforms).
Sure, the world has pretty much decided it hates the protocol. However, people _were_ doing all of that.
There's not much point in observing "but you could have done those things with email!". We could have done them with tarballs before git existed, too, if we built sufficient additional tooling atop them. That doesn't mean we have the functionality of current forges in a federated model, yet.
Those exist (badly and not integrated) as part of additional tools such as email, or as tasks done manually, or as part of forge software.
I don't think there's much point in splitting this hair further. I stand by the original statement that I'd love to see federated pull requests between forges, with all the capabilities people expect of a modern forge.
Give me the “email” PR process anytime. I can review on a flight. Offline. Distraction-free. On my federated email server, and have it work with your federated email server.
And the clients were pretty decent at running locally. And it still works great for established projects like the Linux kernel etc.
It’s just a pain to set up for a new project, compared to pushing to some forge. But not impossible. Bring back the intentionality of email, with powerful clients doing threading, sorting, syncing etc. locally.
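And the whole round trip is still just a couple of commands once the list exists (the addresses and paths here are placeholders):

```sh
# Sender: turn the last two commits into mail-ready patches and send them
git format-patch -2 --cover-letter -o outgoing/
git send-email --to=project-list@example.org outgoing/*.patch

# Reviewer: apply the series locally; works fully offline once downloaded
git am saved-patches/*.patch
```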
I think it's dwm that explicitly advertises a small and elitist userbase as a feature/design goal. I feel like mailing lists as a workflow serve a similar purpose, even if unintentionally.
With the advent of AI slop as pull requests, I think I'm gravitating to platforms with a higher barrier to entry, not lower.
There is code in a repository, and there is a diff or patch. Everything else you're labeling a pull request is unknown, not part of the original design, debatable.
The GitHub-style pull request is not part of the original design. Which aspects and features do you want to keep, and what exactly do you say many others are interested in?
We don't even know what a forge is. Let alone a modern one.
---
> If you're a code forge competing with GitHub and you look anything like GitHub then you've already lost. GitHub was the best solution for 2010. [0]
> Using GitHub as an example, but all forges are similar, so not singling them out here. This page is mostly useless. [1]
> The default source view ... should be something like this: https://haskellforall.com/2026/02/browse-code-by-meaning [2]
[0] https://x.com/mitchellh/status/2023502586440282256#m
I'm also quite used to the GitHub layout and so have a very easy time using Codeberg and such.
I am definitely willing to believe that there are better ways to do this stuff, but it'll be hard to attract even GitHub's detractors if it causes friction, and unfamiliarity causes friction.
I say this as someone who does browse the web view for repos a lot, so I get the niceness of browsing online... but even then, sometimes I'm just checking out a repo because ripgrep locally works better.
when he's working on his own project, obviously he never uses the about section or releases
but if you're exploring projects, you do
(though I agree the tree view is bad for everyone)
I also look for releases if it's a program I want to install... much easier to download a processed artifact than pull the project and build it myself.
But, I think I'm coming around to the idea that we might need to rethink what the point of the repository is for outside users. There's a big difference in the needs of internal and external users, and perhaps it's time for some new ideas.
(I mean, it's been 18 years since Github was founded, we're due for a shakeup)
"This new thing that hasn't been shipped, tested, proven, in a public capacity on real projects should be the default experience going forwards" is a bit much.
I for one wouldn't prefer a pre-chewed machine analysis. That sounds like an interesting feature to explore, but why does it need to be forced into the spotlight?
Find a project, find out if it's the original or a fork, and either way, find all the other possibly more relevant forks. Maybe the original is actually derelict but 2 others are current. Or just forks with significant different features, etc. Find all the oddball individual small fixes or hacks, so even if you don't want to use someone's fork you may still like to pluck the one change they made to theirs.
I was also going to say search, but that can probably be had about the same in regular Google, at least for searching project names and docs to find out whether a project exists. But maybe code search is still only within GitHub.
Pretty sure several of these distros started doing this with CVS or SVN way back, even before git became popular.
The first hit I could find of a git repository hosted on `archlinux.org` is from 2007; https://web.archive.org/web/20070512063341/http://projects.a...
For us Europeans it has more to do with being local than reliability or Copilot.
I hope so. When Microsoft embraced GitHub there was a sizeable migration away from it. A lot of it went to GitLab, which, if I recall correctly, buckled under the volume.
But it didn't stick. And it always irked me, having Microsoft in control of the "default" Git service, given their history of hostility towards Free software.
But the implementation of Gerrit seems rather unloved, it just seems to get the minimal maintenance to keep Go/Android chooching along, and nothing more.
Gitlab CI is good but we use local (k8s-hosted) runners so I have to imagine there's a bunch of options that provide a similar experience.
Hell even if you don't use VSCode there are much better options than messing around with patch files.
Patch files are excellent for small diffs at a glance. Sure, I can also `git remote add coworker ssh://fork.url` and `git diff origin/main..coworker/branch`, and that would even let me use Difftastic (!), but the patch is entirely reasonable for quick glances of small branch diffs.
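Concretely, something like this (the remote and branch names are the hypothetical ones from above, and the last line assumes `difft` is installed):

```sh
# Produce a small patch for at-a-glance review
git format-patch origin/main..coworker/branch --stdout > review.patch
git apply --stat review.patch    # quick summary of what it touches

# Or render the same range through Difftastic instead of the built-in diff
GIT_EXTERNAL_DIFF=difft git diff origin/main..coworker/branch
```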
> No need for that.
I generally expect a less complex solution; it seems like yours is more complex (easiness is arguable, though).
The original AGit blog post is no longer available, but it is archived: https://web.archive.org/web/20260114065059/https://git-repo....
From there, I found a dedicated Git subcommand for this workflow: https://github.com/alibaba/git-repo-go
I really like what I've read about AGit as a slightly improved version of the Gerrit workflow. In particular, I like that you can just use a self-defined session ID rather than relying on a commit hook to generate a Gerrit ChangeId. I would love to see Gerrit support this session token in place of ChangeIds.
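For reference, the AGit-style push as Forgejo/Gitea implement it looks roughly like this (the topic name is just a placeholder; it plays the role of the session ID):

```sh
# Open a PR without a fork or server-side branch; the topic identifies the change
git push origin HEAD:refs/for/main -o topic=fix-typo-in-docs

# A later revision of the same change reuses the topic to update the PR
git commit --amend
git push origin HEAD:refs/for/main -o topic=fix-typo-in-docs
```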
Steam proved gaming doesn't depend on Windows; Linux can do it too.
Countries in Europe, fed up with Windows, are moving to Linux.
LibreOffice is eating Microsoft 365's lunch.
Microsoft buying GitHub caused a mass exodus; its AI push is causing another.
Big open-source projects are moving away from GitHub. We only need a big player to make the move, and followers will come.
No way.
I love LibreOffice. It's fantastic. I rolled it out to a prior employer where everyone needed a word processor but we certainly didn't need to pay Office prices when we didn't have requirements that only Office could satisfy. A high point was when we were having trouble collaborating on a DOCX file with a customer, then they sheepishly told us that they weren't using Office, but this other LibreOffice (OO.org at the time) thing. We laughed and told them we were, too. That day we started swapping ODT files instead and everything worked 100x better.
And all that said, I haven't seen LibreOffice in person in years. Mac shops use Pages & friends for internal stuff, but really, almost everyone not using Office 365 or whatever they're calling it now is using Google Docs. Google is eating Microsoft's lunch in this space, and my gut estimate is that the two of them split 95+% of the office software market between them.
I do wish that weren't the case, but my personal experience tells me it is. I wish it were more common, and also that there was a virtuous cycle where more Mac users made it get more attention, and more attention made it feel more like a "Mac-assed Mac app", and feeling more like a Mac-assed Mac app got it more users, etc. I just don't see that playing out.
This one misses the point entirely, I'm sorry to say. Microsoft 365's "lunch" is that a majority of US businesses, schools, and governments are reliant on 365 for anything in their organization to function.
Basically, a substantial amount of (non-software) engineering or financial work is done in Microsoft's proprietary formats, occasionally involving VBA.
Many businesses cannot even pay salaries without macro-ridden Excel documents.
With 365, Microsoft has an even stronger moat: cloud-integrated co-editing using desktop apps. No browser app will exceed the C++/C# Office apps running directly on the PC. Not even other proprietary apps offer an equivalent experience.
On top of that, add Azure, SharePoint, etc. All big companies, without exception, use those and put a significant portion of their business knowledge on Microsoft platforms.
The US could literally kill Europe just by forcing Microsoft to shut down its operations. It is fucking scary.
Both PowerPoint and Excel are well ahead of the competition.
This “Great Uncoupling” is well underway and will take us toward a less monocultural Internet.
Gentoo's GitHub mirrors have only ever existed to make contributing easier for (I expect) newbies. The official repos have (AFAIK) always been hosted by the Gentoo folks. FTFA:
> This [work] is part of the gradual mirror migration away from GitHub, as already mentioned in the 2025 end-of-year review.
> These [Codeberg] mirrors are for convenience for contribution and we continue to host our own repositories, just like we did while using GitHub mirrors for ease of contribution too.
And from the end-of-year review mentioned in TFA [0]:
> Mostly because of the continuous attempts to force Copilot usage for our repositories, Gentoo currently considers and plans the migration of our repository mirrors and pull request contributions to Codeberg. ... Gentoo continues to host its own primary git, bugs, etc infrastructure and has no plans to change that.
From this we learn that the primary reason for moving is GitHub attempting to force its shitty LLM onto folks who don't want to use it. So yeah, the Gentoo project has long been "decoupled" or "showing it can be done" or whatever.
- It's slow for git command-line tasks: despite the site UX being much faster, git operations are really slow compared to GitHub.
- It doesn't have full feature parity with GitHub Actions. Their CI doesn't run a full pkgcheck I guess, so it's still safer for a new Gentoo contributor to submit PRs to GitHub until that gets addressed.
I stand corrected (thanks Sam): this was previously the case before they made the announcement; it's fully working now. Feel free to contribute on Codeberg.
If you haven't seen it already, Codeberg is seeking donations here: <https://docs.codeberg.org/improving-codeberg/donate/>. A good way to support a product you like rather than becoming the product yourself.
Honestly, I don't understand all the GitHub hate recently. It seems like a fashionable trend. Virtue signaling.
There was a decade where they barely innovated. Maybe people forgot about that? Or maybe they are too young to remember? I'll gladly take all the advances over the past 8-ish years for the odd issue they have. GH Actions has evolved A LOT, and I'm a heavy Copilot user at the org/enterprise level.
Overall, though, it's ... fine. That's all. A little worse than it used to be, which is frustrating, but certainly nowhere near unusable. I stood up my own forge and mirror some repos to it. The performance is almost comically better. I know it's not a fair comparison: I have only one user. On the other hand, I'm on a 9-year-old Xeon located geographically farther from me than GitHub's servers.
I REALLY recommend it
Codeberg does suffer from the occasional DDOS attack—it doesn't have the resources that GH has to mitigate these sorts of things. Also, if you're across the pond, then latency can be a bit of an issue. That said, the pages are lighter weight, and on stable but low-bandwidth connections, Codeberg loads really quickly and all the regular operations are super zippy.
I don't think it would take great contortions to implement an HTML + JS frontend that's an order of magnitude faster than the current crapola, but in practice it... just doesn't seem to happen.
I hear you and you're right that Codeberg has some struggles. If anyone needs to host critical infra, you're better off self-hosting a Forgejo instance. For personal stuff? Codeberg is more than good enough.
I now run a local Gitea. It's rather more performant, and uptime is rather better too!
I have no idea why on earth I even considered using GH in the first place. Laziness I suppose.
I have not used Codeberg that much myself. I have known about it, but the UI is a bit ... scary. GitLab also has a horrible UI. It is soooo strange that GitHub is the only one that got the UI right. Why can't the others learn to KEEP THINGS SIMPLE?
The alliance any up-and-comers can make with the ecosystem is to develop more of what they host in the open as well. In return for starting much closer to the finish line, we only ask that they also move the line closer for those that come after them.
That's a bit of an indirect idea for today's Joe Internet. Joe Internet is going to hold out waiting for such services to be offered entirely for free, by a magical Github competitor who exists purely to serve in the public interest. Ah yes, Joe Internet means government-funded, but of course government solutions are not solutions for narrow-interest problems like "host my code" that affect only a tiny minority. And so Joe Internet will be waiting for quite some time.
Personally I wouldn't mind paying for access, but I doubt there is a critical mass of users that can be weaned off of free access. Competing with free networks is hard. Codeberg, as far as I can tell, basically has a donation model where you can volunteer to pay and be a "member", but only about 0.5% of users choose that option, and that means a one-time payment of 10 euros. That's enough to fund how many months of bandwidth, plus a couple of recycled servers? For cloud infrastructure, standards are pretty high: you want replication, backup, anti-DDOS, monitoring, etc. All of that costs money. It would also help if they made it easier to donate with a PayPal link instead of a SEPA QR code that requires an international bank transfer.
Not a Wikipedia banner. No guilt verbiage. No unrelatable total site/year numbers like "2.6M out of 5M goal" etc.
Just some little bit of UI in a corner somewhere that passively sits there and shows its state, like a red/yellow/green light or a battery meter or something. And what it shows is some at-a-glance representation of what you are costing the service, positive or negative.
If the org is open and low-profit or even non-profit, or even reasonable-profit but organized as a co-op, this can be a totally honest number, which will probably be surprisingly small.
(and if any full-profit type services don't like having that kind of info made quite so public because it makes it hard to explain their own prices, well golly that sure sounds awful)
This will obviously have no effect on some people.
But I know that something like that will absolutely eat at some people until they decide they will feel better if they make that dot turn green.
And everyone else who just wants to take something for free and doesn't like being reminded of it has no basis for complaining or claiming to be outraged at being nagged or browbeaten. It's a totally passive, out-of-the-way bit of display, making no demands at all and not even hindering or speedbumping anything.
Even when you click on it for more info and the links to how to donate etc, the verbiage is careful not to make kids or drive-by laypeople or anyone else without real means feel bad or feel obligated. We don't need your soup money, don't sweat it.
Maybe even include some stories about how we all wound up in our high paying IT jobs because of the availability of stuff other people wrote and let us use for free when we were kids or former truck drivers etc, and so that's how you can understand and believe we really are ok with you now using this for free.
Can't possibly get any lighter touch than that.
And yet the fact that the little thing is just there, all the time, in view, that alone will make it a voluntary itch: if you know you can afford it, you should make that light green. It's like a totally wholesome use of gamification psychology.
I guess it will also have to somehow show not just what you cost yourself, but also what all the non-paying users are costing and what your fraction of that would be to cover those. At least some payers would need to pay significantly more than what they cost.
But I'd be real curious to see just how bad that skew is after a while if a lot of individuals do end up paying at least for themselves, where today most of them pay nothing.
That may greatly reduce the need for whales, really no whales at all, just a bunch who only pay about twice what they cost. Or even less: a heavy user who costs more might be able to totally cover the entire cost of 10 other light users with only 10% more than their own cost. It could eventually smooth out to being no real burden at all, even for the biggest payers.
That's getting to be a bit much info to display in a single colored dot without text or some complicated graphic, but I think this much could be shown and still be simple and elegant. Even a simple dot can have several dimensions at once: size, hue, saturation, brightness, let alone more detail like an outline or a more complex shape.
About the only thing I can see that is a bad thing is I bet this is a recipe for unfairly taxing women more than men. You just know that far more women will make that light green even if it's not easy, and far more men will happily let it ride forever even though they could afford it effortlessly, just to spend that $3 on a half of a coffee instead.