I'm not sure what these "remotes" are. They look like they might be source code archives.
Older systems like CVS and Subversion dealt with that by requiring the code maintainer to "check out" the code that was to be changed. While a file was checked out, no one else could change it, and the "official" copy remained unchanged until that person committed (or rolled back) the changes and checked the code back in.
That didn't work very well when dealing with teams of people who come and go all over the planet, so systems like git were developed. In git nothing is locked by a checkout; instead, before you can share your commits, you first have to reconcile them with any changes (and possible conflicts) that were made since you got the latest update from whichever git repo you work out of.
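That reconcile-before-sharing model can be sketched with a throwaway local repo (no remote needed, so the commands run as-is); the branch named `teammate` here is just a stand-in for a colleague's line of work:

```shell
# Hedged sketch: git's reconcile-then-merge workflow in a scratch repo.
set -e
workdir=$(mktemp -d)
cd "$workdir"
git init -q
git config user.email dev@example.com
git config user.name "Demo Dev"

echo "base" > file.txt
git add file.txt
git commit -qm "initial commit"

git branch teammate              # stands in for a colleague's work

echo "my change" >> file.txt     # my local edit
git commit -qam "my change"

git checkout -q teammate         # simulate the teammate's edit
echo "their change" > other.txt
git add other.txt
git commit -qm "their change"

git checkout -q -                # back to my own branch
# Reconcile: merge the teammate's history into mine. If both sides had
# touched the same lines, git would stop here and demand a resolution.
git merge -q teammate -m "reconcile with teammate"
git log --oneline                # base, mine, theirs, and the merge commit
```

Only after that merge succeeds (conflicts resolved, if any) would you push the result back to the shared repo.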
Another way I could read your question is that you have the same set of source code but different people have different build files. That's awful, and it brings back ancient nightmares. Among other things, it makes it difficult to do reliable builds. As with the older source code versioning systems, if you need resources from someone who isn't available, or, for that matter, who has a different computer setup, you're looking at a lot of very expensive work. I've been there, done that, and never want to do it again.
The Maven build system and its relatives were designed around the idea that one build file serves everyone. I can ship a zipped Maven project to 7 people on 5 different continents and they can all build the same byte-for-byte identical module. And, in fact, my projects are structured so that the same module can be used without modification or rebuilding on a desktop development machine, a beta test machine, or a production server. Meaning that if a production problem occurs, I can analyze it on an offline machine without worrying about build differences.
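As a minimal sketch of what "one build file serves everyone" looks like (the coordinates below are made-up placeholders, not from any real project), a self-contained `pom.xml` pins everything a stranger's machine needs to know:

```xml
<!-- Hypothetical minimal pom.xml: every machine that builds this
     project reads the same file and produces the same artifact. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>demo-module</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>
  <properties>
    <!-- Pinning the encoding and compiler release keeps builds
         consistent across differently-configured machines. -->
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.release>17</maven.compiler.release>
  </properties>
</project>
```

Anyone who unzips the project and runs `mvn package` gets the same jar, because the build instructions travel with the source rather than living on someone's personal machine.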
When it comes to destroying a civilization, gas chambers cannot hold a candle to echo chambers.