Zolko wrote: ↑Tue Oct 15, 2019 7:31 am
… all parts are located on the local machine, with a remote repository to synchronize workstations. We had tried to work in a shared database directly, but because of the network lag this proved to be inefficient. Also, using the same files creates locking problems.
Exactly this came to my mind when I thought about organizing the libraries. I came up with a recipe based on Links (again…). Each external (standard) part is "imported" into the project in the form of a link in a folder at the top of the file. When I need a standard part in my project file, I create a Link to that "import link", not directly to the library file somewhere on the network. In other words, I use an extra layer of indirection for shared parts.
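To make the idea concrete, here is a toy sketch in plain Python. All the names and paths are invented for illustration, and real FreeCAD Links are document objects, not dictionary entries — this only shows the shape of the indirection:

```python
# Toy sketch of the "import link" indirection layer. Names and paths
# are invented; real FreeCAD Links are document objects, not dicts.

# One entry per external (standard) part the project uses. This is
# the only place that knows where the library file actually lives.
import_links = {
    "M6_bolt": "//server/library/fasteners/M6_bolt.FCStd",
    "608_bearing": "//server/library/bearings/608_bearing.FCStd",
}

def resolve(part_name: str) -> str:
    """Project-internal links point at the import link, never at the
    library directly, so retargeting one entry updates every use."""
    return import_links[part_name]

# Every use of the bolt inside the project resolves through the layer:
print(resolve("M6_bolt"))  # -> //server/library/fasteners/M6_bolt.FCStd
```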
When I have to modify a shared part (e.g. to correct mistakes, add some polish, etc.), I create a local copy, retarget the "import link" to that copy, and work with the copy until it is finished and tested. Once done and approved, I can overwrite the original library file with my new version and change the "import link" back to point to the updated file on the network or wherever it lives.
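The edit cycle then amounts to retargeting that single entry — again a hypothetical Python sketch with invented names, not FreeCAD's actual API:

```python
# Hypothetical sketch (invented names, not FreeCAD's API) of the edit
# cycle for a shared part: only the import link is ever retargeted.
import_links = {
    "M6_bolt": "//server/library/fasteners/M6_bolt.FCStd",
}

def retarget(part_name: str, new_path: str) -> None:
    """Point the single import link at a different file; every project
    link that goes through it now sees the new target."""
    import_links[part_name] = new_path

# 1. Switch to a local copy while editing:
retarget("M6_bolt", "C:/work/copies/M6_bolt.FCStd")
# ... modify and test the local copy ...

# 2. After approval, overwrite the library file with the new version
#    and point the import link back at it:
retarget("M6_bolt", "//server/library/fasteners/M6_bolt.FCStd")
print(import_links["M6_bolt"])
```

None of the project-internal links need to be touched at either step; that is the payoff of the extra indirection.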
(This technique also works fine when I go working off-site.)
This approach minimizes the write-access window and is suitable for one or perhaps a handful of developers. But there still remains the risk of concurrent update conflicts: when multiple developers work on the same entity, one can easily overwrite the work of another.
With the help of a version control system (SVN, Git, …), such update conflicts can be detected and managed before any work is lost. So yes, a version control system is highly recommended. Perhaps Git, with its distributed-repository philosophy, could also help to overcome the network lag?
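For what it's worth, the conflict check a VCS automates can be illustrated with a toy compare-before-write in Python. This shows the principle only — it is not how Git actually stores or merges anything:

```python
# Toy illustration of conflict detection: refuse to overwrite a file
# if it changed since we last read it. A VCS automates this check;
# this is the principle only, not how Git works internally.
import hashlib
import tempfile
from pathlib import Path

def read_part(path: Path) -> tuple[str, str]:
    """Return the file's content plus a digest of the version we read."""
    data = path.read_text()
    return data, hashlib.sha256(data.encode()).hexdigest()

def save_part(path: Path, new_data: str, base_digest: str) -> bool:
    """Write only if the file still matches the version we started from."""
    current = hashlib.sha256(path.read_text().encode()).hexdigest()
    if current != base_digest:
        return False  # someone else updated it in the meantime: conflict
    path.write_text(new_data)
    return True

part = Path(tempfile.mkdtemp()) / "part.FCStd"
part.write_text("version 1")
_, digest = read_part(part)             # developer A starts editing
part.write_text("version 2 (by B)")     # developer B saves first
print(save_part(part, "A's edit", digest))  # -> False: conflict detected
```

Instead of A silently clobbering B's version, the stale digest makes the collision visible, and the two changes can be reconciled by hand — which is exactly the service a VCS provides on every commit.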
But I am drifting off-topic. Sorry for that.