
Big repository and backend


ninoles


I'm planning to use Plastic SCM for a big game repository (an Unreal game of about 50 GB, including most of the binaries that are normally fetched by the Unreal setup script). I have tested it with a smaller, similar repository and I'm a bit concerned about the results.

 

I tried to import UnrealTournament (workspace size = 800 MB, .git size = 2.5 GB, fast-export size = 47 GB) using fast-export and ended up with an SQLite repo of 15 GB. Although my system handles it quite well, I wonder about the size difference between the .git directory and the SQLite database.

 

How much difference should I expect between an individual revision and the final repo size? Does the choice of backend change anything? What's the recommended setup for our situation (a small team (<10), mostly small code changes, and a few binary changes once a month or so)? This is mainly to evaluate the cost of using Plastic SCM and how we should manage it.

 

Thanks,

Fabien


Hello Fabien,

 

The Plastic SCM server compresses each revision before storing it in the repository database, so if you will primarily be updating text files, the impact on the DB will be very low. On the other hand, if you upload binary files that can't be compressed, the repository will grow by almost the full size of each new revision. Note that Plastic SCM stores complete revision data, not deltas.
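The gap between text and binary files here can be illustrated with a quick sketch (the sample data is made up for illustration; zlib is used as a stand-in for whatever compressor the server applies):

```python
import os
import zlib

# Repetitive text (e.g. source code) compresses very well...
text = b"void Tick(float DeltaTime) { /* per-frame update */ }\n" * 4096

# ...while already-compressed binary assets (simulated here with
# random bytes) barely shrink at all, so each new revision adds
# nearly its full size to the repository.
binary = os.urandom(len(text))

text_ratio = len(zlib.compress(text)) / len(text)
binary_ratio = len(zlib.compress(binary)) / len(binary)

print(f"text revision stored at   {text_ratio:.1%} of original size")
print(f"binary revision stored at {binary_ratio:.1%} of original size")
```

Running this, the text sample shrinks to a small fraction of its size while the random "binary" stays at roughly 100%, which is why a repo dominated by cooked assets grows close to the sum of all binary revisions.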

 

That being said, disk space is extremely cheap nowadays, but if it still concerns you, Plastic SCM Cloud may be an option: https://plasticscm.com/cloud/index.html
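For the scenario described in the question, a back-of-envelope estimate follows directly from the full-revision storage model. The churn figures below are assumptions for illustration, not measurements:

```python
# Rough repo-growth estimate under full-revision (no-delta) storage.
# All numbers are assumed for illustration.
base_gb = 50          # initial checkin; binaries barely compress
binary_churn_gb = 2   # assumed size of binaries replaced per month
months = 12

# Each changed binary stores a complete new revision,
# so growth is roughly churn * months.
growth_gb = binary_churn_gb * months
total_gb = base_gb + growth_gb

print(f"estimated repo size after {months} months: ~{total_gb} GB")  # → ~74 GB
```

Small, compressible code changes add comparatively little on top of this, so for a team of under ten the binary churn dominates the storage budget.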


Archived

This topic is now archived and is closed to further replies.
