
Plastic vs. Git vs. Mercurial


roadrunner


XDDDDDDD

Yeah, yeah, I know this video pretty well: http://codicesoftware.blogspot.com/2007/05/linus-torvalds-on-git-and-scm.html. I posted about it long ago.

What would he hate about Plastic?

I bet he would hate everything... because he basically likes Git only! :)

And Git is an exceptionally good piece of code, of course.

But with a tiny team and lots of effort we have created 4.0, and we truly believe 4.0 is the best DVCS around for teams developing commercial software. I mean, GitHub is excellent for open source projects, but we think it is not as good as ours for commercial teams.

Getting into the details:

- Hg branching and merging are weaker than Plastic's: Hg merges are limited to one merge per cset; you can't merge more than one source at a time. In terms of merging, Plastic's algorithm is simply better: http://codicesoftware.blogspot.com/2011/09/merge-recursive-strategy.html

- We do better rename tracking than both Git and Hg -> Linus will say this is pointless, but it is one of the points we're very proud of. Not only the detection (rename a directory and find changes in Plastic... it will detect the directory move, instead of the myriad of file renames that Git and Hg would report). We're definitely better at handling this during merge too.

- The tools: ok, we not only have an exceptionally good core... we also give you all the surrounding tools: the 3-way merge tool, the diff, the code review, the astonishing Distributed Branch Explorer... it's like comparing a bare engine with a fully polished car. And we think we run circles around the other two there...

- Support: can you talk to the Hg/Git core developers and get an answer in a matter of hours, if not minutes? You can with Plastic. Isn't that a mind-blower? :)
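To make the merging point above concrete, here is a small, hypothetical Python sketch (not Plastic's or Git's actual code) of why a recursive merge strategy matters: in a criss-cross history, two heads share more than one "best" common ancestor, and a recursive merger builds a virtual base by merging those ancestors first instead of arbitrarily picking one.

```python
def ancestors(dag, node):
    """All ancestors of `node` (including itself) in a parent-pointer DAG."""
    seen, stack = set(), [node]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(dag.get(n, []))
    return seen

def best_common_ancestors(dag, a, b):
    """Common ancestors of a and b that are not ancestors of another one."""
    common = ancestors(dag, a) & ancestors(dag, b)
    return {n for n in common
            if not any(n in ancestors(dag, m) and n != m for m in common)}

# Criss-cross history: X and Y each already merged the other side's work.
#   base -> A, B;  X = merge(A, B);  Y = merge(B, A)
dag = {
    "A": ["base"], "B": ["base"],
    "X": ["A", "B"], "Y": ["B", "A"],
}
print(best_common_ancestors(dag, "X", "Y"))  # both A and B: a virtual base is needed
```

With two best common ancestors, a simple three-way merge has no single base to use; a recursive strategy merges A and B into a temporary virtual ancestor and then merges X and Y against that.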

Ok, I could talk for hours... stay tuned for our 4.0 online launch on Nov 17th, and subscribe to @plasticscm on Twitter.
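As a rough illustration of the rename-tracking point above, here is a hypothetical Python sketch of the content-similarity heuristic family that tools like `git diff -M` use: a deleted path and an added path are paired as a rename when their contents are similar enough. (Detecting a whole directory move, as Plastic claims to, would then collapse many such pairs sharing a common prefix into one event.) This is not Plastic's or Git's actual code.

```python
# Assumed illustrative heuristic: pair deleted and added paths by content similarity.
from difflib import SequenceMatcher

def detect_renames(deleted, added, threshold=0.6):
    """deleted/added: dicts of path -> content. Returns [(old, new, score)]."""
    renames = []
    for old, old_text in deleted.items():
        # Best-matching added file for this deleted file, by similarity ratio.
        best = max(
            ((new, SequenceMatcher(None, old_text, new_text).ratio())
             for new, new_text in added.items()),
            key=lambda t: t[1], default=None)
        if best and best[1] >= threshold:
            renames.append((old, best[0], round(best[1], 2)))
    return renames

deleted = {"src/util.py": "def add(a, b):\n    return a + b\n"}
added = {"src/helpers/util.py": "def add(a, b):\n    return a + b\n"}
print(detect_renames(deleted, added))
# [('src/util.py', 'src/helpers/util.py', 1.0)]
```

Identical content scores 1.0, so the move is reported as a rename rather than a delete plus an add.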


Do you guarantee that what we put in is what we get out? (One of his requirements.)

Sometimes you will get what you put into Plastic, sometimes not: http://www.plasticscm.net/index.php?/topic/567-where-is-my-file/, http://www.plasticscm.net/index.php?/topic/418-directory-disappeared-after-merge/, http://www.plasticscm.net/index.php?/topic/490-merging-sub-sub-branches/.

Speaking of 3.0, the feature list of Plastic is very nice and the standalone client is fine (though the "stacked" panels are a crazy man's idea).

Installation and configuration of Plastic are also very nice compared to, say, Git.

Where Git shines (again, speaking of Plastic 3) is in stability and distribution. The Plastic Visual Studio plugin is very unstable (I have filed about 50 bug reports), but I have not experienced the same problems with, for example, Git Extensions. Plastic 3 repository distribution is also somewhat half-hearted. It feels like a feature added later rather than thought into the core of Plastic: you have to explicitly choose remote branches to merge, and mercy on your sources if you forget to replicate changes in a parent branch. This is much less of an issue (if any) with Git, where repository distribution is the very core of the system.

That said, distribution has improved a lot in Plastic 4, and the Plastic team claims that I am the only one having Visual Studio issues. If this is true, I can recommend Plastic over Git for the greater simplicity of getting it up and running and into the workflow.

Plastic support is, like psantosl wrote, excellent, with quick replies. On the downside, the user base is tiny compared to most other version control systems. For example, PlasticSCM is tagged 16 times on stackoverflow.com, whereas git is tagged more than 11,000 times.

My team will probably switch to Git at some point, since the Visual Studio plugin issues are causing 10-20 minutes of workarounds daily: restarting Visual Studio, reconstructing halfway-broken check-in transactions, etc. But again, we might have a special workflow or Visual Studio usage that doesn't fit Plastic very well.


Hi,

I think it is fair to say that Soho has been experiencing some weird issues, some of them hard or even impossible to reproduce down here... but I think he's being pretty fair with his remarks.

Let me highlight some of Soho's excellent points:

- Visual Studio plugin: here you can love it or hate it. We have users saying it simply rocks (especially compared to Perforce's) and people like Soho who hate it. Soho, before you revert to good-cheap-Git ;) give 4.0 a try and work without the plugin at all... and then let Plastic find your changes (including moves, renames, deletes, adds... everything). Check this:

It all depends on you... I personally tend to hate all plugins (all except refactoring and our astonishing method history) and work with VS and the GUI... I ALT-TAB, find changes, and check in, and that's all I do. It simply works perfectly. That doesn't mean we're not working like crazy on improving the plugin... especially the corner cases that Soho seems to have crashed into...

- Replication: well, it is simply not true that "replica" was added as a "patch" to Plastic... we released replica in 2.0, so long ago that I can't even remember... ;). But what Soho points at are the problems of blending the "dynamic" tree system we used prior to 4.0 with replication... As much as I loved dynamic trees, I decided to make a small tweak to the Plastic core for 4.0 (small in concept, huge in impact, described here, though only in Spanish so far: http://codicesoftware-es.blogspot.com/2011/09/plastic-internals-de-30-40.html). Now a cset is a static tree (like it is in Git), and replica blends in much better.

What does it mean? Now you can replicate even a single changeset and get a full source tree, ready to build, in your destination repo. In 3.0 and before this simply wasn't possible: if you had a branch /main/a/b/c and you replicated only "c", you needed to replicate "b", "a", and "main" too... due to the way dynamic trees worked. This is the main reason why we decided to simplify our core in 4.0 and come up with a much better replication system.

BTW, now (in 4.0) you can do something like this: create main/mybranch on repoDenver, replicate ONLY main/mybranch to an empty repoVegas, work there (branch from it, modify it, whatever), and push the changes back to repoDenver... it is called partial replication, and it is simply out of the scope of Git and Hg. Cool, isn't it?
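The static-tree idea described above can be sketched in a few lines of Python. This is an assumed toy model (a content-addressed store with one flat tree object per changeset), not Plastic's real storage format, but it shows why replicating a single changeset is enough to materialize a buildable tree at the destination:

```python
import hashlib

store = {}  # hash -> object bytes (content-addressed, like git's object store)

def put(obj: bytes) -> str:
    """Store an object under the hash of its content and return that hash."""
    h = hashlib.sha1(obj).hexdigest()
    store[h] = obj
    return h

def commit_snapshot(files: dict) -> str:
    """files: path -> bytes. Store every blob plus one tree; return the tree hash."""
    entries = sorted((path, put(data)) for path, data in files.items())
    tree = "\n".join(f"{h} {path}" for path, h in entries).encode()
    return put(tree)

def replicate(tree_hash: str, src: dict, dst: dict) -> None:
    """Copy one changeset's tree and every blob it references; dst is buildable."""
    tree = src[tree_hash]
    dst[tree_hash] = tree
    for line in tree.decode().splitlines():
        blob_hash, _, _path = line.partition(" ")
        dst[blob_hash] = src[blob_hash]

cset = commit_snapshot({"main.c": b"int main(){}", "README": b"hello"})
repo_vegas = {}  # empty destination repo
replicate(cset, store, repo_vegas)
print(len(repo_vegas))  # 3 objects: the tree plus both blobs
```

Because the changeset points at a complete tree rather than at a delta against a parent branch, nothing from "main", "a", or "b" needs to come along for the ride.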

So, roadrunner, more insights here... :) And Soho... well, stay on Plastic! :) Temporarily get rid of the plugin and use "transparent SCM", as we like to call it.


I definitely don't care about the VS plugin right now. But if you have problems with revisions not reproducing the exact content every time, then there are issues.

The question for me internally is ease of use vs. Git stability, but it seems that the advanced requirements of Git may lead to more time lost overall than the minor issues with Plastic. I'm looking at Git Extensions next.


To be fair, I think most or all of my issues regarding missing stuff have been related to either replication or the VS plugin. Since replication has received a most welcome overhaul in 4.0, those issues will probably no longer be relevant.

As for the merging/replication strategy in 3.0, I think it only made sense if you had a strict branch-per-task workflow. For all other workflows, the result of merging or replicating sub-branches would be confusing at best.

Regarding the Visual Studio plugin, I still hold my position that it is seriously flawed. Codice might have some large customers with special environments or workflows (or a don't-waste-time-reporting-bugs policy) that don't provoke these issues, but I have seen most of my reported issues on many installations, all running almost plain-vanilla Visual Studio 2010 with a ReSharper plugin. In that sense I don't buy the "cannot repro" excuse (http://xkcd.com/583/), but I can understand that Codice wants to prioritize large customers rather than a broad audience.

I could skip the plugin and go standalone or use "Tortoise"-style version control, but if you do a lot of refactoring (like renaming stuff), integrated version control is really nice. Automatic checkouts are also essential. I do, however, love the SVN non-readonly checkout style, and I believe Plastic offers such an option, but I don't know how well it works.


Hi there,

Well, we dedicate time to GoToMeeting sessions with users who are not even paying for support... Of course that is not our standard, but we do it when necessary. I guess this simply shows that we're not only focused on our large customers but pay attention to small ones too. And I bet no other SCM company out there is as focused on small customers as we are.

Regarding refactoring: again, please watch the video I sent. 4.0 detects moves and renames without the need for a plugin. That's the beautiful thing about it. That's what I mean.

The non-readonly thing is what I use on a daily basis (I've been using it for two years already).


The question for me internally is ease of use vs. Git stability, but it seems that the advanced requirements of Git may lead to more time lost overall than the minor issues with Plastic. I'm looking at Git Extensions next.

Roadrunner: this is our goal: to give you all the productivity tools around a strong DVCS core... something you simply won't find in any other DVCS.

Isn't the Distributed Branch Explorer alone worth the switch? :)


Every revision is hashed with MD5.

Other than that, it is important to highlight that the underlying requirements are a little bit different. Git is all about public development, with repositories all around that could potentially be hacked to introduce backdoors into the Linux kernel... it sounds cool, but this is not really the issue we face (correct me if I'm wrong) in commercial development. Even if you use a DVCS to the limit, it is far less likely that the central repo gets corrupted by a malicious attempt to hack history...


The SHA-1 is not for security... it's for data integrity (disk or network issues causing corruption of data). If you guarantee the SHA-1 signature, you can guarantee the integrity of the data. This was one of the questions posed: is what you get out exactly what you put in? And if so, what do you do to guarantee that?
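For reference, this is how Git ties SHA-1 to integrity: an object's name is the hash of a small header plus the content, so any bit flip on disk or in transit changes the name and is caught the moment the object is read back. A minimal Python reimplementation of Git's blob hash:

```python
import hashlib

def git_blob_hash(content: bytes) -> str:
    """Hash a file's content the way git names blob objects."""
    # Git hashes "blob <size>\0" followed by the raw content.
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches `echo 'hello world' | git hash-object --stdin`
print(git_blob_hash(b"hello world\n"))
# 3b18e512dba79e4c8300dd08aeb37f8e728b8dad
```

Because trees hash their blob names and commits hash their tree names, verifying the top hash transitively verifies every byte underneath it.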

