All Activity

  1. Last week
  2. Hi! I'm currently looking into migrating our self-hosted Plastic server from MySQL to a Jet database. I'm testing the migration on a testing machine first (which has some replicated repos on it), using the web GUI. The migration gets stuck on this error:

     The migration progress has failed: Unable to copy the repository database rep_34: Table 'rep_34.reviewcomment' doesn't exist

     When I try to restart the migration process, this message pops up instead:

     The migration progress has failed: Unable to copy the repositories database: Could not map file

     And I have to delete all the migrated repo folders before I can try again (I don't know if this is on purpose).

     Additionally, I have some questions regarding the migration:

     1. We have some big repos (100 GB+). I've read on http://blog.plasticscm.com/2018/06/story-of-jet-fast-repo-storage.html that they may cause problems? I haven't been able to test this yet on the testing machine because of the error mentioned above.
     2. Are there any hardware requirement changes when moving from MySQL to Jet (more RAM, etc.)?

     Thanks! Best, Fabian
  3. David Yang

    Code Review: improvements to better support iterative review

    Thanks Pablo, that's awesome to hear!! Sounds like what you guys have planned covers everything in my post and more (great idea being able to link a changeset to a specific "change" request). The big combination window is a tad overwhelming at first (maybe move the bottom pane to the conversation tab, and simply have a pop-up warning when trying to close the code review or merge the branch with unresolved questions/changes?). Otherwise, everything looks great!

    I absolutely understand you need to prioritise the key changes needed to support your big customers. It would be really great to see these Code Review improvements after that, though! Currently this (and perhaps the lack of a built-in issue tracker - I'm having some difficulty with the MantisBT integration, which I've posted about separately) are the only big things I'm really missing from the setup I had with Git + Azure DevOps. Otherwise, Plastic is absolutely amazing - the underlying branch mechanics, Branch Explorer and diff/merge tools are miles and miles ahead of anything else!

    David

    PS: By the way, what is the issue people are currently having with single-branch workflows in Plastic? I currently use what is essentially a single-branch workflow with Git (but that is mostly due to Git's limitations), so let me know if there's any way I can help. Essentially, I create temporary task branches and "squash merge" onto master when finished (with Azure DevOps automatically deleting the task branch, but preserving a history of the task branch's commits in the completed pull request).¹ If people want it, I think this would be fairly simple to implement in Plastic. With "squash merging", you would just need to enforce the deletion of the task branch to prevent people reusing it and creating evil twin conflicts. And preserving the changeset history of the "deleted" task branch could be done by "hiding" the task branch rather than actually deleting it (I think I read somewhere that that's basically what Azure DevOps, GitHub, etc. do with their pull requests - they just move the task branch to a branch called pull/<name> and somehow hide branches with the 'pull/' prefix from Git). Edit: Actually, you could already do this with attributes and filters.

    _____
    ¹ This seems to be Microsoft's own approach. I'm guessing they also had trouble with other Git workflows - the much-hyped GitFlow model is actually really hard to use with Git in practice, firstly because commits don't "belong" to a branch in Git (although I know you guys are working on something which makes this a lot better - I'll drop you a separate line on that 🙂), and secondly because Git doesn't handle merging well when there are cherry-picked commits (Git thinks they are different commits which change the same lines, so it reports a merge conflict even though those lines were changed in the same way). Piece of cake with Plastic though 👍
  4. Hi David, First, thank you for taking the time to compile all this great feedback. I'm the CTO here at Plastic, so let me share some of the designs we have for the upcoming Code Review system. Our plan was to implement it last year, but we got swamped by a ton of other smaller but still important things.

     First, let me share what we consider the keys of the new code review system (I'm copy/pasting from internal docs :-)), and how the overall design of the review window and its elements looks. The key elements for Code Review are:

     - Comments now live together with the diff (on a side panel).
     - Comments have replies, so you can handle threads there.
     - The key is that you can "ask questions" and "ask for changes". It is then very easy to track whether those questions were answered or the actual changes were made, which is one of the things you wanted to see. How? You'll type a special comment with a mark saying it is a change request. Then it will be very easy to track whether the change or question was addressed, and double-clicking will show you the right diff.
     - When you checkin, you'll be able to select (or use a comment to indicate) the "change" you are addressing.
     - Finally, we'll have a conversation tab, to discuss general things outside the diffs themselves.
     - And you can review changeset by changeset too, not only entire branches, which is great IMO.
     - The last thing will be "review scripts". And that's just the beginning.

     So, in summary: we have put together a bunch of ideas thanks to all the feedback we got and our own practice of code reviews. Now it is a matter of finding the time to implement this in an incremental way. We want to finish some key changes on the server first, which will enable the next Cloud version (and are also key for some huge customers we have with 3000+ seats). And we also have to improve the single-branch workflow for developers on game teams... so it is not yet decided when Reviews are going to happen.

     pablo
  5. Hey guys

     Plastic SCM kicks Git's butt in pretty much every way except when it comes to code reviews. Below are some features which are commonly available in the Git world (via pull requests in most Git hosting services), which make iterative code reviews easier and would be most appreciated in Plastic SCM. I think the Branch Changes Window probably achieves 80% of this already (if you embedded that instead of a simple Diff Window for branch-based code reviews). Here is an example of what the pull request overview page looks like in Azure DevOps (it shows a number of the features I talk about below). These are only really needed for code reviews of branches (not code reviews of changesets):

     1. Timeline of comments and updates: As you can see above, it shows a simple timeline of when the code review was created, code review comments (with a snippet of the relevant code), and subsequent updates (update = push, which might contain several commits; for Plastic SCM each checkin should probably be its own update). Replies should probably be grouped under the original comment, rather than having their own entries on the timeline.
     2. Easily compare updates to see if a comment is resolved: If I click on a comment, it should take me to a diff of the latest changeset on the branch vs the changeset on which the comment was made. This lets reviewers easily see whether the comment was addressed and how.
     3. (Optional) Show comments in-line with code: This one might be the trickiest to get working with the diff tool, and isn't necessary for the other improvements to work. What it does do is make unresolved comments a lot easier to see. Here is what this and #2 look like in Azure DevOps.
     4. Ability to mark comments as resolved: Original comments (not replies) should have a property to indicate their status (eg, active, resolved, can't be fixed, etc). If a comment is resolved or otherwise closed, it should be collapsed by default (eg, in the timeline, in-line comments, bottom comment pane, etc) so it is easy to see just the unresolved comments. Here is the same screen but with the comment collapsed (the default when a comment is resolved); the comment can be expanded by clicking on the blue circle.
     5. Unresolved comments should carry over to updates: For the above to work properly, unresolved comments should be visible on new changesets for that branch, so they can't be missed. Currently, they only appear in the bottom pane, and clicking on one changes the right-side diff window to the original changeset (which is not what we want). I understand there will be some complexity as comments are stored by line number (which could change as a result of code additions/deletions/moves), but hopefully this is not insurmountable with the really awesome xdiff tools you already have? (See the sketch after this post for the remapping idea.)

     Would also be grateful for an update on how development of the Code Review system is going overall (there are a number of UserVoice suggestions where you've mentioned an overhaul of the system is in the works, but those suggestions date back several years and there doesn't seem to be an update). Hopefully you guys are still committed to improving the Code Review system. As I said, Plastic SCM pretty much kicks ass in every way - this is one of the only things I really miss from the Git world. Hopefully with your existing tools (eg, Branch Changes Window and xdiff), the above features won't be insurmountable?

     Kind regards
     David

     Edit: Here is a link to this item on UserVoice. Please upvote if you agree.
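     A rough sketch of the remapping idea from point 5 (purely illustrative - the types and hunk representation here are hypothetical, not Plastic's actual diff API): shift a comment's line by the net size change of every hunk above it, and flag the comment as outdated when a hunk overlaps the commented line.

     ```csharp
     using System.Collections.Generic;

     // One diff hunk between the commented changeset and the latest changeset.
     // All names here are hypothetical; Plastic's real diff structures differ.
     class Hunk
     {
         public int OldStart;   // first line of the hunk in the old revision (1-based)
         public int OldLength;  // lines removed/replaced in the old revision
         public int NewLength;  // lines added/replacing them in the new revision
     }

     static class CommentRemapper
     {
         // Returns the comment's line in the new revision, or null if the
         // commented lines were themselves edited (show the comment as
         // "outdated" instead of pinning it to a wrong line).
         public static int? Remap(int commentLine, IEnumerable<Hunk> hunks)
         {
             int offset = 0;
             foreach (var hunk in hunks)  // hunks sorted by OldStart
             {
                 int oldEnd = hunk.OldStart + hunk.OldLength;
                 if (oldEnd <= commentLine)
                     offset += hunk.NewLength - hunk.OldLength; // hunk above: shift
                 else if (hunk.OldStart <= commentLine)
                     return null;                               // hunk overlaps: stale
                 else
                     break;                                     // hunk below: done
             }
             return commentLine + offset;
         }
     }
     ```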
  6. calbzam

    Let's talk about file transfers

    Hi, the thing is, this setting depends on your hard drive. With HDD drives, parallelizing is sometimes not a good idea (especially for read operations): if the network is fast, you can lose more on the disk than you gain by parallelizing the traffic. Also note that if different clients perform updates in parallel (with these settings enabled), the server will run out of available threads faster than in the regular scenario. If you are using SSDs or fast disks, you can play a bit with the "DownloadPoolSize" parameter and check which value fits you best. Regards, Carlos.
  7. Hey guys

     Really loving Plastic SCM but having trouble integrating with MantisBT. Basically, "test connection" is successful, but the pending tasks list is empty (even if I tick the box asking it to show pending tasks from all users). I have copied plastic.php into the Mantis install directory, and yes, I have created an unresolved task in MantisBT. Some possibilities I could think of:

     - Do I need to tell Plastic SCM my password for MantisBT? If so, how?
     - Do I need to link the Plastic SCM repo with the MantisBT project? Even if I don't have to, is there a way to do so such that only unresolved tasks from the relevant project are displayed?
     - Do I need to create an API token in MantisBT? If so, how do I give the API token to Plastic SCM?

     Setup details:
     - Plastic SCM 7.0.16.2904
     - MantisBT 2.19.0
     - XAMPP for Windows 7.3.0
  8. Earlier
  9. Hey guys

     Loving Plastic so far. Pretty much the only thing stopping me from grabbing a Plastic Cloud subscription or the Cloud Edition right now is the issue tracker integration. I'm currently a hobbyist / solo developer but work 50/50 between two computers on different networks, so Plastic Cloud would be really handy for me (I'm used to the world of Git repo hosting, where I don't have to worry about server setup and maintenance). But if I have to set up a server for the issue tracker anyway... well, that kind of defeats the purpose for me. So, I'm wondering if any of the following options are planned for the near future (or already included and I've just missed it):

     - The ability to use Plastic Cloud as the server for one of the free issue trackers (eg, MantisBT). That would save people like me from having to pay for Plastic Cloud plus pay for server rental or another cloud service for the issue tracker. Or even better, if Plastic Cloud came with the issue tracker server already set up and integrated, so we could just use it out of the box like with most Git hosting services. Setting up and using Plastic SCM has mostly been a fantastic and painless experience, but the same can't be said for trying to get a free issue tracker up and running.
     - An issue tracker extension for GitHub, Bitbucket, Azure Boards (part of Azure DevOps, formerly VSTS) or one of the other free web-based issue trackers for Git repos. (Yes, I realise I can create my own extensions, but only if by 'can' you mean 'theoretically possible' rather than 'realistically able to'. That would also sort of defeat the cloud convenience factor.)

     Anyway, I look forward to hearing back from you on my options. Really loving Plastic SCM and keen to get started properly as soon as possible.

     Kind regards
     David

     PS: On a slightly unrelated note, if I get Cloud Edition (or the cloud extension with Personal Edition), that still comes with a localhost clone so I can checkin/branch locally (quickly), right? Also, if I get Cloud Edition for now but want to switch to Team Edition with an on-prem server in future, I can do that without losing any data, right?
  10. kevinossia

    Let's talk about file transfers

    I tried multiple different values for the download pool size, including 16, 32, and even 256. None of them seem to have an effect on the time it takes to update.
  11. scaslin

    Unity VersionControl API

    We have a build machine that light-bakes our scenes. I need to submit the lightmaps to the Plastic server. I have a basic form of this working OK, but when the build machine is out of sync with our Plastic server I get an error saying:

    "PlasticSCM : Submit: A merge is needed from changeset 95 to changeset 94 (the changeset you are currently working on) before submitting your changes. You need to solve the conflicts between your current workspace contents and the new contents on your current branch."

    So I need to merge and resolve any conflicts before submitting. How do I do this with the VersionControl API? I've tried Provider.Merge and Provider.Resolve, but to no avail. Is there any example code for this, or documentation describing the process? Help!
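    For anyone landing here, a minimal sketch of one possible sequence using Unity's UnityEditor.VersionControl Provider API (untested; resolving with ResolveMethod.UseMine so the freshly baked lightmaps win is an assumption - pick the policy that fits your pipeline):

    ```csharp
    using UnityEditor.VersionControl;
    using UnityEngine;

    // Hypothetical build-machine step: pull down the newer changeset, resolve
    // any conflicts in favour of the freshly baked lightmaps, then submit.
    public static class LightmapSubmitter
    {
        public static void SubmitLightmaps(string lightmapsFolder, string description)
        {
            var assets = new AssetList { Provider.GetAssetByPath(lightmapsFolder) };

            // Bring the workspace up to date with the head changeset.
            Task getLatest = Provider.GetLatest(assets);
            getLatest.Wait();

            // Resolve conflicts keeping the local (baked) contents.
            // ResolveMethod.UseMine is an assumption, not the only option.
            Task resolve = Provider.Resolve(assets, ResolveMethod.UseMine);
            resolve.Wait();

            // Submit the result (null changeset = default changeset).
            Task submit = Provider.Submit(null, assets, description, false);
            submit.Wait();
            Debug.Log("Lightmap submit finished: " + submit.success);
        }
    }
    ```

    Whether Provider.Resolve actually covers the "merge needed from changeset 95 to 94" case, or a full merge outside the API is required, is worth verifying against your server.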
  12. calbzam

    Meta files in Library/PackageCache will not ignore

    Thanks for the update!
  13. miryamgsm

    Odd calls of external Pas2Yaml parser

    We will take into account your suggestions for future improvements. Thanks for your feedback. Best regards, Míryam
  14. Hey @manu, I apologize for the huge delay; the end of the year was a fun time as we closed up 2018.

     I really, really appreciate the offer for an on-prem license. However, I've slept on it over and over and spoken with the team, and they all seem to think I'm crazy for demanding a switch from Mercurial+Largefiles. The consensus is that we're too far into the game's production, and things are working (no one is bothered by the sometimes slow TortoiseHg, somehow...), to justify a switch. So, I hope that in any case I will come knocking back on the door of PlasticSCM in 2019 when we start the next project (or 2020, since this will be a full year!).

     Thank you, happy New Year, and thanks for making such an attractive SCM system in general. It's nice to know that there's a nice place to be.

     Matt
  15. yoonjoonso

    Meta files in Library/PackageCache will not ignore

    Thanks, it was the plastic-global-config. It had an ignore.conf that included !.*,meta, which overrode my own ignore.conf files. I also had to restart the client to see the changes. Solved.
  16. dummzeuch

    Odd calls of external Pas2Yaml parser

    Yes, that was my impression as well. Of course I can do that. My main problem was that the parser never exited because it kept reading empty lines. That stdin might have already been closed occurred to me only later.

    May I suggest some changes to the API (for the next version of it)?

    1. In addition to the shell parameter, pass the API version so an external parser can check whether it supports it, e.g.:

       parser.exe shell 2 flagfile

       The pas2yaml parser apparently originally did not support the encoding line but instead interpreted it as the output file, which could result in it overwriting uncommitted changes (see my issue on the GitHub page).
    2. Also, I think it would be worthwhile to not only pass the file names but prefix them with their meaning, which could also prevent this kind of problem, e.g.:

       InputFile: <input file goes here>
       Encoding: <encoding goes here>
       OutputFile: <output file goes here>

       This won't add much overhead but would make the communication much more reliable. (A sketch of how a parser could read such a prefixed protocol follows this post.)
    3. And last but not least: require that an external parser does not overwrite an existing output file. I think that shouldn't create any new issues, since SemanticMerge always creates unique file names for each call, so these files should never exist in the first place. But again, this would prevent the issue of unwittingly overwriting files because the input was misinterpreted.
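    To make suggestion 2 concrete, a small hypothetical sketch of the receiving side (this prefixed protocol does not exist in SemanticMerge today; all names are made up): the read loop keys each line on its prefix, ignores lines it doesn't understand, and treats EOF like 'end'.

    ```csharp
    using System;

    // Hypothetical reader for the prefixed protocol proposed above
    // (suggestion 2); not part of the actual SemanticMerge API.
    static class PrefixedProtocolReader
    {
        // Reads one request (InputFile/Encoding/OutputFile lines) from stdin;
        // returns false on "end" or EOF so the parser can terminate cleanly.
        public static bool TryReadRequest(out string input, out string encoding, out string output)
        {
            input = encoding = output = null;
            string line;
            while ((line = Console.In.ReadLine()) != null)
            {
                if (line == "end")
                    return false;                 // host asked us to shut down
                int sep = line.IndexOf(": ", StringComparison.Ordinal);
                if (sep < 0)
                    continue;                     // ignore lines we don't understand
                string key = line.Substring(0, sep);
                string value = line.Substring(sep + 2);
                switch (key)
                {
                    case "InputFile": input = value; break;
                    case "Encoding": encoding = value; break;
                    case "OutputFile": output = value; break;
                }
                if (input != null && encoding != null && output != null)
                    return true;                  // complete request received
            }
            return false;                         // EOF: treat like "end"
        }
    }
    ```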
  17. miryamgsm

    Odd calls of external Pas2Yaml parser

    I reviewed the source code of the SemanticMerge application and the pas2yaml parser. The 'end' string is only sent when SemanticMerge is launched through our version control tools (Plastic SCM and gmaster). If I am correct, the "Exception Exiting because INPUT reached EOF" exception occurs when you close SemanticMerge, because the process is not shut down correctly using the 'end' string. We will fix it as soon as possible. Meanwhile, you could probably work around the issue by not throwing the exception when stdin reaches EOF (see the sketch after this post). Does that sound good to you? We will keep you informed as soon as it is ready! Thanks for your feedback. Best regards, Míryam
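    The suggested workaround as a sketch, assuming the shell-mode protocol described in this thread (three lines per request - input file, encoding, output file - terminated by an 'end' line, with the parser creating a flag file when ready and writing 'OK' after each tree): treat EOF on stdin exactly like 'end' instead of throwing. ParseToYaml is a placeholder for the real parsing work.

    ```csharp
    using System;
    using System.IO;

    // Workaround sketch: an external parser main loop that treats EOF on
    // stdin like an explicit "end" line, for hosts that close the pipe
    // without sending "end".
    static class ParserMainLoop
    {
        public static void Run(string flagFile)
        {
            File.WriteAllText(flagFile, "");          // signal readiness to the host

            while (true)
            {
                string inputFile = Console.In.ReadLine();
                if (inputFile == null || inputFile == "end")
                    return;                            // EOF behaves like "end"

                string encoding = Console.In.ReadLine();
                string outputFile = Console.In.ReadLine();
                if (encoding == null || outputFile == null)
                    return;                            // truncated request: exit quietly

                ParseToYaml(inputFile, encoding, outputFile);
                Console.Out.WriteLine("OK");           // tell the host the tree is ready
                Console.Out.Flush();
            }
        }

        static void ParseToYaml(string inputFile, string encoding, string outputFile)
        {
            // placeholder: parse inputFile (using the given encoding) and
            // write the declarations tree to outputFile
        }
    }
    ```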
  18. dummzeuch

    Odd calls of external Pas2Yaml parser

    Just in case anybody is interested, here is my current source code: https://github.com/dummzeuch/pas2yaml
  19. dummzeuch

    Odd calls of external Pas2Yaml parser

    The log file does not help me. It does not contain any errors that I could identify. I have attached it to this message. This is a single call to compare two files (but you'll probably see that yourself in the log). The log file of pas2yaml reads like this:

    READY
    Received line: C:\Users\twm\AppData\Local\Temp\TestFile.pas-revBASE.svn001.tmp.pas
    Received line: Windows-1252
    Received line: C:\Users\twm\AppData\Local\Temp\b384da6b-a5cf-4cf7-8b1c-395446a37c98.tree
    Received inputfile: C:\Users\twm\AppData\Local\Temp\TestFile.pas-revBASE.svn001.tmp.pas
    Received encoding: Windows-1252
    Received outputfile: C:\Users\twm\AppData\Local\Temp\b384da6b-a5cf-4cf7-8b1c-395446a37c98.tree
    Parsed and writing to outputfile: C:\Users\twm\AppData\Local\Temp\b384da6b-a5cf-4cf7-8b1c-395446a37c98.tree
    wrote OK
    READY
    Received line: D:\Source\_github\SemanticMergeDelphi\src\TestFile.pas
    Received line: Windows-1252
    Received line: C:\Users\twm\AppData\Local\Temp\b0fef9d2-98b2-4295-8f9c-7e7c2d7f8b6b.tree
    Received inputfile: D:\Source\_github\SemanticMergeDelphi\src\TestFile.pas
    Received encoding: Windows-1252
    Received outputfile: C:\Users\twm\AppData\Local\Temp\b0fef9d2-98b2-4295-8f9c-7e7c2d7f8b6b.tree
    Parsed and writing to outputfile: C:\Users\twm\AppData\Local\Temp\b0fef9d2-98b2-4295-8f9c-7e7c2d7f8b6b.tree
    wrote OK
    READY
    Exception: Exception Exiting because INPUT reached EOF.

    As you can see, the INPUT (= stdin) file gets closed before an 'end' line is received. This makes me suspect a bug in the way SemanticMerge calls the external parser: it probably closes the pipe to the external parser's stdin without first writing the 'end' line (or maybe simply forgets the CR/LF after it?).

    semantic.log.txt.zip
  20. dummzeuch

    Odd calls of external Pas2Yaml parser

    OK, found it. There was already a semanticmerge.log.conf file in the SemanticMerge installation directory, and that is also where the log file is written. (You should really add that to your instructions.)
  21. dummzeuch

    Odd calls of external Pas2Yaml parser

    I rewrote the logic of the main loop and now I get a diff display. But I still do not get the 'end' line; my program terminates after receiving 10 empty lines. How do I configure logging, and where are the log(s) placed? The link describes how to set the log level by creating a semantic.log.conf file, but not where this file has to be placed, nor where the log file will be created. I created the semantic.log.conf file in c:\users\<me>\AppData\Local\plastic4 with the suggested content (see attached file), but could not find a log file in %temp%. So where does it go?

    semantic.log.conf
  22. dummzeuch

    Odd calls of external Pas2Yaml parser

    Hi, thanks for your answer. I have already tried the parser standalone, which works fine as far as I can see. For whatever reason I did not think of calling SemanticMerge directly. Must have been very tired... Also, I will check the logging.
  23. miryamgsm

    Odd calls of external Pas2Yaml parser

    Hi dummzeuch!

    I think it's better to try the parser step by step in order to determine the origin of the problem. That is, first only the parser, then the integration with SemanticMerge, and finally with TortoiseSVN.

    To check only the parser:

    yourparserapplication.exe shell path_to_the_flag_file

    More info: https://users.semanticmerge.com/documentation/external-parsers/external-parsers-guide.shtml#Asampleparsingsession

    To check the integration with SemanticMerge:

    semanticmergetool.exe -s sample\source.code -d sample\destination.code -ep=codeparser.exe

    More info: https://users.semanticmerge.com/documentation/external-parsers/external-parsers-guide.shtml#SemanticMergeandSemanticDiff

    Sometimes, issues will happen if you forget to fill in the declarations tree correctly. If that happens, besides checking the error messages, looking at the logs might be very useful. In this section, you can find the steps to activate the "ExternalParser" logging: https://users.semanticmerge.com/documentation/external-parsers/external-parsers-guide.shtml#Howtodebugyourparser

    Tip: maybe you can start with simplified source code, check it and, once it works, add the rest of the code. If the issue is not solved, you can share the parser with us so we can debug the SemanticMerge application and figure out what is happening. Please do not hesitate to contact us if you have any questions.

    Hope it helps! Best regards, Míryam
  24. calbzam

    Automatic Cloud Sync on local checkin?

    Hi, there is a "Replikate" tool that allows replicating all the changes made in a repository via the command line. You can schedule a task to run it, or even configure it in an after-checkin trigger (a rough sketch follows this post):

    http://blog.plasticscm.com/2012/02/after-accidentally-cloning-sexy.html
    https://www.plasticscm.com/documentation/triggers/plastic-scm-version-control-triggers-guide

    Regards, Carlos.
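    A rough, hypothetical sketch of the trigger approach (not official sample code): a small console program registered as the after-checkin trigger that shells out to the cm client to replicate. The environment variable name and the replication specs below are placeholders - check the triggers guide linked above for the exact variables and use your own repo specs.

    ```csharp
    using System.Diagnostics;

    // Hypothetical after-checkin trigger body: replicate the just-checked-in
    // branch to the cloud repo by shelling out to the cm client.
    class AutoCloudSync
    {
        static int Main()
        {
            // Plastic exposes trigger context via environment variables;
            // the exact name is an assumption here - see the triggers guide.
            string branch = System.Environment.GetEnvironmentVariable("PLASTIC_BRANCH");
            if (string.IsNullOrEmpty(branch))
                return 0; // not running as a trigger: nothing to do

            var psi = new ProcessStartInfo
            {
                FileName = "cm",
                // Placeholder specs: replace with your repo and cloud organization.
                Arguments = $"replicate br:{branch}@myrepo@localhost:8087 myrepo@myorg@cloud",
                UseShellExecute = false
            };

            using (var proc = Process.Start(psi))
            {
                proc.WaitForExit();
                return proc.ExitCode; // non-zero reports the replication failure
            }
        }
    }
    ```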
  25. Is there a way to automatically sync the local repo with the cloud upon successful local checkin? It would be nice to not require the second step of explicitly performing a cloud sync.
  26. calbzam

    Global file configuration on Plastic Cloud

    Hi, let me share it again with the team. I'm afraid it's not yet fixed. Regards, Carlos.
  27. calbzam

    Meta files in Library/PackageCache will not ignore

    Hi,

    - If you right-click the file, select "Add to ignore list" and refresh the view, isn't it marked as ignored?
    - Could you attach your "ignore.conf"? (Check in your local workspace and also in your user folder: "C:\Users\xxx\AppData\Local\plastic4\ignore.conf".)
    - Do you have a "plastic-global-config" repo that may be overriding some rules? https://www.plasticscm.com/documentation/administration/plastic-scm-version-control-administrator-guide#Globalfileconfiguration

    Regards, Carlos.