Colin - Stacking Chairs

  1. Hi @blue, thanks for the link. I'd actually already been following the information in that link, and have previously been able to archive some old revisions from the cloud repo. The problem I'm having in this case is that cm archive (see above for specific command usage) claims that deleted revisions do not exist, and so it does not archive them.

     Coming back to this afresh, I now wonder if my issue is that I'm attempting to archive the revision that was deleted, whereas I should probably be archiving all previous revisions, which would have contained actual content. However, I'm unsure exactly how to achieve this with cm find.

     Any specific pointers regarding archiving (or otherwise cleaning) deleted content from the cloud would be much appreciated.

     Thanks, Colin
  2. I'd like to permanently remove some deleted items in our cloud repo to free up space, and I'd be grateful if someone could let me know if/how this can be achieved, please. I've found a couple of related threads, but these don't completely answer my question, unfortunately.

     The above thread suggests trying archiving, but warns that this will not free space. I wonder if it could actually free space in the case of a cloud repo, where I would archive to a local drive. Nevertheless, I was unable to archive removed revisions, as cm archive claims that the revisions in question do not exist. This is the structure of the command I've tried (a commented breakdown is sketched after this list of posts):

     cm find "removed on repository 'repo@org@cloud'" --format=rev:revid:{id}@repo@org@cloud --nototal | paste -d " " - | cm archive - --file=/external/archive

     Is there an alternate method of archiving that will accept the removed revision IDs?

     The above thread suggests deleting the changesets that added the files in question. Unfortunately, these files are not cleanly associated with a single changeset, but are mixed in with other changes, making it impractical to delete changesets without collateral damage.

     Thanks, Colin
  3. As others have noted, I had difficulties using this command, but one issue that I did not see mentioned elsewhere is that parent!=-1 does not seem to exclude the HEAD revision, but rather the root/initial revision. This makes intuitive sense to me, as only this revision would not have a parent revision.

     I've instead used the following command to archive all revisions above 1 MB that are not the HEAD revision on a cloud repo (an annotated version is sketched after this list of posts):

     cm find "revs where size > 1000000 and returnparent = 'true' on repository 'repo@org@cloud'" --format=rev:revid:{id}@repo@org@cloud --nototal | paste -d " " - | cm archive - --file=/external/archive

     I'm using returnparent = 'true', which ensures that any hits on HEAD revisions use the parent revisions instead. It should be noted that this does not guarantee those parent revisions are larger than 1 MB, and it could theoretically miss some revisions where the size of the child is below the threshold, but this was sufficient for my needs for now.

     I would rather exclude the latest N revisions in a totally reliable way, as others have requested in this thread, but my dusty bash skills were not up to that. If anyone has CLI commands to do this, I'd love to see them!

     I'll also add my voice to those asking for an automated way to maintain only the N most recent revisions per type/folder. This is one of the main things I'm missing after the switch from Perforce.
  4. Thanks for the info @ryancassell. Softlocks sound like they could help us, so I'll keep my eyes open for news regarding this feature. Colin
  5. It seems that the default workflow with Plastic Cloud in Unity is to allow saving of local changes to a file that is locked by another user. There is a warning that checkout was not possible due to the lock, but this can easily be missed by someone in the midst of making lots of changes (as an aside, I see this warning for files I have checked out in the same workspace, when I re-save the scene, for example, which makes it tempting to write off the warning as not relevant). Otherwise, it can be difficult to spot that the modified file wasn't successfully checked out; Unity itself seems happy enough, and the file is shown as changed in the Pending Changes list in the Plastic plugin. Only at the point of attempting to commit the file is the user actually hard blocked from progressing, due to someone else holding the lock.

     While this is sufficient to ensure that the lock is respected and another user can't jump ahead and commit other changes, this process can cause a lot of wasted effort on the part of the user(s) without the lock. It is possible for such a user to work for a considerable amount of time on changes to an asset that they subsequently find is locked when they come to commit. It is then non-trivial to know how to proceed: whether to revert the changes to that asset and apply them differently, or await the lock being released, or, in the case of prefab changes, whether to unpack them so that they can be applied separately in the short term.

     Is there any way for us to block the user at the initial point of failing to check out the asset, please? A modal error pop-up or similar would be impossible to ignore. With the current workflow, I'm concerned that we'll keep finding that we've been working on assets we can't do anything with, due to a lock contention that we'd missed.

     I'm aware that we could try a branch-per-task workflow instead, which might alleviate some of these issues with locking. However, I have a few concerns with this: 1) My colleagues are not as familiar with version control and branching as I am. Your tools and UI seem to make this quite accessible, but I still worry that they would be intimidated by it and may make mistakes. 2) Branch per task will inevitably add time to each task, and this overhead might be quite significant for members of my team who make a lot of small changes. 3) It's been several years since I used UnityYAMLMerge, but I'm not convinced that it will always work without the need for manual intervention or (even worse) loss of data, so I'd be reluctant to lean on it multiple times per day throughout our production.

     Any insight on the above would be greatly appreciated. I'm very happy with our initial experience using Plastic Cloud overall, and I really just want to head off any pain points as early as possible in our project.

     Thanks in advance, Colin
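For reference, here is the archive pipeline from post 2 above, laid out stage by stage as a bash sketch. This is only an illustration of the structure of that command: repo@org@cloud and /external/archive are placeholders copied from the post, and the comments describe the intended behaviour as reported there (including the failure), not verified output.

    # List every removed revision in the cloud repo and format each hit as a
    # full revision spec (rev:revid:<id>@repo@org@cloud); --nototal keeps the
    # output to just the rev specs, as in the post.
    cm find "removed on repository 'repo@org@cloud'" \
        --format=rev:revid:{id}@repo@org@cloud --nototal \
      | paste -d " " - \
      | cm archive - --file=/external/archive
    # paste with a single '-' operand passes the rev specs through from stdin
    # one per line; cm archive is given '-' so that it takes those specs from
    # the pipeline, and --file points the archived data at the external drive.
    # As described in the post, this is the invocation that reports the
    # removed revisions as non-existent rather than archiving them.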
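And a similarly commented sketch of the size-threshold pipeline from post 3, under the same assumptions (placeholder repo and archive path); the caveats in the comments are the ones noted in that post rather than verified behaviour.

    # Find revisions larger than 1 MB (1,000,000 bytes); per the post,
    # returnparent = 'true' makes hits on HEAD revisions resolve to their
    # parent revisions instead, so the HEAD itself is never archived.
    cm find "revs where size > 1000000 and returnparent = 'true' on repository 'repo@org@cloud'" \
        --format=rev:revid:{id}@repo@org@cloud --nototal \
      | paste -d " " - \
      | cm archive - --file=/external/archive
    # Caveats from the post: the substituted parent revisions are not
    # themselves guaranteed to exceed 1 MB, and revisions whose child falls
    # below the threshold can be missed entirely.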