GitSync Error - item cannot be loaded in the workspace


Twin17

Hi all,
I'm trying to sync/migrate a huge repo, which was never migrated before, to a private Bitbucket instance, but the operation fails with the error shown below:
 

ACTIVE... |
- 0 changesets to pull
- 26738 changesets to push

ACTIVE... OK
There are changes to push.
The new item cannot be loaded in the workspace. Probably, the path is already used. Please unload the item (from the configuration view) and retry the operation.... OK
Keep source changes (preserve the add and discard the move)... OK
Error: Unable to read data from the transport connection: The connection was closed.

Steps taken to do the sync:
- cm sync <refspec> git <repo> --user=user --pwd=password
- cm sync <refspec> git <repo> --user=user --pwd=password --skipgitlfs
- On GUI: Select Branch > Right Click > Pull > Sync with Git 

Debugging steps I've tried:
- Remaking the destination repo from scratch (and also changing it)
- Tracing the cm and git processes (nothing unusual)
- Checking the debug logs on Bitbucket (see below)
- Adding the plasticpipeprotocol.conf file with various configurations
- Testing the process a whole lot of times

Meaningful Observations:
- The files are actually packed and sent to the Bitbucket server: after recreating the repo and "failing" the process once, it shows the exact size of the repo on disk (and in Bitbucket's "DiskSize Calculator"): ca. 1.5 GB.
- The error below is repeated a huge number of times in the debug log, all occurrences corresponding to the exact moment when the "upload" should be finalized:

error: Could not read 814070262e265e90dd2cdb9ea3d82146e0de2cf7
fatal: Failed to traverse parents of commit 987b65accfa419b454a73e23ff84bc8709f7c984

- Error happens both in UI and CLI

Can someone help me, or at least point me to other debugging steps I could try, or some workaround?

Thanks!


Hi,

Quote

The new item cannot be loaded in the workspace. Probably, the path is already used. Please unload the item (from the configuration view) and retry the operation.... OK

- Could you upgrade your Plastic SCM version to the most recent one? At least this error shouldn't appear then.

- Could you attach the full debug logs so we can review them? You can send them to me in private if necessary.

- Does the issue also happen if you try to push the repo to some other git cloud service (GitHub...)?

Regards,

Carlos.

 


Thanks for the reply Carlos,
I'll try the steps provided. It takes a while to run, but after updating, at least that error message no longer appears.

Will update back with any results, and if needed I'll forward you the debug logs (the ones from Bitbucket, right?)

Cheers!


Hi, update!
The output after the update changed to this:

Receiving references... -
- 0 changesets to pull
- 26918 changesets to push

Receiving references... OK
There are changes to push.
Exporting... OK
Packaging...... OK
Error: The response ended prematurely.

I forwarded privately the debug logs to you @calbzam.

Thanks


Hi,

- Sorry if I was not clear enough, but what we need is to review the Plastic client logs. If you are using "cm", you will need to manually enable the "cm" logs:

https://www.plasticscm.com/documentation/technical-articles/kb-enabling-logging-for-plastic-scm-part-i
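As a quick sketch, you can check that the log config is actually in place before re-running the sync. Note that `CLIENT_DIR` and the `cm.log.conf` file name here are assumptions for illustration; the article above gives the exact files and paths for your platform.

```shell
# Hedged sketch: verify a client log config exists before re-running the sync.
# CLIENT_DIR and the config file name are assumptions -- see the KB article
# above for the real files and locations on your platform.
CLIENT_DIR="${CLIENT_DIR:-$HOME/plastic-client}"
if [ -f "$CLIENT_DIR/cm.log.conf" ]; then
  LOG_STATUS="configured"
else
  LOG_STATUS="missing"
fi
echo "cm logging: $LOG_STATUS"
# if "missing", copy the sample config from the KB article next to the client
# binaries, then re-run the cm sync so the failure is captured in the logs
```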

If there is a problem in the Bitbucket side, I'm afraid we won't be able to help.

- My guess is Bitbucket could be rejecting the push from Plastic. You mentioned that the repo is huge. Does the repo have big files? Do you know if LFS is enabled in Bitbucket? Is it a cloud Bitbucket server?

- Can we try to push the Plastic repo to GitHub to check if the issue persists?

Regards,

Carlos.

 

 


Oh, sorry, the misunderstanding was on my side.

Will run with Plastic logs enabled and send you the logs; thanks for the pointer to the docs.

Anyway, the repo doesn't contain any single file bigger than 400 MB, and LFS is active, so that shouldn't be too much of an issue.
I will ask the dev responsible for the repo if we can do the GitHub test.

Thanks!


Hi,

- I can see the operation is aborted with no additional errors. Not sure if we can make a test pushing the Plastic repo to GitHub instead.

- How big is the repo database? I was wondering if you could share the repo with us to try and debug the same GitSync operation.

Regards,

Carlos.

 


On 5/20/2022 at 1:13 PM, calbzam said:

- I can see the operation is aborted with no additional errors. Not sure if we can make a test pushing the Plastic repo to GitHub instead.

I also had a look at the logs, and yes, I can confirm. Sorry to say, but it probably wouldn't be possible to upload the repo to GitHub; it's private data.

 

 

On 5/20/2022 at 1:13 PM, calbzam said:

- How big is the repo database? I was wondering if you could share the repo with us to try and debug the same GitSync operation.

Database in the folder rep_xxx is 1.12 GB, so nothing too crazy. If it's useful for you, I can package and send that to you too.

Thanks!


Quote

Database in the folder rep_xxx is 1.12 GB, so nothing too crazy. If it's useful for you, I can package and send that to you too.

If it's not a problem, please do; this way we can try the same operation as you to debug the issue.

Regards,

Carlos.


Hi Carlos, 
After some more analyzing, and also thanks to the tests you let me do, I found the culprit of the error. I'm updating here because it could also be useful for other people in the future.

This conclusion comes after my tests with the current mainline version: 11.0.16.6994
Bitbucket version: 7.20.0

Basically, the original Plastic SCM repository didn't contain any tags, so the sync process created an empty "tags" folder in the repo's .git data. In this case, cloning the repo via Sourcetree did work, but the web UI (which most collaboration tools use) was giving a 500 error.

[screenshot: Bitbucket web UI showing the 500 error]

After removing that folder and modifying a random file so it would actually commit and push, everything works again.
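For anyone hitting the same thing, the cleanup can be sketched like this, assuming filesystem access to the server-side repo data. `REPO_DIR` is a hypothetical placeholder path, and the `mkdir` line only recreates the empty directory for demonstration; on a real server you would just run the check and the `rmdir`.

```shell
# Hedged sketch: remove an empty refs/tags directory from the server-side
# repo data. REPO_DIR is a placeholder; point it at the actual repository
# directory on your Bitbucket server.
REPO_DIR="${REPO_DIR:-./demo-repo.git}"
mkdir -p "$REPO_DIR/refs/tags"   # demo setup only: an empty dir like the one GitSync left
TAGS_DIR="$REPO_DIR/refs/tags"
# only remove the directory if it exists and is truly empty
if [ -d "$TAGS_DIR" ] && [ -z "$(ls -A "$TAGS_DIR")" ]; then
  rmdir "$TAGS_DIR"
  echo "removed empty $TAGS_DIR"
fi
```

After that, commit and push a trivial change so the refs on the server get rewritten.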



@calbzam Is this something fixable/patchable? My first naive idea would be to add a "gitsync" tag during the migration if the repo has no tags.


Quote

 but the webUI (which is used for most collaboration tools) was giving a 500 Error.

If I understand correctly, you were able to sync the repo with GitHub, but you are still facing this problem accessing the repo in Bitbucket via the web UI? Can you let me know the steps to reproduce it?

At this point, I'm not 100% sure if the bug is in the Bitbucket web UI or if we can somehow avoid creating this empty "tags" folder. Anyway, can you confirm whether creating a test tag avoids the issue?

Regards,

Carlos.


4 minutes ago, calbzam said:

If I understand correctly, you were able to sync the repo with GitHub, but you are still facing this problem accessing the repo in Bitbucket via the web UI? Can you let me know the steps to reproduce it?

Yup.
The steps were the usual ones, the same cm sync I stated in the OP.

 

 

4 minutes ago, calbzam said:

At this point, I'm not 100% sure if the bug is in the Bitbucket web UI or if we can somehow avoid creating this empty "tags" folder. Anyway, can you confirm whether creating a test tag avoids the issue?

Edit: Actually, there are a lot of tags in the repo, so my guess was wrong. But at this point, what's the empty tags folder for? The same tags are also present in git; they were synced successfully.

[screenshot: tags present in the repository]


Hi,

This is a bit strange, but I was able to sync the repo with GitHub and I can access the repo via the GitHub web UI with no issues (I can invite you to the private repo if you want to check it). I can also see that the tags were properly synced:

[screenshot: tags synced to GitHub]

 

I'm wondering if there is a problem with the Bitbucket UI.

Regards,

 


Probably yes, it would be a Bitbucket-UI-only issue. I guess I'll have to contact their support to let them know.

At least this migration works now! Thanks for all the support!


@calbzam Sorry for bringing up this post again. It seems there is another issue with the sync.

So after the sync was done, we waited a bit, made some commits, and then synced again, and this happened:
- The sync fails with "/main [other 98 branches], have more than one head. Please merge them to be able to continue synchronizing" 
- In the branch explorer commits are doubled (see screen below)

[screenshot: branch explorer with duplicated commits]

Any clue as of why this may happen?


Hi,

I can see all the repo history has been duplicated. Are you running the GitSync operation always from the same Plastic client? Once the mappings have been created on a Plastic client, you need to always run the operation from that same client, and the sync should always involve the same Plastic and git repos.

Regards,

Carlos.


No, we ran the sync from two clients; I noticed afterwards that this creates this kind of error. We're working on a solution, which has mostly worked, but there are some stuck duplicates that cannot be deleted.


@calbzam Sorry to be here again.

So here's what we've done to solve the problem, but it's still not completely solved.

We identified ALL the duplicated changesets and ran a script which iterated through the list to remove them.
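For illustration, a minimal sketch of such a cleanup script (not our exact one). The `duplicates.txt` list, one changeset number per line, is an assumption here, and it must be sorted newest-first so children are deleted before their parents; also verify the `cm changeset delete` subcommand against your cm version.

```shell
# Hedged sketch: delete duplicated changesets listed in duplicates.txt
# (one changeset number per line, sorted newest-first so a changeset is
# never removed while it still has children).
LIST="${LIST:-duplicates.txt}"
DELETED=0
if [ -f "$LIST" ]; then
  while read -r csid; do
    # cm refuses to delete a changeset that is still the parent of another
    cm changeset delete "cs:$csid" && DELETED=$((DELETED + 1)) \
      || echo "cs:$csid could not be deleted (has children?)"
  done < "$LIST"
else
  echo "no duplicate list found at $LIST"
fi
echo "deleted $DELETED changeset(s)"
```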

Still, some are stuck and cannot be removed because they have children, and obviously the children are other bugged-out changesets. (Example below)

[screenshot: stuck duplicated changesets]

 

 

Whenever we try to remove one of those, it gives the following:

[screenshot: error message when deleting the changeset]

 

 

Where can we go from here?

Thanks in advance


Hi,

I would create a new Plastic repo from scratch and re-sync from the git one to regenerate the data. At this point, I can see the current Plastic repo is broken (with duplicated history). It's not easy to fix manually, as you cannot delete a changeset if it's the parent of a different changeset.

Not sure if you have backups of the Plastic repos; that may also be a solution.

Regards,

Carlos.


Sadly, the backups we have now are quite old; the project is still under active development, and as of now we would lose weeks of work. The same applies to re-syncing from the git repo.

We would keep it like this, if it weren't for the multiple-heads error on every new sync attempt. Would the manual fix be *impossible*, or just very hard to achieve?


Hi,

- The problem is that Plastic doesn't allow syncing a repo if it has more than one head. The ideal solution is to delete the duplicated changesets from the last one back to the beginning (because it's not possible to delete a changeset if it's the parent of some other changeset).

- You could potentially merge the duplicated heads in each branch (instead of deleting them), but I'm not sure if it's actually doable, assuming there are tons of branches with the duplicated history in the repo.

Regards,

Carlos.


I will have a look if any of the two is applicable. 

Actually, the bulk delete left only a few duplicated heads, around 15, so it would be doable to merge them back. We'll see!

Thanks for the extensive support for now!

