
GitSync Error - item cannot be loaded in the workspace


Twin17

Hi all,
I'm trying to sync/migrate a huge repo, which was never migrated before, to a private Bitbucket instance, but the operation fails with the error shown below:
 

ACTIVE... |
- 0 changesets to pull
- 26738 changesets to push

ACTIVE... OK
There are changes to push.
The new item cannot be loaded in the workspace. Probably, the path is already used. Please unload the item (from the configuration view) and retry the operation.... OK
Keep source changes (preserve the add and discard the move)... OK
Error: Unable to read data from the transport connection: The connection was closed.

Steps taken to do the sync:
- cm sync <refspec> git <repo> --user=user --pwd=password
- cm sync <refspec> git <repo> --user=user --pwd=password --skipgitlfs
- On GUI: Select Branch > Right Click > Pull > Sync with Git 

Debugging steps I've tried:
- Remaking the destination repo from scratch (also changing it)
- Tracing the cm and git process (nothing unusual)
- Checking the debug logs on bitbucket (see below)
- Adding the plasticpipeprotocol.conf file with various configurations
- Repeating the process many times

Meaningful Observations:
- The files are actually packed and sent to the Bitbucket server; after recreating the repo and "failing" the process once, Bitbucket shows the exact on-disk size of the repo (also in its "DiskSize Calculator"): ca. 1.5 GB
- The error below is repeated an enormous number of times in the debug log, all corresponding to the exact moment when the upload should be finalized:

error: Could not read 814070262e265e90dd2cdb9ea3d82146e0de2cf7
fatal: Failed to traverse parents of commit 987b65accfa419b454a73e23ff84bc8709f7c984

- Error happens both in UI and CLI
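Those two git messages usually mean the receiving side cannot read an object while walking commit history. As a hedged sketch, this is how one can probe whether Git can read a given object ID; the SHAs are the ones from the debug log above, and the temporary repo is only a stand-in so the snippet is runnable anywhere (in practice you would run the checks inside a clone of the affected repo):

```shell
# Probe whether Git can read specific objects; temp repo is a stand-in.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git -c user.name=probe -c user.email=probe@example.com \
    commit -q --allow-empty -m "probe commit"

# cat-file -e exits 0 only when the object exists and is readable.
good=$(git rev-parse HEAD)
git cat-file -e "$good" && echo "readable: $good"

# The object Bitbucket could not read; in a healthy clone this check
# would succeed, so a non-zero exit here mirrors the server-side error.
bad=814070262e265e90dd2cdb9ea3d82146e0de2cf7
git cat-file -e "$bad" 2>/dev/null || echo "unreadable: $bad"

# fsck walks all reachable objects and reports any corruption.
git fsck --full
```

Running `git fsck --full` in a fresh clone of the pushed repo (if Bitbucket allows cloning the partial push) would show whether the missing object ever arrived.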

Can someone help me, or at least point me to other debugging steps I could try, or a workaround?

Thanks!


Hi,

Quote

The new item cannot be loaded in the workspace. Probably, the path is already used. Please unload the item (from the configuration view) and retry the operation.... OK

- Could you upgrade your Plastic SCM version to the most recent one? At least this error shouldn't appear then.

- Could you attach the full debug logs so we can review them? You can send them to me privately if necessary.

- Does the issue also happen if you try to push the repo to some other Git cloud service (GitHub...)?

Regards,

Carlos.

 


Thanks for the reply Carlos,
I'll try the steps provided. It takes a while to run, but yes, after updating, at least that error message no longer appears.

I'll report back with any results, and if needed I'll forward you the debug logs (the ones from Bitbucket, right?)

Cheers!


Hi, update!
After the update, the output changed to this:

Receiving references... -
- 0 changesets to pull
- 26918 changesets to push

Receiving references... OK
There are changes to push.
Exporting... OK
Packaging...... OK
Error: The response ended prematurely.

I forwarded privately the debug logs to you @calbzam.

Thanks


Hi,

- Sorry if I was not clear enough, but what we need to review is the Plastic client logs. If you are using "cm", you will need to manually enable the "cm" logs:

https://www.plasticscm.com/documentation/technical-articles/kb-enabling-logging-for-plastic-scm-part-i
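For context, client logging in Plastic SCM is log4net-based per the linked article. A rough sketch of what such a configuration file can look like (the file name cm.log.conf, the appender settings, and the log path are assumptions here; follow the article for the exact template to place next to the client binary):

```xml
<log4net>
  <!-- Assumed appender; the KB article provides the exact template. -->
  <appender name="DebugAppender" type="log4net.Appender.RollingFileAppender">
    <file value="cm.log.txt" />
    <appendToFile value="true" />
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date %-5level %logger - %message%newline" />
    </layout>
  </appender>
  <root>
    <!-- DEBUG captures the sync internals needed to diagnose the push. -->
    <level value="DEBUG" />
    <appender-ref ref="DebugAppender" />
  </root>
</log4net>
```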

If there is a problem on the Bitbucket side, I'm afraid we won't be able to help.

- My guess is that Bitbucket could be rejecting the push from Plastic. You mentioned that the repo is huge. Does the repo have big files? Do you know if LFS is enabled in Bitbucket? Is it a cloud Bitbucket server?

- Can we try to push the Plastic repo to GitHub to check if the issue persists?

Regards,

Carlos.

 

 


Oh, sorry, the misunderstanding was on my side.

I'll rerun with Plastic logs enabled and send them to you; thanks for the pointer to the docs.

Anyway, the repo doesn't contain any single file bigger than 400 MB, and LFS is active, so that shouldn't be too much of an issue.
I will ask the dev responsible for the repo if we can run the GitHub test.
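To double-check the size claim above, one can list files over a threshold straight from the working tree. The stand-in directory and the 1 MB threshold below are illustrative assumptions so the snippet runs anywhere; against the real repo you would run it from the workspace root with `-size +400M`:

```shell
# Illustrative sketch: find files above a size threshold, largest first.
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/big.bin"   bs=1024 count=2048 2>/dev/null  # ~2 MB
dd if=/dev/zero of="$tmp/small.bin" bs=1024 count=1    2>/dev/null  # ~1 KB

# Only big.bin clears the 1 MB threshold; du/sort orders by size.
find "$tmp" -type f -size +1M -exec du -h {} + | sort -rh
```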

Thanks!

