Neha Khalwadekar Posted July 26, 2021

On behalf of the Plastic SCM team, we are extremely delighted to announce the launch of Dynamic Workspace Alpha! Check out our announcement blog post. Here are a few short videos to get you started:

Short demo: https://www.youtube.com/watch?v=4QBOZjk_vjg
How to configure and enable it: https://www.youtube.com/watch?v=YicarumSFdg
Feature: interact with the filesystem: https://www.youtube.com/watch?v=PEwIb4fBLrU
Feature: interact with the filesystem (Part II): https://www.youtube.com/watch?v=gb5GUuyuQrg

Why does it matter? Your local cache will now get populated based on the tools your team members used. This helps during the very first update, when your cache is empty. Instead of downloading everything, or downloading files one by one, it can boost downloads based on "which files are typically accessed together with this one".

Feel free to provide feedback in the comments! We are actively working on adding more features and improvements, and we really value your feedback. Thank you.
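The "accessed together" heuristic can be illustrated with a toy co-occurrence counter. This is purely an illustrative sketch, not Plastic SCM's actual algorithm; the session data, file names, and function names below are all invented:

```python
# Toy sketch of co-access-based prefetching (NOT Plastic SCM's real
# implementation). Given past sessions of files accessed together,
# suggest which files to prefetch when one file is requested.
from collections import Counter, defaultdict

def build_coaccess(sessions):
    """Count how often each pair of files appears in the same session."""
    co = defaultdict(Counter)
    for session in sessions:
        for a in session:
            for b in session:
                if a != b:
                    co[a][b] += 1
    return co

def prefetch_candidates(co, requested, k=2):
    """Return up to k files most often accessed alongside `requested`."""
    return [name for name, _ in co[requested].most_common(k)]

# Invented example sessions: sets of files a teammate touched together.
sessions = [
    {"Player.cs", "Player.prefab", "Player.png"},
    {"Player.cs", "Player.prefab"},
    {"Enemy.cs", "Enemy.prefab"},
]
co = build_coaccess(sessions)
print(prefetch_candidates(co, "Player.cs"))  # -> ['Player.prefab', 'Player.png']
```

The real feature presumably works on server-side access telemetry rather than explicit session sets, but the ranking idea is the same: when a file is requested, warm the cache with its most frequent co-accesses.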
Mikael Kalms Posted August 4, 2021

Very glad to see this go Alpha! I understand that this is a rather niche question, and understand if you aren't yet in a position to give a relevant answer, but: do you intend to make the Dynamic Workspace feature (including the plasticfs driver) operate within Windows containers? What about Linux containers?
Mikael Kalms Posted August 11, 2021

How do I change plasticfs' backing storage location from %LOCALAPPDATA%\plastic4 to somewhere else?

We typically set up our workstations like this: C for applications, D for development. When we begin to use plasticfs, any files that are fetched or created locally (think "build output") will take up space on the C drive instead of D. We will run out of space on C quickly, due to the intermediate files that get produced when building stuff.
calbzam Posted August 12, 2021

Hi,

You can change the cache path with the following flag: --cachepath PATH

1) Via the command line:

$ plasticfs --cachepath PATH

2) Or create a shortcut for PlasticFS and add the parameter (right-click > Properties > Target).

Regards,
Carlos.
Mikael Kalms Posted August 12, 2021

Thank you, that works.

Bug report: plasticfs does not cooperate well with Total Commander ( https://www.ghisler.com/ ). If I have a *.bat file within the plasticfs file system and I attempt to launch it via Total Commander, I get an error message [screenshot]. The same happens if I try to launch an exe file [screenshot].

Also, when deleting a file, Total Commander first attempts to move it to the recycle bin. That fails, which prompts Total Commander to display a dialog asking if I want to delete the file permanently. The permanent delete succeeds.

I can launch batch files & executables via Windows Explorer. A "move to recycle bin" delete operation via Windows Explorer triggers a permanent delete directly (with a popup). I presume this means that plasticfs simply doesn't support the concept of the recycle bin.
calbzam Posted August 13, 2021

Hi,

- You are right, plasticfs doesn't currently support the recycle bin.
- Regarding the Total Commander report, the paths from the screenshots don't refer to mount points; they are already resolved by Windows. Could you share some reproduction steps so we can try to debug it?

Regards,
Carlos.
Mikael Kalms Posted August 26, 2021

Sorry for the delay. Here are reproduction steps:

1) Ensure you have a dynamic workspace, in this example at C:\MyDynamicWorkspace.
2) Create a batch file at C:\MyDynamicWorkspace\Hello.bat, with the following content: "echo hello".
3) Download and install Total Commander from https://www.ghisler.com/ .
4) Open a command prompt; run C:\MyDynamicWorkspace\Hello.bat; observe that the command window runs the batch file successfully.
5) Open Total Commander; navigate to C:\MyDynamicWorkspace; double-click on Hello.bat; observe that you get the following error message: [screenshot]

(The above screenshot says D drive. When I tested this, I used an existing dynamic workspace on my D drive. I have also moved my plasticfs cache folder to the D drive.)
ollieblanks Posted August 27, 2021

Hi Mikael,

Unfortunately this is a limitation of how we are mounting the workspace. Since 10.0.16.5882, you can work around it by creating the workspace with a network path, e.g. \\workspaces\MyWorkspace. This method uses the Windows Mount Manager and therefore works seamlessly with software like Total Commander.

Hope this helps!
Khaled Posted August 28, 2021

Very good news! How can we use Dynamic Workspace with TeamCity? I think this will give a huge boost to the build pipeline.
phvalve Posted September 8, 2021

I can see a use for this for us once we're able to trim out older revisions. Do you think that's possible? Are you perhaps planning to add a status column, similar to OneDrive's, that shows whether the file is downloaded, locked by a user, changed locally, etc.?
ollieblanks Posted September 15, 2021

@Khaled, dynamic workspaces with TeamCity is a fantastic idea. As this is still a very young feature, it is not available in the TeamCity integration yet, but I will be sure to raise this with the team.

@phvalve, there is already a status column in the Workspace Explorer in the Plastic SCM client that shows whether a file is controlled, ignored, contains changes, etc. The only caveat is that it does not show whether a file is locked by another user (only Gluon does this currently). I agree that, as this feature progresses, showing the "dynamic" status of files to the user would be valuable. This again is great feedback and I will raise it with the team.
Mikael Kalms Posted October 22, 2021

Today, I deleted a dynamic workspace via the Plastic SCM client. I noticed that the disk space was not reclaimed immediately. I thought "huh, that is odd". Then I closed the Plastic SCM client. This triggered a BSOD: PAGE_FAULT_IN_NONPAGED_AREA. This is the first BSOD that I have had with this workstation in a long time; I think the first in 12+ months.

After restarting my machine, I notice that the Dynamic Workspaces feature doesn't seem to remove plasticfs workspace storage content for me. When I delete a dynamic workspace, the folder remains with all its contents within the "plasticfs-workspace-storage" folder, and an entry is added to the "dynamic-wk-to-delete.conf" file with the folder name + a date. I have verified this once by creating & removing a dynamic workspace since the bluescreen.

I have now deleted the unused storage folders + the config file manually, to get the disk space back. I notice no other weird results. Normal interactions with a preexisting dynamic workspace seem to behave as expected.

I can't say for sure if deleting dynamic workspaces always behaved like this, or if this is a new phenomenon... but I think it has always behaved like this for me.
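Until the client cleans these up itself, leftover storage folders like the ones described above could be purged with a small script. Caution: the format of dynamic-wk-to-delete.conf is undocumented; this sketch assumes each line starts with the storage folder's name followed by a date, which is an assumption, not confirmed behaviour, and the paths used are invented:

```python
# Hypothetical cleanup of leftover dynamic-workspace storage.
# ASSUMPTION: each conf line is "<folder-name> <date>"; the real format
# of dynamic-wk-to-delete.conf is not documented, so verify before use.
import shutil
import tempfile
from pathlib import Path

def purge_leftover_storage(storage_root: Path, conf_file: Path) -> list[str]:
    """Delete every storage folder named in the conf file; return their names."""
    purged = []
    if not conf_file.exists():
        return purged
    for line in conf_file.read_text().splitlines():
        parts = line.split()          # breaks on folder names with spaces
        if not parts:
            continue
        folder = storage_root / parts[0]
        if folder.is_dir():
            shutil.rmtree(folder)
            purged.append(parts[0])
    conf_file.unlink()                # pending-delete list fully processed
    return purged

# Demo against a throwaway directory standing in for
# %LOCALAPPDATA%\plastic4\plasticfs-workspace-storage:
root = Path(tempfile.mkdtemp())
(root / "wk_abc").mkdir()
conf = root / "dynamic-wk-to-delete.conf"
conf.write_text("wk_abc 2021-10-22\n")
removed = purge_leftover_storage(root, conf)
print(removed)  # -> ['wk_abc']
```

Running something like this against the real storage folder should only be done while plasticfs is stopped, since the driver may still hold the folders open.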
Xarbrough Posted November 17, 2021

A philosophical question, if I may: will this feature be intended to support the concept of monorepos? It seems like a lot of big companies are using a single huge repo instead of many smaller ones because it makes dependency tracking and code sharing easier, so our company is considering this as well for the future. So far, however, it seems that if you're not Google, you can't afford the infrastructure to support such large repositories, especially if they also need to work with remote working nowadays.

So, would you say this will make a monorepo feasible for e.g. indie game studios that use Unity and share a dozen or so projects with a team size of 20 people?