

Popular Content

Showing content with the highest reputation since 10/10/2011 in all areas

  1. 25 points
    Labeling in version 4.0 is completely different, yes, so how is the migration of labels and their history handled if you're migrating from version 3.0 to 4.0? I'm asking because we recently exported/imported a 3.0 repository over to 4.0, but the labels were all messed up.
  2. 24 points
    If your brain can handle all the secrets of the recursive merge scenarios, we have a fast webinar for you! Join us or repent later! 6:00 PM - 7:00 PM CEST / 9:00 AM - 10:00 AM PDT https://www2.gotomeeting.com/register/114177594
  3. 7 points
  4. 5 points
    It's out! www.semanticmerge.com What do you think about it? Reddit discussion here: www.reddit.com/r/programming/comments/1ck8m3/c_semantic_merge_beta_just_launched/
  5. 4 points
    Hello, Please check the following document: https://www.semanticmerge.com/documents/SemanticMerge-TFS.pdf Regards, Carlos.
  6. 4 points
    Hi there! I found this great plugin/extension created by you, but I'm late: we are using VS 2012. How can I install Method History for Subversion into the latest VS 2012?
  7. 4 points
    After following the instructions to configure Semantic Merge as the diff tool, I get an error that looks like the following when right-clicking a shelved file and clicking "Diff against source revision":

    Could not find file '{path}\SomeCSharpFile@'

    Looking at the log in Perforce I see the following command:

    p4 diff2 {path}/SomeCSharpFile.cs#65 {path}/SomeCSharpFile.cs@=418368

    It appears that SemanticMerge is not handling the @=418368 revision specifier at the end of the second file. Diffing unshelved files works fine for me.
  8. 4 points
    Hi, I'm using a Plastic server deployed to Amazon, so the connection from the clients to it is done over SSL. Upon first connection, the client gets the 'Hostname mismatch in secure connection' warning and I click OK. I installed a Plastic proxy locally and configured the client to use it, but it looks like the proxy is unable to establish a connection with the server. From its log file:

    2013-07-02 14:05:11,162 (null) (null) ERROR Codice.CM.Server.ExceptionTracerSink - Dumping in-transit exception: Cannot create channel sink to connect to URL 'ssl://ip-XXX-XXX-XXX-XXX:8088/ItemHandler'. An appropriate channel has probably not been registered.
    2013-07-02 14:05:11,256 (null) (null) INFO ChannelCall - recb: 1271|rect: 0|sentb: 2203|sendt: 0|prt: 172|th: 4476|dest: 0|mt: 125|sert: 16|zip: 0||GetObjectsData
  9. 3 points
    Hi all, I have the latest Jenkins and latest Plastic plugin for it. I am attempting to set up a Unity build machine. Anyway, each time I build, the Plastic Jenkins plugin is removing the existing workspace and re-downloading it in its entirety. This is even with the "Use Update" flag set to true in the project configuration. This is not what I want as our build is capable of working incrementally and just switching to the head of /main each time we want a new build. The time difference is like 3m vs 40m. This seems like a bug. I know that it used to be a bug around v2 of the plugin, but was since fixed. Can someone help?
  10. 3 points
    Hi, I've tried to download the version for Linux, but the actual link (https://www.plasticscm.com/download/) returns 404, page not found, and I can't install this version with dnf (I use Fedora) because the repository only offers the latest version. Is it possible to download this version from another link or another Fedora repository? Thanks a lot!
  11. 3 points
    Write to us at support and we'll give you another.
  12. 3 points
    Installed yesterday, now it says it is corrupted...
  13. 3 points
    Hello SCM Team! Does the Unlimited License mean I also get perpetual updates, or do I need the subscription for those? Best regards, Michael
  14. 3 points
    Three more things to mention:-

    1. The F# Compiler Services library wasn't that well developed when I used this, but I'm fairly certain it's improved hugely since then. I would strongly recommend updating the NuGet dependency to track the latest version. There are other projects that use this library too; it's worth taking a look around to see what they do with it.

    2. There is a comment in 'FileProcessor.fs':-

    type LineSpan = pos * pos // the second is a zero-relative character position within the line.
    // The second sub-pair points one past the end of the construct,
    // so the interval is [). It may either point past the end on the
    // same line, or may point to the zeroeth character position on the
    // next line, this depends on whether the construct is contained
    // within one line or spans several.
    type CharacterSpan = int * int // [] interval using zero-relative character position.

    The first part of this is *wrong* - the line spans used in the Semantic Merge specification are [] intervals - so the second sub-pair points to the end of the construct (so if the line span is empty, it points one character in front of the start). I fixed this bug in the Scala plugin, but never got round to it in the F# plugin. It might be a useful familiarisation exercise if you fixed this bug yourself - just bear in mind that the spans that come from the F# compiler services parser really do obey that comment - in other words, the spans work slightly differently in the F# compiler services and Semantic Merge. If you read Scala, take a look at the Scala plugin, which is a more developed port of the F# plugin - again, the Scala presentation compiler also has the same slightly different notion of spans, so the Scala plugin has to translate the spans - the F# plugin needs to have this fix back-ported into it.

    3. What's definitely missing from the F# plugin is just a lot, a *lot*, more pattern matching on the various constructs of the abstract syntax tree - I stopped at the highest level, namely 'SynModuleDecl' for entire module constructs - what needs to be done is to decompose the sub-parts of the tree more to pull out classes and functions. Again, the Scala plugin does this and can be used as a reference, although I will say now that it's a lot easier to do this with the Scala presentation compiler than it was with the F# compiler services - but hopefully this may have changed on the F# side since I took a look. If not, it's a tedious but straightforward slog through lots of recursive pattern matching on the abstract syntax tree. I'm fairly sure that the F# source code formatter does exactly the same thing - I'd recommend taking a look at it; you may be able to either lift source code or even call it to do the heavy lifting for you.
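To make the span mismatch in point 2 concrete, here is a small Python sketch of the translation the Scala plugin performs and the F# plugin still needs. The function name and the tuple shapes are illustrative, not taken from either plugin:

```python
def to_inclusive(span):
    # span: ((start_line, start_col), (end_line, one_past_end_col)),
    # with zero-relative columns and the end pointing one past the
    # construct, which is how the F# compiler services report spans.
    # Semantic Merge instead expects inclusive [] spans whose end
    # points at the last character of the construct.
    (start_line, start_col), (end_line, one_past_end_col) = span
    if one_past_end_col == 0:
        # The exclusive bound sits at column 0 of the next line, so the
        # construct really ends on the previous line; recovering the
        # final column needs the line lengths, which this sketch omits.
        raise ValueError("need line lengths to step back over a line break")
    return ((start_line, start_col), (end_line, one_past_end_col - 1))

# A construct occupying columns 4..9 of line 3 is reported as
# ((3, 4), (3, 10)) by the parser and becomes ((3, 4), (3, 9)).
```

The line-break case is exactly the subtlety the comment describes: a real fix has to consult the source text to step the end position back across the newline.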
  15. 3 points
    Hi Nymaen - I'm the person who wrote that plugin - I'm glad you're taking an interest. You are right, I'm not actively developing that plugin any more, but please do fork the Git repo and take it further - that would be great. I'm happy to try to answer questions you may have in getting up to speed, so please post to this thread if you have more.

    Semantic Merge (and Plastic SCM) use a different configuration these days to incorporate external parsers - there was a post on the Codice blog not so long ago - see http://codicesoftware.blogspot.com/2015/09/custom-languages-in-semantic-version.html. That post is quite comprehensive - I use it myself for the companion Scala plugin, so on my machine I have the 'externalparsers.conf' file with contents of:-

    .sc=C:\Workspaces\Neptunium\target\neptunium.cmd
    .scala=C:\Workspaces\Neptunium\target\neptunium.cmd

    In this example, 'neptunium.cmd' is a self-executing JAR with a Windows command script preamble that runs the plugin on Scala files with suffix '.scala' and '.sc'. OK, that's for the Scala plugin - but you want to work on the F# plugin. In your case, you would build the Visual Studio solution from your cloned Git repository; the solution contains two projects:-

    1. FSharpPlugin - this is the driver program that contains a 'Main' that is run from either Semantic Merge or Plastic SCM in the manner described by the blog post I referenced above. Its job is to run in a loop and process requests from either product, handing off a pair of input and output file paths to the 'real' parser that is provided by the library 'StructureDiscovery'. The input file path will reference some file (it might be a temporary one, or might be part of your workspace - it depends on what you are doing in the GUI; the point is that it should be an F# source file), and the output file path will be the path that Plastic SCM / Semantic Merge expects the plugin to write the transcribed YAML for that source file to. This is a very simple piece of code that is really a shim around the real parser.

    2. StructureDiscovery - this is a library where the real work takes place. Right now there is a single module, 'FileProcessor', in it, which contains an entry point 'DiscoverStructure' - this is what the driver calls. 'DiscoverStructure' uses the open source F# compiler services framework to build an abstract syntax tree describing the input F# source file. See http://fsharp.github.io/FSharp.Compiler.Service for an introduction. It then transforms that syntax tree into YAML by recursive decomposition, starting with the function 'yamlForOverallStructure' that is applied to the top of the syntax tree. There is a bit of fixing up of the syntax tree that is done before that function is called - this is done with 'adjustSpansToCoverInputFile', which makes sure that the spans cover *all* of the text in the file without leaving any holes - you get these holes when you use an abstract syntax tree; it's the same for both the F# and Scala parsers. If you're wondering what 'spans' are, read both the Codice documentation and the F# compiler services documentation - both use the same concept, although they have their own ways of representing them.

    Anyway, you build the Visual Studio solution, and the resulting command line executable, 'FSharpPlugin.exe', is what you'd refer to in the external parsers config file, so you'd end up with something like this:-

    .fs=C:\Workspaces\SemanticMergeFSharpPlugin\FSharpPlugin\bin\debug\FSharpPlugin.exe
    .fsx=C:\Workspaces\SemanticMergeFSharpPlugin\FSharpPlugin\bin\debug\FSharpPlugin.exe

    At this point, if you run Semantic Merge, it should pick up the plugin and do its stuff. I'm guessing that the SDK you refer to is either the description of the plugin protocol - see the first link above for that - or the F# compiler services - see the second. The Visual Studio project uses NuGet to pull down the F# compiler services dependencies, so that should work straight out of the box. If you run into problems with the configuration, the Plastic folks will definitely be able to help you, but I can also try to give you pointers. All the best, Gerard
  16. 3 points
    Yep, already removed. Thanks for the reports.
  17. 3 points
    Hi, my Jenkins and Plastic SCM are working completely fine. "Use Update" works OK as long as I am not changing the selector value (sometimes I need to tune the branch). Could you please tell me what the reason is for completely deleting the workspace in such a case and re-creating it from scratch? Generally it would not be a big problem, except that my repo is quite big, and when the workspace is deleted my cloaked, ignore and hidden_changes configuration is also completely removed. Especially since in the GUI I can switch between branches and even repositories and/or servers and there is no need to re-create the workspace. Thanks in advance for the answer.
  18. 3 points
    Hi, I'm finishing off an open source Scala plugin for Semantic Merge; I've attached some teaser screenshots with this post. It already allows Semantic Merge to successfully diff and merge Scala source files; however, I am currently working through replacing the TODOs with descriptions of the language constructs. The code is over at GitHub:- https://github.com/sageserpent-open/SemanticMergeScalaPlugin.git I'll think of a nice way of distributing this plugin; currently you'll have to build it for yourselves, which isn't difficult, but we'd all like a more shrink-wrapped product. Enjoy! Gerard
  19. 3 points
    Hi, the Plastic SCM review system is pretty basic and most of the time it does not suit the purpose (at least for me):
    - It doesn't work on xlinks.
    - I wanted to submit the difference between 2 changesets for review (no idea how to do it).
    Because of the above shortcomings I wanted to integrate it with 'ReviewBoard', but it doesn't work. After installing ReviewBoard and the Plastic plugin, when I select 'PlasticSCM' as the repository, it just keeps trying to do something (the loading animation doesn't end) but never completes. Is it a problem with ReviewBoard or the Plastic plugin? Regards, Abhirav Kushwaha
  20. 3 points
    We have a workspace for a whole project; part of this turns out to be re-usable, and as such we want to move it out of its current repository into its own, and set up an Xlink so that the existing repository uses the library from its new location. We shall then also set up Xlinks to use the library in 2 new project repositories. The question is: how can I move a directory (with history) from one repository to another? One approach that I've considered is to copy the whole existing repository (somehow), then move the relevant library directory to the root and delete the rest of the repository's content. What's the best approach? Is there support for this kind of refactoring? Andrew.
  21. 3 points
    I'm still getting this url4short spam - every time I start a new browser session, the first time I open a search result to PSCM from Google.
  22. 3 points
    In short, where can I download a Linux 64 client for

    Details: I've just joined a team who are all using

    Forcing everyone else to upgrade isn't really a viable option. No-one else has a Linux 64 installer, so I tried to download it, but I only see installers for .37 and .38. I tried .37 but that won't work: both the command-line client and the GUI complain of a version mismatch. The GUI tries to download the appropriate client version automatically, but it fails with a 404. There's a pretty clear warning on the downloads page that I shouldn't use .38. Given my lack of other options I gave it a go and, to my surprise, it seems to work OK. However, I'm hugely worried that it'll cause some sort of nasty explosion further down the line. Is it safe to use the .38 client against a .35 server? I don't need/want a local server; I just need to be able to commit changes to the remote .35 server. Any changes I commit to the server must remain compatible with other users who are using the .35 client. It seems odd to me that the .37 client can't work with a .35 server. Am I missing something? Surely it's common for members of large, distributed teams to be using slightly different versions at the same time? Is there any good reason why older versions are not made readily available on the downloads page? It would have saved me hours of fighting and frustration if I'd been able to download the version I wanted straight away. Plastic is new to me, and even if all the problems I've had are my own fault, it would have removed a lot of doubt in my mind if I had been able to use a version that I knew was compatible with the rest of the system. Thanks.
  23. 3 points
    Within the VCS root page of TeamCity I can successfully test the connection to Plastic SCM 4, but when I try to build my project I keep on getting the error shown below.

    Checking for changes
    [13:09:48]Publishing internal artifacts
    [13:09:48][Publishing internal artifacts] Sending build.start.properties.gz file
    [13:09:48]Clearing temporary directory: C:\TeamCity\buildAgent\temp\buildTmp
    [13:09:48]Checkout directory: C:\TeamCity\buildAgent\work\7eca2f3a253d1fa
    [13:09:48]Updating sources: server side checkout (1s)
    [13:09:48][updating sources] Will perform clean checkout. Reason: Agent doesn't have any version of the project sources
    [13:09:48][updating sources] Building clean patch for VCS root: Plastic SCM
    [13:09:49][updating sources] Failed to build patch for build #6 {build id=2600}, VCS root: repository "eRecruitment@" path "/" smartbranch "/main" {instance id=27, parent id=10}, due to error: Error updating vcs root Plastic SCM:
    [13:09:49][updating sources] Repository sources transferred
    [13:09:49]Will repeat attempt when server will be available, number of attempts left: 2
    [13:09:59]Updating sources: server side checkout
    [13:09:59][updating sources] Will perform clean checkout. Reason: Agent doesn't have any version of the project sources
    [13:09:59][updating sources] Building clean patch for VCS root: Plastic SCM
    [13:10:00][updating sources] Failed to build patch for build #6 {build id=2600}, VCS root: repository "eRecruitment@" path "/" smartbranch "/main" {instance id=27, parent id=10}, due to error: Error updating vcs root Plastic SCM:
    [13:10:00][updating sources] Repository sources transferred
    [13:10:00]Will repeat attempt when server will be available, number of attempts left: 1
    [13:10:10]Updating sources: server side checkout
    [13:10:10][updating sources] Will perform clean checkout. Reason: Agent doesn't have any version of the project sources
    [13:10:10][updating sources] Building clean patch for VCS root: Plastic SCM
    [13:10:11][updating sources] Failed to build patch for build #6 {build id=2600}, VCS root: repository "eRecruitment@" path "/" smartbranch "/main" {instance id=27, parent id=10}, due to error: Error updating vcs root Plastic SCM:
    [13:10:11][updating sources] Repository sources transferred
    [13:10:11]Patch is broken, can be found in file: C:\TeamCity\buildAgent\temp\globalTmp\temp4935659305517870871patch_2600
    [13:10:11]Failed to build patch for build #6 {build id=2600}, VCS root: repository "eRecruitment@" path "/" smartbranch "/main" {instance id=27, parent id=10}, due to error: Error updating vcs root Plastic SCM:
    jetbrains.buildServer.agent.impl.patch.PatchDownloaderImpl$1: Server was not able to build correct patch, most likely due to VCS errors
    at jetbrains.buildServer.agent.impl.patch.PatchDownloaderImpl.throwError(PatchDownloaderImpl.java:114)
    at jetbrains.buildServer.agent.impl.patch.PatchDownloaderImpl.checkPatch(PatchDownloaderImpl.java:104)
    at jetbrains.buildServer.agent.impl.patch.PatchDownloaderImpl.copyPatchAndCheck(PatchDownloaderImpl.java:65)
    at jetbrains.buildServer.agent.impl.patch.UpdateSourcesPatcherBase.copyPatchToTempFile(UpdateSourcesPatcherBase.java:70)
    at jetbrains.buildServer.agent.impl.patch.UpdateSourcesFromServer.updateSources(UpdateSourcesFromServer.java:62)
    at jetbrains.buildServer.agent.impl.patch.UpdateSourcesBuildStageBase.doSourceUpdate(UpdateSourcesBuildStageBase.java:91)
    at jetbrains.buildServer.agent.impl.patch.UpdateSourcesBuildStageBase.doRecoverableStage(UpdateSourcesBuildStageBase.java:59)
    at jetbrains.buildServer.agent.impl.buildStages.startStages.RecoverableBuildStage.doLastAttempt(RecoverableBuildStage.java:112)
    at jetbrains.buildServer.agent.impl.buildStages.startStages.RecoverableBuildStage.doBuildStage(RecoverableBuildStage.java:70)
    at jetbrains.buildServer.agent.impl.buildStages.BuildStagesExecutor$1.callStage(BuildStagesExecutor.java:31)
    at jetbrains.buildServer.agent.impl.buildStages.BuildStagesExecutor$1.callStage(BuildStagesExecutor.java:24)
    at jetbrains.buildServer.agent.impl.buildStages.StagesExecutor.callRunStage(StagesExecutor.java:78)
    at jetbrains.buildServer.agent.impl.buildStages.StagesExecutor.doStages(StagesExecutor.java:37)
    at jetbrains.buildServer.agent.impl.buildStages.BuildStagesExecutor.doStages(BuildStagesExecutor.java:24)
    at jetbrains.buildServer.agent.impl.BuildRunAction.doStages(BuildRunAction.java:70)
    at jetbrains.buildServer.agent.impl.BuildRunAction.runBuild(BuildRunAction.java:50)
    at jetbrains.buildServer.agent.impl.BuildAgentImpl.doActualBuild(BuildAgentImpl.java:247)
    at jetbrains.buildServer.agent.impl.BuildAgentImpl.access$100(BuildAgentImpl.java:48)
    at jetbrains.buildServer.agent.impl.BuildAgentImpl$1.run(BuildAgentImpl.java:220)
    at java.lang.Thread.run(Unknown Source)
    Caused by: jetbrains.buildServer.vcs.patches.UnsuccessfulPatchException: Failed to build patch for build #6 {build id=2600}, VCS root: repository "eRecruitment@" path "/" smartbranch "/main" {instance id=27, parent id=10}, due to error: Error updating vcs root Plastic SCM:
    at jetbrains.buildServer.vcs.patches.AbstractPatcher$1.fail(AbstractPatcher.java:93)
    at jetbrains.buildServer.vcs.patches.LowLevelPatcher.readPatchStream(LowLevelPatcher.java:156)
    at jetbrains.buildServer.vcs.patches.LowLevelPatcher.applyPatch(LowLevelPatcher.java:79)
    at jetbrains.buildServer.vcs.patches.AbstractPatcher.applyPatch(AbstractPatcher.java:42)
    at jetbrains.buildServer.agent.impl.patch.PatchApplierImpl.applyPatch(PatchApplierImpl.java:18)
    at jetbrains.buildServer.agent.impl.patch.PatchDownloaderImpl.checkPatchInFileIsCompleted(PatchDownloaderImpl.java:84)
    at jetbrains.buildServer.agent.impl.patch.PatchDownloaderImpl.checkPatch(PatchDownloaderImpl.java:94)
    ... 18 more
    [13:10:11]Publishing internal artifacts
    [13:10:11][Publishing internal artifacts] Sending build.finish.properties.gz file
    [13:10:11]Build failed to start. Artifacts will not be published for this build
    [13:10:11]Build finished
  24. 3 points
    What's the best way to organize projects with shared dependencies? Here's a sample scenario:
    - Project1 and Project2 both have a dependency on Lib1 and Lib2
    - Lib1 also has a dependency on Lib2
    I need to be able to have TeamCity check this out and build it, too. I have not been able to successfully create xlinks to solve this. Whenever I use xlinks, I always end up in a situation where it tells me there's a change for Project1.csproj, but when I attempt to commit it, it tells me there is no change. If I attempt to update, it then tells me there is a change pending.
  25. 3 points
    I have Plastic SCM installed and working; the GUI displays in an X window with poor type quality. I noticed, when starting Plastic from the command line, a reference to a missing GTK library. Are there any instructions for correcting this?

    mkt:~ janis$ plastic
    Gtk not found (missing LD_LIBRARY_PATH to libgtk-x11-2.0.so.0?), using built-in colorscheme
    mkt:~ janis$ cm version
    mkt:~ janis$

    Mac OS X 10.6.8
  26. 3 points
    Hi! This one is specific for Manu, I think. Do you remember that I told you I was having trouble using Git Sync? I performed the sync process and the labels were not being sent to the Git repo. Were you able to reproduce the problem? If you don't remember, I have the test repo where I noticed the problem here: http://sdrv.ms/13HKyup Cheers!
  27. 3 points
    One of our employees deleted several repositories, which are hosted on a Plastic 4 server with a SQL Server backend. I know the actual repository db does not get deleted when this happens, but how can I restore them? The administrator guide only talks about reconnecting archived *.fdb repositories. My first thought was to restore a backup of the 'repositories' database, but I've now found out these backups are not OK (only diff files; the .bak got deleted).
  28. 3 points
    Hi Diego, yes, the labels are quite different between 3.0 and 4.0. In 3.0 you could apply a label to workspace content having revisions from tons of changesets. In 4.0, labels are applied to a single changeset, thanks to the DAG structure. So the relationship has moved from 1-n to 1-1. That said, the labels migration is not an easy thing, so we had to go through a compromise solution. In 90% of the cases the labels are mapped to 4.0 as you can see them drawn in the 3.0 branch explorer; the other 10% are more difficult to handle. Can you show us an example of one of your imported labels? Manu.
  29. 2 points
    Hi all, Delphi support is among the top 5 requests in our User Voice: Delphi support in SemanticMerge. So we're eager to add Object Pascal to the list of supported languages. In order to do so we've developed a way to plug in external parsers, so if you can develop a Delphi language parser it will be very simple to get it invoked from Semantic. If you're interested in joining our "Delphi Parser" effort, please join this thread and we will send you the required tools. Right now, all that plugging in a parser requires is:
    * Create a standalone executable.
    * Able to receive some data as arguments.
    * Able to export the "tree" of the file in YAML format.
    Of course you'll need all the details, but this is just an intro of what it takes. We've also developed tools to help test the parsers, like a "directory parser" which will loop through a code tree parsing (invoking your parser) and then rebuilding the source file, making sure the original and the regenerated one match. We're eager to get this started! pablo
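As a taster of what such a parser has to emit, here is a toy Python sketch that turns a declaration tree into indented YAML. The node names, the tuple shape, and the YAML layout are all made up for illustration; the real schema (with construct types and spans) comes with the tools mentioned above:

```python
def yaml_for_node(node, indent=0):
    # node: (name, children) - a toy declaration tree standing in for
    # the parse result of one Object Pascal source file. The real
    # format Semantic expects carries more fields (type, spans, etc.).
    pad = "  " * indent
    name, children = node
    lines = [pad + "- name: " + name]
    for child in children:
        # Recurse one indent level deeper for nested constructs.
        lines.extend(yaml_for_node(child, indent + 1))
    return lines

# A unit containing a class with one method, plus a free procedure.
tree = ("unit1.pas", [("TParser", [("Parse", [])]), ("Main", [])])
print("\n".join(yaml_for_node(tree)))
```

The recursive decomposition here mirrors the round-trip idea behind the "directory parser" test tool: if the emitted tree covers every construct, the original file can be rebuilt from it.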
  30. 2 points
    I've been using - Brighton as a hermit developer on a single workspace for about a year now and recently had to refactor my PC drives and network mapping. The drives on my PC have changed and I would like to change the location of the workspace path. I searched the knowledge base and the forum and didn't see any posts on this subject, but perhaps I'm not searching for the correct terms. Anyhow, I would like to keep the current changeset history but redirect the workspace path. For example: from - F:\MyApps to - G:\Network\MyApplications Thanks -
  31. 2 points
    Is there a version of Gluon for Mac? There doesn't seem to be an option to install it on Mac like there is in the Windows installer. If there is no Mac version, is one planned? Cheers!
  32. 2 points
    Hi, During Beta I created an organization called "worldwizards". Currently it lists as "disabled" if I try to connect to it. How can I sign up for a paid account and regain access to it? Thanks
  33. 2 points
    Thank you for the feedback. Let me start a few tests and I'll try to get more info.
  34. 2 points
    Hi, I have noticed the new Jenkins plugin version (2.5) with some corrections around "Use Update". So I checked the behaviour again with parametrized selectors and workspace deletion... It looks like the issue is still present. I went quickly through the plugin code (on GitHub) and this is what I found:

    PlasticSCM.checkout(...) -> PlasticSCM.isWorkspaceDeleteNeeded -> workspaceConfiguration.equals(...) -> builder.append(this.selector, other.selector);

    So it looks like, in fact, any change in the selector will always lead to workspace deletion and re-creation from scratch... Do you plan to change this behaviour? Or at least please do not delete the conf files (cloaked, hidden_changes and ignore).
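To illustrate the behaviour traced above, here is a toy Python model (not the plugin's actual Java code; field names are illustrative) of that field-by-field comparison. Because the selector participates in equality, any selector change makes the stored and requested configurations unequal, which is what triggers the delete-and-recreate:

```python
class WorkspaceConfiguration:
    # Toy model of the configuration the plugin stores per workspace.
    def __init__(self, server, repository, selector):
        self.server = server
        self.repository = repository
        self.selector = selector

    def __eq__(self, other):
        # Mirrors the EqualsBuilder-style comparison in the trace above:
        # every field, including the selector, must match.
        return (self.server == other.server
                and self.repository == other.repository
                and self.selector == other.selector)

def is_workspace_delete_needed(stored, requested):
    # No stored configuration means a fresh checkout, not a delete;
    # otherwise any mismatch forces deletion and re-creation.
    return stored is not None and stored != requested

old = WorkspaceConfiguration("srv:8087", "bigrepo", "br:/main")
new = WorkspaceConfiguration("srv:8087", "bigrepo", "br:/main/task001")
print(is_workspace_delete_needed(old, new))  # prints: True
```

A gentler policy would exclude the selector from the equality check and instead switch the existing workspace to the new selector, which is what the GUI effectively does.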
  35. 2 points
    I found this (2 year old) plugin for F# and Semantic Merge. I would like to try to use it and, since it is almost certainly out of date, try to get it working with the current SemanticMerge. This implied there is some form of SDK I need too. Can anyone tell me what I need to get set up so I can make progress? A quick web search did not reveal anything about using plugins for SemanticMerge or anything about an SDK. Thanks in advance for the help!
  36. 2 points
    There are several spam threads at the general forum, by the same user, emawatson.
  37. 2 points
    Hi, I'm currently on the trial of Semantic Merge. My major pain point right now is that if the focus is not on Semantic Merge and you press the Esc key, the Semantic Merge window closes. It's a bit frustrating when navigating between the diff tool and Visual Studio. Is there a way to deactivate this feature?
  38. 2 points
    Hi Carlos, I followed the procedure outlined in chapter 11 precisely on two different server setups. Unfortunately upon starting the client GUI I am not automatically prompted to upgrade the client installation. The client simply starts as usual. Mike
  39. 2 points
    I'm getting the url4short spam often when I visit these boards: http://peter.upfold.org.uk/blog/2013/01/15/cleaning-up-the-ip-board-url4short-mess/
  40. 2 points
    Thanks for the feedback, guys. Since I have access to the PlasticSCM server from all locations, I will use PlasticSCM only and move my code out of DropBox. I will also force explicit checkout of files, in case I forget and start working at 2 locations. While "merge" is a very powerful thing, I'd rather not use it at all (weeell, _maybe_ after branching). In general, I have projects which I can only work on in the office (because they interact with some special hardware) and some more lightweight/almost hobby projects (or prototypes) which I work on at home. For me, personally, it will be easy to split my projects into two groups; it might not be so easy for others. Thanks again, I have barely begun with PlasticSCM but am already impressed with it - and with the forum :-)
  41. 2 points
    As a fairly new user of Plastic SCM Community Edition, I found myself in a bit of a pickle. I created a read-only Xlink to a Common Library. I then made code changes, including changes to the Common Library via the Xlink. Since the Xlink is read-only I can't check in those changes. My thought was to set the working space to the Common Library and check the changes in from the Common Library repository. Unfortunately, I find that the Common Library does not see the changes in the actual working space. I imagine that is because the changes are in the database and memory only. Now that I am in this situation, I need to get the changes checked in. The Xlink User's Guide tells me I can change the Xlink to writable; however, that causes an error. How can I get my coding changes into the proper Common Library repository, and how can I change my Xlink to a wXlink? dgp
  42. 2 points
    I followed the guide to use Semantic Merge for my diff and merge tools, but when I try to actually do a diff I get the message "Your license is not valid, you are not able to use the diff tool. Please contact with support" I am using the Community Edition of Plastic.
  43. 2 points
    Using - Zagreb Attempting to check-in a single changed file results in an error "No checkout branch found." Same error occurs if the project folder is selected and "Checkin pending changes..." menu is selected Using the GUI tool works fine to check-in changes. Attachment shows UI of both operations.
  44. 2 points
    Currently evaluating whether to use Plastic SCM or not, but so far, I'm impressed. I've successfully gone through the installation and the tutorial, and have a couple of questions about using XLINK. We are relatively new to VCS methodologies/techniques, so I may not be using XLINK correctly; if so, let me know.

    We develop embedded applications, and as such, we often license software libraries (as source code) from third-party suppliers (i.e. RTOSes, GUIs, etc.). The licenses are for source code (as opposed to linkable object files), as it needs to be compiled for specific embedded MCUs/platforms/etc. with specific optimisations (speed or size) depending on the project. These libraries are regularly updated by their maintainers, and we generally release updates of our products using the latest version of these libraries.

    Here's my question: how do I correctly add a third-party library to a repo, and use it in my projects' repos? I know I need to use XLINK, but the libraries come with a bunch of other files/documents. For instance, one specific library has the following structure:

        \GUI_XX.YY.ZZ (root of the ZIP file I get from the GUI library supplier, where XX.YY.ZZ is the version number and build)
            \doc (version-specific documentation for using the library)
            \source (several sub-directories for the various parts of the library)
                \core (a bunch of .c and .h files)
                \bitmaps (a bunch of .c and .h files)
                \fonts (a bunch of .c and .h files)
                \other components (I think you get the idea)
            \examples (.c code that shows how to use/demonstrate the GUI)

    So, as you can see, I have a bunch of files that I want to "archive" in a repo for that library. I want the documentation and examples to follow the library source code, as developers need access to them when they use the library. However, they won't be modifying any of these files (nor the source, for that matter; the whole GUI repo can be read-only as far as I'm concerned.)

    I'm assuming (correctly?) the library will have its own repo "main line", and as each release becomes available from the supplier, it will "grow". From 1.1.46, I may go to 1.2.0, to 1.3.2, to 2.0.0, etc. I don't expect this main line to branch at all (and consequently, no merges either), only linear changesets added over time.

    Now, on to the questions... Do I need to create a workspace for the library? Can I just link to the \source sub-directory (using XLINK) when I create a repo for our project? I don't want to actually download the documentation, examples, etc. into the project workspace, but a developer may do so on a per-case basis for consultation. The \source directory WILL have to be downloaded, as it's needed to compile the project. However, should the contents of \source change (by mistake), I don't want those changes to be committable to the library's repo. Is this possible? Let me know if I'm trying to use XLINK the correct way, or not. Thanks!
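    For what it's worth, here is a rough sketch of how the workflow above might look with the cm command line. This is only an illustration: the server address, repository names, and paths are made up, and the exact argument order of cm xlink varies between Plastic SCM versions, so check cm xlink --help before running anything. It also can't be verified without a Plastic SCM server, so treat it as a starting point rather than a recipe.

    ```shell
    # Create a repository for the third-party library, and a workspace to import it into
    # (gui-lib, localhost:8087 and the paths are illustrative only)
    cm mkrep gui-lib@localhost:8087
    cm mkwk gui-lib-wk C:\wks\gui-lib --repository=gui-lib@localhost:8087

    # Copy the unzipped GUI_XX.YY.ZZ contents into the workspace, then add and check in
    cd C:\wks\gui-lib
    cm add -R *
    cm ci -c "Import GUI library XX.YY.ZZ"

    # In the project workspace, create a read-only Xlink to the library repo
    # (by default Xlinks are read-only; a writable one would use the -w flag)
    cd C:\wks\my-project
    cm xlink lib\gui / cs:1@gui-lib@localhost:8087
    cm ci -c "Xlink GUI library"
    ```

    Whether an Xlink can point at just the \source sub-directory rather than the repo root is something the Codice folks would need to confirm; if not, Xlinking the whole library repo and ignoring \doc and \examples at build time may be the practical alternative.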
  45. 2 points
    I just checked the new permission system, and of course it does not work. Look at the screenshot: User2 has everything denied on dir2, but he can still read files from that directory.
  46. 2 points
    Hi all, Manu shared the news a week ago or so, but we're entering "closed beta" of our new merge tool. In case you haven't noticed yet, you can join the program here: http://plasticscm.com/sm/index.html The tool is awesome and able to do inside a single file what the Plastic merge engine does for files and directories. Please sign up, give it a try, share it with as many beta-testing volunteers as you can find, and see what the future of merging is going to look like :-) Also, don't forget to join the conversation at #mergebegins. Thanks!!
  47. 2 points
    I think the same thing! This is why I ended up using Plastic in the first place, and I really feel happy at the end of the day! Now I can't see myself using any other tool. I talk about Plastic a lot, and people usually ask me if I work for the company that develops it... I'm out of votes, but as soon as they come back, I'll vote for your request!
  48. 2 points
    Hi, I have a couple of requests for client improvements:

    1) In the list of repositories, it would be nice to have an indicator for workspaces that are behind in updates (i.e. not working on the tip). I'm currently working in two places and using sync to synchronize, but besides syncing I also need to set each workspace to the most recent version before starting to work on it. It would be handy to see which projects need this kind of attention without opening branch explorers for all of them.

    2) Activate a workspace on double-click (maybe optional); right-clicking all the time is not that easy to do, and bad for RSI.

    3) I would like a view showing full comments as typed into the changeset view, or a report created from it with multi-line comments (not flattened to a single line). I normally use a (numbered) bullet style, so I would like to see a kind of changelog with changeset numbers, labels, and full multi-line comments. Just an exportable report would be fine too.

    4) An option for some basic formatting in the check-in comment dialog (bullets or numbers).

    5) A warning if you try to close the check-in dialog without checking anything in when there are changes. The current check-in button is far from intuitive, and the current close button is where I would expect the check-in button to be. The close button should perhaps be named Cancel.

    regards, Wim van der Vegt
  49. 2 points
    Does it seem similar to this: http://www.plasticscm.net/index.php?/topic/972-vs2010-plastic-41-vs-integration-sporadic-ide-hangs/ ? If so, you might want to upgrade, as that bug has been fixed.
  50. 2 points
    Hello, I don't know if you guys noticed, but in the Plastic 4 beta the "create a new workspace" window does not have the "new repository" button. I love this feature! I know it's just a bit of sugar in the tool, but it really would be nice to have this button back! Thanks for the great work, and keep building this magnificent tool! Plácido Bisneto