Using DX without dependency hell?


I am thinking of leveraging DX, but I don't want to go through dependency-management hell and the whole packaging mess.



Basically, I am thinking of just storing code in the repo in SFDX format, using that for scratch orgs, and simply converting to MDAPI for production and sandbox deployments.
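Under that approach, the round trip is only two CLI calls. A minimal sketch; the directory names and the `prod` org alias are placeholders for your own:

```shell
# Convert the SFDX source tree to Metadata API format, then deploy the
# resulting artifact to a sandbox or production org.
# "force-app", "mdapi_out", and the org alias "prod" are assumed names.
sfdx force:source:convert --rootdir force-app --outputdir mdapi_out
sfdx force:mdapi:deploy --deploydir mdapi_out --targetusername prod --wait 30
```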



What I am gaining:



  • Can leverage scratch orgs, since my code is in SFDX format.

  • The new source format is easier to work with, since it breaks down the monolithic .object file.

  • Still happy with the happy soup. No need to spend years breaking things down, managing dependencies, worrying about duplicate metadata, package versioning, package version dependencies, transitive dependencies, over-engineered dependency injection, and all the other baggage that comes with DX. Honestly, I don't see how we could ever justify that the benefit is worth the effort. I don't want to spend 1.5 years getting nowhere like the customer did here: https://www.youtube.com/watch?v=MY2_AfjtBp8

  • Not having to deal with multiple CI processes, one for packaged and another for unpackaged metadata, since you can't package everything anyway.


What I am losing:

  • Deploying everything at once, so deployment is slower compared to the package installation approach.



I don't want to blindly follow whatever Salesforce people say, because they have to sell what they made, whether it is good or bad. It's the customers who pay. I'd appreciate any honest opinion that doesn't come from a Salesforce evangelist or MVP.



Also, please answer in the context of a complex org with years of metadata, not the DreamHouse app.



Below is what was discussed in the video, which is exactly what I am not going to do.



[two screenshots from the video]

































  • I am very curious about whether or not the new Metadata REST API, without the 50 MB limit, makes this approach more feasible.
    – David Reed
    Sep 2 at 2:05










  • @David we haven't run into this 49 MB limit in the last few years, so this isn't a problem, but we can switch from SOAP to REST MDAPI deploys if needed.
    – codeinthecloud
    Sep 2 at 2:11










    I think you've misunderstood what an evangelist or MVP is. We don't sugarcoat things to make the platform seem better than it is. We're simply passionate about what the system does, despite its shortcomings. If you've read some of my previous DX answers, you would know I'm still not wholly satisfied with the state of affairs with DX. But instead of just griping, we actually contact PMs, leave feedback to relevant people, so they can hopefully fix the problems we've observed. Our goal is to make Salesforce the best it can be.
    – sfdcfox
    Sep 2 at 2:31










  • @sfdcfox, no problem with your passion, but you need to realize that it is very hard for Salesforce not to sell what they spent two years building, even if it is at others' expense. I am just looking for honest answers here, without "oh, you should follow Salesforce's recommended best practices" or "it is the future". What I want is a future I choose: choices that we all have with other open-source platforms.
    – codeinthecloud
    Sep 2 at 2:51











    Okay, well, I'm just curious if you want to hear what I have to say. I spearheaded the DX project for our org, where we have 5,000 active users, 750+ roles in our role hierarchy, 500+ classes, about 150 custom objects, ~15 installed AppExchange apps, and collectively over 10,000 metadata files in DX format. I do happen to be an MVP, though.
    – sfdcfox
    Sep 2 at 3:15
















Tags: salesforcedx






edited Sep 2 at 3:54

























asked Sep 2 at 2:03









codeinthecloud












2 Answers







Basically, I am thinking of just storing code in the repo in SFDX format, using that for scratch orgs, and simply converting to MDAPI for production and sandbox deployments.




As of Winter '19, this is no longer necessary. You can use the new force:source:deploy command to package up your DX-format file tree and deploy it without the hassle of force:source:convert. This basically eliminates the need to maintain a classic MDAPI format at all, unless you need it for an IDE or some other reason. Regardless, it's still a sound strategy overall if you have a complicated setup, like we do.
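For example, a direct deploy of the source-format tree might look like this (the path and the `prod` org alias are assumptions about your project):

```shell
# Deploy straight from the source-format project directory; no
# conversion step is needed as of Winter '19.
sfdx force:source:deploy --sourcepath force-app --targetusername prod --wait 30
```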




Can leverage scratch orgs, since my code is in SFDX format.




Technically, you could do that with the force:mdapi:deploy command, which works against both scratch orgs and other orgs. There's no specific need to use force:source:push if you don't want to.




Still happy with the happy soup. No need to spend years breaking things down, managing dependencies, worrying about duplicate metadata, package versioning, package version dependencies, transitive dependencies, over-engineered dependency injection, and all the other baggage that comes with DX. Honestly, I don't see how we could ever justify that the benefit is worth the effort.




Note that packages are completely optional, and in fact I'd recommend you stay away from them if you'd need more than about 5-10, as they quickly start to make a mess of things. For new customers with no pre-existing configuration, though, I would recommend packages too; may as well start off on the right foot.



For small-to-medium size orgs, I'd recommend researching whether packages are viable. For large organizations like ours, packaging is still pretty much a pipe dream. We might start building packages one day, but many of our features have incredibly complicated dependencies: we can select a single item and end up finding hundreds of dependent items.




Not having to deal with multiple CI processes, one for packaged and another for unpackaged metadata, since you can't package everything anyway.




It depends on how you do your CI. For example, our CI builds diffs between the source and the destination, so we never do a full deployment anyway. In our case, packages would be redundant, because we're basically already doing what DX packaging offers. However, even if you package just your core system library, you might still see considerable savings in deployment time that could justify the added CI complexity.
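A diff-based CI step can be sketched roughly as follows. This is only an illustration: the `last-deploy` tag and the `prod` alias are assumptions about how a pipeline might mark its previous successful deployment.

```shell
# Collect source files changed since the last successful deployment and
# deploy only those. Assumes the pipeline tags that commit "last-deploy".
changed=$(git diff --name-only last-deploy..HEAD -- force-app/)
if [ -n "$changed" ]; then
  # --sourcepath takes a comma-separated list of paths.
  sfdx force:source:deploy \
    --sourcepath "$(printf '%s\n' "$changed" | paste -sd, -)" \
    --targetusername prod --wait 60
fi
```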



But there's always a trade-off: complexity for deployment time. You have to consider how valuable your time is, because you're going to end up spending it one way or the other. If you have a setup where deployments take only a few minutes anyway, packaging probably isn't worth the complexity.



One place I worked at had a 45-minute deployment time. If we'd had DX then, and could have reduced our complexity so that deployments took only 15 minutes, we would have spent the time to build packages. Imagine a deployment failing 4 times at 45 minutes each (5 deployments total). That happened to me, and I didn't leave the office until 2 am as a result (deployments started at 10 pm).




Deploying everything at once, so deployment is slower compared to the package installation approach.




Yes and no. You're really not losing much in most cases, because the duplicate items will be a no-op (they don't really change). In many orgs, the actual deployment time is overwhelmingly unit tests, which packages are not going to help you skip anyway.




To address more specific concerns...




No need to spend years breaking things down,




That's the Salesforce "organic migration" approach. As far as I can tell, it might be viable, if you even knew where to start. Most orgs, I'd wager, have a ton of deeply nested dependencies you simply can't break apart easily, so you'd end up with either a large core and lots of small side packages, or you'd simply give up and put everything into one big package, which defeats the purpose.




managing dependencies,




Especially since it's all completely manual. If we had a tool to automate dependency resolution, it might be... not as bad. Starting from ground zero, the dependencies would be manageable; for existing code bases, especially ones as large as ours, packaging would be a nightmare. We don't use packaging for this reason.




worrying about duplicate metadata,




That's actually more of a non-worry: deployments always seem to go okay regardless of duplicates, so long as they don't conflict with each other. Honestly, I was surprised by how consistently DX did the right thing, as long as I didn't do anything too obviously broken.




package versioning,




The system kind of takes care of the versioning for you, so it's not that bad of an issue.




package version dependencies,




Generally a non-issue, because DX does a decent job of managing them for you. It's rare that you'd have to deal with this directly once everything is set up.




transitive dependencies,




I'm not sure how this applies in a metadata context. I'd love a concrete example of how this might be a problem.




over-engineered dependency injection ...




I agree; one should not use DI just to fulfill a packaging requirement. That is the point of dependent packages, though. However, I do understand that situations can arise where A depends on B, but B depends on A. The typical solution is to move the common dependencies to C, so that A and B both depend on C. This leads back to an earlier point: you'd probably end up with a huge core library and lots of small packages that depend on it, which defeats the purpose of packaging.




There's one other potentially damaging loss: namespaces. Using packaging namespaces, you can actually eliminate a lot of the duplicate metadata problems by isolating items and being able to refer to them uniquely. This is comparable to languages like C++ and C# that have had these features forever. If you find yourself prefixing classes all the time (e.g. Account_Extension, Account_TriggerHandler, etc.), using packages might make sense for you and help isolate code.




I wouldn't dismiss packaging outright (even we intend to use it eventually, if we can get the features we asked for), but don't feel bad if you decide not to use it, either. I feel like you might want to do some more research before you conclusively say "no, I'll never do this" (your question already reads like a foregone conclusion). A lot of the features that exist are promising.



You can even do pseudo-packaging right now: set up a number of paths to sort your metadata into, but don't actually build the packages. This might save you on deployment times in the future. And you don't need to do it all at once, either; just start picking out pieces every time you're in a particular area. Do it as part of the normal development cycle, and you'll hardly notice the difference. As a bonus, if you decide to package, you've already done the hard part, and if not, you can use your source tree as-is.
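One way to set this up is with multiple package directories in sfdx-project.json, without ever creating the packages themselves. The directory names here are hypothetical:

```shell
# Write an sfdx-project.json that sorts metadata into several package
# directories (pseudo-packaging); only one entry may be the default.
cat > sfdx-project.json <<'EOF'
{
  "packageDirectories": [
    { "path": "core", "default": true },
    { "path": "sales-features" },
    { "path": "service-features" }
  ],
  "namespace": "",
  "sourceApiVersion": "44.0"
}
EOF
```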




tl;dr



Ultimately, the choice is yours. One size does not, and cannot, fit all. DX is designed to fit a certain demographic of clients, but it certainly cannot accommodate everyone, and it's still missing features that are critical to using it as advertised. If you don't care for packaging, you're not forced to use it. If you want to use the old MDAPI format or the new DX format, you have that choice (especially with the new force:source:deploy command).



Please note that DX isn't particularly a selling point; salesforce.com isn't making any money off it, at least not directly. They're genuinely trying to make development easier and more manageable, as other modern stacks do; it's a direct response to numerous complaints from ISVs, large clients, and community developers at large. DX is a tool, like a hammer or a screwdriver. It's up to you to figure out how you're going to use it, or whether it's even the right choice.






    This is a comment on @sfdcfox's reply.



    I appreciate that you put serious thought into this thorough answer. I agree with pretty much all the points. I think you are not speaking the same way as the DX team, who almost make it sound like it is the future for everybody. I'd like them to put up banners and highlight the target audience.



    You can tell that, with VS Code, they leave almost no way for teams to continue with the MDAPI format, i.e. switch to DX or use other unsupported IDEs. I'd like to see them continue supporting the MDAPI format in VS Code.



    They did at least make the right move by allowing VS Code to deploy to a non-scratch org with force:mdapi:deploy in Winter '19, but the MDAPI code format needs to be supported too.



    I like how you think about complexity vs. deployment time. My team always opted for low complexity at the cost of higher deployment time, and that has worked really well. I'm against diff, partial, or incremental deployments. We have a CI deployment from master to production every day, and we deploy everything from master, not selectively. That gives us confidence that everything in the repo is the source of truth, without ever worrying about overwriting newer changes in production.



    You are right that there isn't much time saved in deploy-all vs. package install, since the majority of the time is spent running tests. So I can take that off the list of benefits of using packages.



    I also think that most developers should take a step back and evaluate like you do before forcing something on themselves, and not be so excited about things like force-di. It's an elegant pattern from other stacks (e.g. Spring DI, Guice) but is just not a good fit for Salesforce.



    "They're genuinely trying to make development easier and more manageable, like other modern languages"



    OK, so this is good intention, but I'd expect them to approach it with simplicity and tackle the problems at their core, rather than putting band-aids and wrappers around the real problems. They could start small with things like optimizing their internal systems to make tests run faster, without putting the complexity of managing packaging on their customers. For example, they could figure out which classes haven't changed and skip those tests entirely. Perhaps there are simple ways like that to reduce test time dramatically without customers spending years on modularization. They could also have just added simple support in the MDAPI for breaking down the .object files, without introducing a whole new source format, a conversion step, a CLI, and a new IDE that supports only certain bits. Nobody seems to care about simplicity and elegance.



    Finally, I like Salesforce as a company. I just think DX is currently in an awful state, and that should be acknowledged more often so people can make informed decisions.



    Your answer is super helpful and on point. I'll stay away from packaging and hope to gain something from using DX with the happy soup.



    Thanks.






    share|improve this answer






















      Your Answer







      StackExchange.ready(function()
      var channelOptions =
      tags: "".split(" "),
      id: "459"
      ;
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function()
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled)
      StackExchange.using("snippets", function()
      createEditor();
      );

      else
      createEditor();

      );

      function createEditor()
      StackExchange.prepareEditor(
      heartbeatType: 'answer',
      convertImagesToLinks: false,
      noModals: false,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: null,
      bindNavPrevention: true,
      postfix: "",
      onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      );



      );













       

      draft saved


      draft discarded


















      StackExchange.ready(
      function ()
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fsalesforce.stackexchange.com%2fquestions%2f230956%2fusing-dx-without-dependencies-hell%23new-answer', 'question_page');

      );

      Post as a guest






























      2 Answers
      2






      active

      oldest

      votes








      2 Answers
      2






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes








      up vote
      8
      down vote














      Basically, I am thinking of just store code in the repo using SFDX and use that for Scratch Org but simply convert to Mdapi for production & sandbox deployment.




      As of Winter '19, this is no longer necessary. You can use the new force:source:deploy to package up your DX-compatible file tree and deploy it without the hassle of force:mdapi:convert. This also basically eliminates the need to have a classic mdapi format, unless you need it for an IDE or some other reason. Regardless, it's still a sound strategy overall if you have a complicated setup, like we do.




      Can leverage scratch org since my code is in SFDX.




      Technically, you could do that with the force:mdapi:deploy command, which works in both scratch orgs and other orgs. There's no specific need to use force:source:push if you don't want to.




> Still happy with the happy soup. No need to spend years on breaking down stuff, manage dependencies, worry about duplicate metadata, package versioning, package versions dependencies, transitive dependencies, over-engineer dependencies injection and all the baggage that comes with DX. Honestly, I don't see we can ever justify that the benefit worths the effort.




Note that packages are completely optional, and in fact, I'd recommend you stay away from them if you'd need more than about 5-10 of them, as they quickly start to make a mess of things. For new customers with no pre-existing configuration, though, I would recommend packages, too; you may as well start off on the right foot.



For small-to-medium-sized orgs, I'd recommend researching whether packages are viable. For large organizations like ours, packaging is still pretty much a pipe dream. We might start building packages one day, but many of our features have incredibly complicated dependencies: we can select a single item and end up finding hundreds of dependent items.




> Not have to deal with multiple CI processes, one for package and another for unpackaged, since you can't package everything anyway.




It depends on how you do your CI. For example, our CI diffs the source against the destination, so we never do a full deployment anyway. Packages would be redundant in our case, because we're already getting what DX packaging offers. However, even if you only package your core system library, you might still see considerable savings in deployment time that could justify the added CI complexity.
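A sketch of the diff-based idea, assuming the changed paths come from `git diff` (the helper name, the `last-release` ref, and the org alias are made up; `force:source:deploy -p` accepts a comma-separated path list):

```shell
# Join a newline-separated list of changed file paths into the
# comma-separated form that `sfdx force:source:deploy -p` expects.
join_paths() {
  paste -sd, -
}

# In CI you would feed it the diff between the last deployed ref and
# HEAD (refs, paths, and the org alias here are hypothetical):
#   CHANGED=$(git diff --name-only last-release...HEAD -- force-app/ | join_paths)
#   sfdx force:source:deploy -p "$CHANGED" -u MySandbox -w 30
```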



But there's always a trade-off: complexity for deployment time. You have to consider how valuable your time is, because you're going to spend it one way or the other. If your deployments only take a few minutes anyway, packaging probably isn't worth the complexity.



One place I worked at had a 45-minute deployment time. If we'd had DX then, and could have reduced our complexity so that deployments only took 15 minutes, we would have spent the time to build packages. Imagine a deployment failing 4 times at 45 minutes each (5 deployments total). That happened to me, and I didn't leave the office until 2am as a result (deployments started at 10pm).




> Deploy everything at once so deployment is slower compared to the package installation approach.




Yes and no. You're really not losing much in most cases, because the duplicate items are a no-op (they don't actually change). In many orgs, deployment time is overwhelmingly unit tests, which packages won't help you skip anyway.




      To address more specific concerns...




> No need to spend years on breaking down stuff,




That's the Salesforce "organic migration" approach, and as far as I can tell, it might be viable, if you even know where to start. Most orgs, I'd wager, have a ton of deeply nested dependencies you simply can't break apart easily, so you'd end up with either a large core and lots of small side packages, or you'd give up and put everything into one big package, which defeats the purpose.




> manage dependencies,




Especially since it's all completely manual. If we had a tool to automate dependency resolution, it might not be as bad. Starting from zero, the dependencies would be manageable; for existing code bases, especially ones as large as ours, packaging would be a nightmare. We don't use packaging for this reason.




> worry about duplicate metadata,




That's actually more of a non-worry, because deployments seem to go okay regardless of duplicates, so long as they don't conflict with each other. Honestly, I was surprised by how consistently DX did the right thing, as long as I didn't do anything too obviously broken.




> package versioning,




      The system kind of takes care of the versioning for you, so it's not that bad of an issue.




> package versions dependencies,




      Generally a non-issue, because DX does a decent job of managing them for you. It's rare that you'd have to deal with this directly once everything is set up.




> transitive dependencies,




      I'm not sure how this applies in a metadata context. I'd love a concrete example of how this might be a problem.




> over-engineer dependencies injection ...




I agree that one should not use DI just to satisfy a packaging requirement; that is the point of dependent packages, though. However, situations can arise where A depends on B, but B also depends on A. The typical solution is to move the common dependencies into a new package C, so that A and B both depend on C. This leads back to an earlier point: you'd probably end up with a huge core library and lots of small packages that depend on it, which defeats the purpose of packaging.




There's one other potentially damaging loss: namespaces. Package namespaces can actually eliminate a lot of the duplicate-metadata problems by isolating components and letting you refer to them uniquely. This is comparable to languages like C++ and C# that have had namespaces forever. If you find yourself prefixing classes all the time (e.g., Account_Extension, Account_TriggerHandler), packages might make sense for you and help isolate code.




I wouldn't dismiss packaging outright (even we intend to use it eventually, if we can get the features we asked for), but don't feel bad if you decide against it either. You might want to do some more research before you conclusively say "no, I'll never do this" (your question already reads like a foregone conclusion). A lot of the existing features are promising.



You can even do pseudo-packaging right now: set up a set of paths to sort your metadata into, but don't actually build the packages. This might save you on deployment times in the future. You don't need to do it all at once, either; every time you're working in a particular area, pick out a few pieces. Do it as part of the normal development cycle and you'll hardly notice the difference. As a bonus, if you later decide to package, you've already done the hard part; if not, you can use your source tree as is.
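Pseudo-packaging can be sketched in `sfdx-project.json` by listing several `packageDirectories` without ever creating packages from them (the directory names below are hypothetical):

```json
{
  "packageDirectories": [
    { "path": "force-app/core", "default": true },
    { "path": "force-app/sales" },
    { "path": "force-app/service" }
  ],
  "namespace": "",
  "sourceApiVersion": "44.0"
}
```

Metadata sorted this way still deploys as one happy soup, but each directory is a ready-made package boundary if you ever change your mind.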




      tl;dr



      Ultimately, the choice is yours. One size does not, and cannot, fit all. DX is designed to fit a certain demographic of clients, but it certainly cannot accommodate everyone. And it's still missing features that are critical to using it as advertised. If you don't care for packaging, you're not forced to use it. If you want to use the old mdapi format, or the new DX format, you have that choice (especially with the new force:source:deploy command).



      Please note that DX isn't particularly a selling point, as salesforce.com isn't making any money off this, at least not in the direct sense. They're genuinely trying to make development easier and more manageable, like other modern languages; it's a direct response to the numerous complaints from ISVs, large clients, and community developers at large. DX is a tool, like a hammer or a screwdriver. It's up to you to figure out how you're going to use it, or if it's even the right choice.






      share|improve this answer
























        up vote
        8
        down vote














        Basically, I am thinking of just store code in the repo using SFDX and use that for Scratch Org but simply convert to Mdapi for production & sandbox deployment.




        As of Winter '19, this is no longer necessary. You can use the new force:source:deploy to package up your DX-compatible file tree and deploy it without the hassle of force:mdapi:convert. This also basically eliminates the need to have a classic mdapi format, unless you need it for an IDE or some other reason. Regardless, it's still a sound strategy overall if you have a complicated setup, like we do.




        Can leverage scratch org since my code is in SFDX.




        Technically, you could do that with the force:mdapi:deploy command, which works in both scratch orgs and other orgs. There's no specific need to use force:source:push if you don't want to.




        Still happy with the happy soup. No need to spend years on breaking down stuff, manage dependencies, worry about duplicate metadata, package versioning, package versions dependencies, transitive dependencies, over-engineer dependencies injection and all the baggage that comes with DX. Honestly, I don't see we can ever justify that the benefit worths the effort.




        Note that packages are completely optional, and in fact, I'd recommend you stay away from them if you need more than about 5-10 or so, as they quickly start to make a mess of things. For new customers with no pre-existing configuration, I would recommend packages to. May as well start off on the right foot.



        For small-to-medium size orgs, I'd recommend researching if packages are viable or not. For large organizations, like ours, packaging is still pretty much a pipe dream. We might eventually one day start building packages, but many of our features have incredibly complicated dependencies. We can select a single item and end up finding hundreds of dependent items.




        Not have to deal with multiple CI processes, one for package and another for unpackaged, since you can't package everything anyway.




        It depends on how you do your CI. For example, our CI makes diffs between the source and destination, so we never do a full deployment anyways. In our case, packages, then, would be redundant, because we're basically already doing what DX packaging offers. However, even if you just package your core system library, you might still have a considerable savings in deployment time that might justify the CI complexity required.



        But, there's always a trade off. Complexity for deployment time. You have to consider how valuable your time is, because you're going to end up spending it one way or the other. If you have a setup where deployments take only a few minutes anyways, packaging probably isn't worth the complexity.



        One place I worked at had a 45 minute deployment time. If we had DX then, and we could have reduced our complexity such that deployments only took 15 minutes, we would have spent the time to build packages. Imagine a deployment failing 4 times at 45 minutes each (5 total deployments). I had that happen to me, and I didn't leave the office until 2am as a result (deployments started at 10pm).




        Deploy everything at once so deployment is slower compared to the package installation approach.




        Yes, and no. You're really not losing much in most cases, because the duplicate items will be a no-op (they don't really change). In many orgs, the actual deployment time is overwhelmingly unit tests, which packages are not going to help you skip anyways.




        To address more specific concerns...




        No need to spend years on breaking down stuff,




        That's the Salesforce "organic migration" approach. And, as far as I can tell, it might be viable, if you even knew where to start. Most orgs, I wager, have a ton of deeply nested dependencies you simply can't break easily, so you'd end up with either a large core and lots of small side packages, or simply give up and put everything into one big package, which defeats the purpose of it.




        manage dependencies,




        Especially since it's all completely manual. If we had a tool to automate the dependency resolution, it might be ... not as bad. Starting from ground zero, the dependencies would be manageable. For existing bases, especially as large as ours, packaging would be a nightmare. We don't use packaging for this reason.




        worry about duplicate metadata,




        That's actually more of a non-worry, because the deployments always just seem to go okay regardless of duplicates, so long as they don't conflict with each other. Honestly, I was surprised by how DX seemed to do the right thing consistently, as long as I didn't do anything too obviously broken.




        package versioning,




        The system kind of takes care of the versioning for you, so it's not that bad of an issue.




        package versions dependencies,




        Generally a non-issue, because DX does a decent job of managing them for you. It's rare that you'd have to deal with this directly once everything is set up.




        transitive dependencies,




        I'm not sure how this applies in a metadata context. I'd love a concrete example of how this might be a problem.




        over-engineer dependencies injection ...




        I agree, one should not DI just to fulfill a packaging requirement. This is the point of dependent packages, though. However, I do understand that there are situations that could arise where A depends on B, but B depends on A. The typical solution would be to move the common dependencies to C, and A and B would both depend on C. This leads us back to an earlier statement that you'd probably end up with a huge core library and lots of small packages that depend on it, which defeats the purpose of packaging.




        There's one other potentially damaging loss: namespaces. Using packaging namespaces, you can actually help eliminate a lot of the duplicate metadata problems by isolating them and being able to refer to them uniquely. This is comparable to languages like C and C# that have had these features forever. If you find yourself prefixing classes all the time (e.g. Account_Extension, Account_TriggerHandler, etc), using packages might make sense for you, and help isolate code.




        I wouldn't dismiss packaging outright (even we intend to eventually use it, if we can get the features we asked for), but also don't feel bad if you decide not to use it. I feel like you just might want to do some more research before you conclusively say "no, I'll never do this." (your question already reads like foregone conclusion). A lot of the features that exist are promising.



        You can even do pseudo-packaging for right now; set up a bunch of paths to sort your metadata into, but don't actually build the packages. This might save you on deployment times in the future. And you don't need to do it all at once, either, just every time you're in a particular area, start picking out pieces. Do it as part of the normal development cycle. You'll hardly even notice the difference. As a bonus, if you decide to package, you've already done the hard part, and if not, you can use your source tree as is.




        tl;dr



        Ultimately, the choice is yours. One size does not, and cannot, fit all. DX is designed to fit a certain demographic of clients, but it certainly cannot accommodate everyone. And it's still missing features that are critical to using it as advertised. If you don't care for packaging, you're not forced to use it. If you want to use the old mdapi format, or the new DX format, you have that choice (especially with the new force:source:deploy command).



        Please note that DX isn't particularly a selling point, as salesforce.com isn't making any money off this, at least not in the direct sense. They're genuinely trying to make development easier and more manageable, like other modern languages; it's a direct response to the numerous complaints from ISVs, large clients, and community developers at large. DX is a tool, like a hammer or a screwdriver. It's up to you to figure out how you're going to use it, or if it's even the right choice.






        share|improve this answer






















          up vote
          8
          down vote










          up vote
          8
          down vote










          Basically, I am thinking of just store code in the repo using SFDX and use that for Scratch Org but simply convert to Mdapi for production & sandbox deployment.




          As of Winter '19, this is no longer necessary. You can use the new force:source:deploy to package up your DX-compatible file tree and deploy it without the hassle of force:mdapi:convert. This also basically eliminates the need to have a classic mdapi format, unless you need it for an IDE or some other reason. Regardless, it's still a sound strategy overall if you have a complicated setup, like we do.




          Can leverage scratch org since my code is in SFDX.




          Technically, you could do that with the force:mdapi:deploy command, which works in both scratch orgs and other orgs. There's no specific need to use force:source:push if you don't want to.




          Still happy with the happy soup. No need to spend years on breaking down stuff, manage dependencies, worry about duplicate metadata, package versioning, package versions dependencies, transitive dependencies, over-engineer dependencies injection and all the baggage that comes with DX. Honestly, I don't see we can ever justify that the benefit worths the effort.




          Note that packages are completely optional, and in fact, I'd recommend you stay away from them if you need more than about 5-10 or so, as they quickly start to make a mess of things. For new customers with no pre-existing configuration, I would recommend packages to. May as well start off on the right foot.



          For small-to-medium size orgs, I'd recommend researching if packages are viable or not. For large organizations, like ours, packaging is still pretty much a pipe dream. We might eventually one day start building packages, but many of our features have incredibly complicated dependencies. We can select a single item and end up finding hundreds of dependent items.




          Not have to deal with multiple CI processes, one for package and another for unpackaged, since you can't package everything anyway.




          It depends on how you do your CI. For example, our CI makes diffs between the source and destination, so we never do a full deployment anyways. In our case, packages, then, would be redundant, because we're basically already doing what DX packaging offers. However, even if you just package your core system library, you might still have a considerable savings in deployment time that might justify the CI complexity required.



          But, there's always a trade off. Complexity for deployment time. You have to consider how valuable your time is, because you're going to end up spending it one way or the other. If you have a setup where deployments take only a few minutes anyways, packaging probably isn't worth the complexity.



          One place I worked at had a 45 minute deployment time. If we had DX then, and we could have reduced our complexity such that deployments only took 15 minutes, we would have spent the time to build packages. Imagine a deployment failing 4 times at 45 minutes each (5 total deployments). I had that happen to me, and I didn't leave the office until 2am as a result (deployments started at 10pm).




          Deploy everything at once so deployment is slower compared to the package installation approach.




          Yes, and no. You're really not losing much in most cases, because the duplicate items will be a no-op (they don't really change). In many orgs, the actual deployment time is overwhelmingly unit tests, which packages are not going to help you skip anyways.




          To address more specific concerns...




          No need to spend years on breaking down stuff,




          That's the Salesforce "organic migration" approach. And, as far as I can tell, it might be viable, if you even knew where to start. Most orgs, I wager, have a ton of deeply nested dependencies you simply can't break easily, so you'd end up with either a large core and lots of small side packages, or simply give up and put everything into one big package, which defeats the purpose of it.




          manage dependencies,




          Especially since it's all completely manual. If we had a tool to automate the dependency resolution, it might be ... not as bad. Starting from ground zero, the dependencies would be manageable. For existing bases, especially as large as ours, packaging would be a nightmare. We don't use packaging for this reason.




          worry about duplicate metadata,




          That's actually more of a non-worry, because the deployments always just seem to go okay regardless of duplicates, so long as they don't conflict with each other. Honestly, I was surprised by how DX seemed to do the right thing consistently, as long as I didn't do anything too obviously broken.




          package versioning,




          The system kind of takes care of the versioning for you, so it's not that bad of an issue.




          package versions dependencies,




          Generally a non-issue, because DX does a decent job of managing them for you. It's rare that you'd have to deal with this directly once everything is set up.




          transitive dependencies,




          I'm not sure how this applies in a metadata context. I'd love a concrete example of how this might be a problem.




          over-engineer dependencies injection ...




          I agree, one should not DI just to fulfill a packaging requirement. This is the point of dependent packages, though. However, I do understand that there are situations that could arise where A depends on B, but B depends on A. The typical solution would be to move the common dependencies to C, and A and B would both depend on C. This leads us back to an earlier statement that you'd probably end up with a huge core library and lots of small packages that depend on it, which defeats the purpose of packaging.




          There's one other potentially damaging loss: namespaces. Using packaging namespaces, you can actually help eliminate a lot of the duplicate metadata problems by isolating them and being able to refer to them uniquely. This is comparable to languages like C and C# that have had these features forever. If you find yourself prefixing classes all the time (e.g. Account_Extension, Account_TriggerHandler, etc), using packages might make sense for you, and help isolate code.




          I wouldn't dismiss packaging outright (even we intend to eventually use it, if we can get the features we asked for), but also don't feel bad if you decide not to use it. I feel like you just might want to do some more research before you conclusively say "no, I'll never do this." (your question already reads like foregone conclusion). A lot of the features that exist are promising.



          You can even do pseudo-packaging for right now; set up a bunch of paths to sort your metadata into, but don't actually build the packages. This might save you on deployment times in the future. And you don't need to do it all at once, either, just every time you're in a particular area, start picking out pieces. Do it as part of the normal development cycle. You'll hardly even notice the difference. As a bonus, if you decide to package, you've already done the hard part, and if not, you can use your source tree as is.




          tl;dr



          Ultimately, the choice is yours. One size does not, and cannot, fit all. DX is designed to fit a certain demographic of clients, but it certainly cannot accommodate everyone. And it's still missing features that are critical to using it as advertised. If you don't care for packaging, you're not forced to use it. If you want to use the old mdapi format, or the new DX format, you have that choice (especially with the new force:source:deploy command).



          Please note that DX isn't particularly a selling point, as salesforce.com isn't making any money off this, at least not in the direct sense. They're genuinely trying to make development easier and more manageable, like other modern languages; it's a direct response to the numerous complaints from ISVs, large clients, and community developers at large. DX is a tool, like a hammer or a screwdriver. It's up to you to figure out how you're going to use it, or if it's even the right choice.






          share|improve this answer













          Basically, I am thinking of just store code in the repo using SFDX and use that for Scratch Org but simply convert to Mdapi for production & sandbox deployment.




          As of Winter '19, this is no longer necessary. You can use the new force:source:deploy to package up your DX-compatible file tree and deploy it without the hassle of force:mdapi:convert. This also basically eliminates the need to have a classic mdapi format, unless you need it for an IDE or some other reason. Regardless, it's still a sound strategy overall if you have a complicated setup, like we do.




          Can leverage scratch org since my code is in SFDX.




          Technically, you could do that with the force:mdapi:deploy command, which works in both scratch orgs and other orgs. There's no specific need to use force:source:push if you don't want to.




          Still happy with the happy soup. No need to spend years on breaking down stuff, manage dependencies, worry about duplicate metadata, package versioning, package versions dependencies, transitive dependencies, over-engineer dependencies injection and all the baggage that comes with DX. Honestly, I don't see we can ever justify that the benefit worths the effort.




          Note that packages are completely optional, and in fact, I'd recommend you stay away from them if you need more than about 5-10 or so, as they quickly start to make a mess of things. For new customers with no pre-existing configuration, I would recommend packages to. May as well start off on the right foot.



          For small-to-medium size orgs, I'd recommend researching if packages are viable or not. For large organizations, like ours, packaging is still pretty much a pipe dream. We might eventually one day start building packages, but many of our features have incredibly complicated dependencies. We can select a single item and end up finding hundreds of dependent items.




          Not have to deal with multiple CI processes, one for package and another for unpackaged, since you can't package everything anyway.




          It depends on how you do your CI. For example, our CI makes diffs between the source and destination, so we never do a full deployment anyways. In our case, packages, then, would be redundant, because we're basically already doing what DX packaging offers. However, even if you just package your core system library, you might still have a considerable savings in deployment time that might justify the CI complexity required.



          But, there's always a trade off. Complexity for deployment time. You have to consider how valuable your time is, because you're going to end up spending it one way or the other. If you have a setup where deployments take only a few minutes anyways, packaging probably isn't worth the complexity.



          One place I worked at had a 45 minute deployment time. If we had DX then, and we could have reduced our complexity such that deployments only took 15 minutes, we would have spent the time to build packages. Imagine a deployment failing 4 times at 45 minutes each (5 total deployments). I had that happen to me, and I didn't leave the office until 2am as a result (deployments started at 10pm).




          Deploy everything at once so deployment is slower compared to the package installation approach.




          Yes, and no. You're really not losing much in most cases, because the duplicate items will be a no-op (they don't really change). In many orgs, the actual deployment time is overwhelmingly unit tests, which packages are not going to help you skip anyways.




          To address more specific concerns...




          No need to spend years on breaking down stuff,




          That's the Salesforce "organic migration" approach. And, as far as I can tell, it might be viable, if you even knew where to start. Most orgs, I wager, have a ton of deeply nested dependencies you simply can't break easily, so you'd end up with either a large core and lots of small side packages, or simply give up and put everything into one big package, which defeats the purpose of it.




          manage dependencies,




          Especially since it's all completely manual. If we had a tool to automate the dependency resolution, it might be ... not as bad. Starting from ground zero, the dependencies would be manageable. For existing bases, especially as large as ours, packaging would be a nightmare. We don't use packaging for this reason.




          worry about duplicate metadata,




          That's actually more of a non-worry, because the deployments always just seem to go okay regardless of duplicates, so long as they don't conflict with each other. Honestly, I was surprised by how DX seemed to do the right thing consistently, as long as I didn't do anything too obviously broken.




          package versioning,




          The system kind of takes care of the versioning for you, so it's not that bad of an issue.




          package versions dependencies,




          Generally a non-issue, because DX does a decent job of managing them for you. It's rare that you'd have to deal with this directly once everything is set up.




          transitive dependencies,




          I'm not sure how this applies in a metadata context. I'd love a concrete example of how this might be a problem.




          over-engineer dependencies injection ...




          I agree, one should not DI just to fulfill a packaging requirement. This is the point of dependent packages, though. However, I do understand that there are situations that could arise where A depends on B, but B depends on A. The typical solution would be to move the common dependencies to C, and A and B would both depend on C. This leads us back to an earlier statement that you'd probably end up with a huge core library and lots of small packages that depend on it, which defeats the purpose of packaging.




          There's one other potentially damaging loss: namespaces. Using packaging namespaces, you can actually help eliminate a lot of the duplicate metadata problems by isolating them and being able to refer to them uniquely. This is comparable to languages like C and C# that have had these features forever. If you find yourself prefixing classes all the time (e.g. Account_Extension, Account_TriggerHandler, etc), using packages might make sense for you, and help isolate code.




          I wouldn't dismiss packaging outright (even we intend to eventually use it, if we can get the features we asked for), but also don't feel bad if you decide not to use it. I feel like you just might want to do some more research before you conclusively say "no, I'll never do this." (your question already reads like foregone conclusion). A lot of the features that exist are promising.



          You can even do pseudo-packaging for right now; set up a bunch of paths to sort your metadata into, but don't actually build the packages. This might save you on deployment times in the future. And you don't need to do it all at once, either, just every time you're in a particular area, start picking out pieces. Do it as part of the normal development cycle. You'll hardly even notice the difference. As a bonus, if you decide to package, you've already done the hard part, and if not, you can use your source tree as is.




          tl;dr



          Ultimately, the choice is yours. One size does not, and cannot, fit all. DX is designed to fit a certain demographic of clients, but it certainly cannot accommodate everyone. And it's still missing features that are critical to using it as advertised. If you don't care for packaging, you're not forced to use it. If you want to use the old mdapi format, or the new DX format, you have that choice (especially with the new force:source:deploy command).



          Please note that DX isn't particularly a selling point, as salesforce.com isn't making any money off it, at least not directly. They're genuinely trying to make development easier and more manageable, as other modern languages have; it's a direct response to the numerous complaints from ISVs, large clients, and community developers at large. DX is a tool, like a hammer or a screwdriver; it's up to you to figure out how you're going to use it, or whether it's even the right choice.

















          answered Sep 2 at 6:11









          sfdcfox

          227k10174389


























              up vote
              1
              down vote













              This is a comment on @sfdcfox's reply.



              I appreciate the serious thought you put into this thorough answer, and I agree with pretty much all the points. I think you are not speaking the same way as the DX team: they almost make it sound like it is the future for everybody. I'd like them to put up banners highlighting the target audience.



              You can tell that with VS Code, they leave almost no way for teams to continue with the mdapi format; i.e., either switch to DX or use other, unsupported IDEs. I'd like to see them continue supporting the mdapi format in VS Code.



              They did at least make the right move in allowing VS Code to deploy to a non-scratch org with force:mdapi:deploy in Winter '19, but the mdapi code format needs to be supported too.



              I like how you think about complexity vs. deployment time. My team has always opted for low complexity with higher deployment time, and that has worked really well. I'm against diff/partial or incremental deployments. We have a CI deployment from master to production every day, and we deploy everything from master, not a selective subset. That gives us confidence that the repo is the source of truth, without ever worrying about overwriting newer changes in production.



              You are right that there isn't much time saved in deploy-all vs. package install, since the majority of the time is spent running tests. So I can take that off the list of benefits of using packages.



              I also think that most developers should take a step back and evaluate like you do before forcing something on themselves, and not be so excited about things like force-di. It's an elegant pattern from other stacks (e.g. Spring DI, Guice) but is just not a good fit for Salesforce.



              "They're genuinely trying to make development easier and more manageable, like other modern languages"



              OK, so this is good intention, but I'd have expected them to approach it with simplicity and tackle the problems at the core rather than putting band-aids and wrappers around the real problems. They could start small with things like optimizing their internal systems to make tests run faster, without putting the complexity of managing packaging on their customers. Surely they can figure out which classes haven't changed and skip those tests, for example. Perhaps simple measures like that could reduce test time dramatically without customers spending years on modularization. They could also have added simple support in the MDAPI for broken-down .object files, without introducing a whole new source format, a conversion step, a CLI, and a new IDE that supports only certain bits. Nobody seems to care about simplicity and elegance.



              Finally, I like Salesforce as a company. I just think DX is currently in such an awful state and that should be acknowledged more often so people can make informed decisions.



              Your answer is super helpful and on point. I'll stay away from packaging and hope to gain something from using DX with the happy soup.



              Thanks.








                  edited Sep 2 at 22:06

























                  answered Sep 2 at 8:20









                  codeinthecloud

                  894































                       
