
  • Always Build in a Sandbox. (Except When You Don't. Or can't.)

    It’s a mantra oft-repeated: “Never build directly in production.” If you’re an admin, you should be building new automation, new objects and fields, or otherwise messing with your metadata only in a sandbox environment. And I agree. I regularly admonish my clients and my mentees to build only in a sandbox and to test thoroughly before deploying to production. It takes extra time but it’s an important safeguard.

And yet... I’ve already argued that keeping some test data in production makes sense. I might as well lose the rest of my credibility right now: There are times to build directly in production. Honestly, we all do it. Except in the strictest and most regulated organizations and industries, it’s not even that uncommon. Sometimes “building” in production is reasonable, or perhaps even necessary.

What Constitutes “Building” Anyway?

I don’t think anyone would argue that it’s wrong to make a new report directly in production. [Would you???] It’s already a constant battle to get users to learn to edit and build their own reports. If you’re going to insist they be built in a sandbox, you’re mandating that reports are only built by your Salesforce admin team. I believe in teaching my users to fish. Of course, anything one might say about building reports is going to apply to dashboards as well.

I can see an argument that reports and dashboards fall into a strange gray area: they are, themselves, metadata, but they’re so dependent on data that maybe they’re a different category. Plus, reports and dashboards are about showing the data that’s in the org; they’re not automation, or changes to the user interface, or the like. But the Lightning Report Builder has the ability to make row-level formulas. (In the past that would have required an admin to build a formula field for a specific reporting need, so bring ‘em on!) If it’s OK to process a formula in the context of running a report, that seems a very short distance from actually creating a formula field, no?
That's why I would say that creating a formula field directly in production seems fairly reasonable. [Brief aside: If you’re doing this, take steps to test and validate before you let users see the new field: Uncheck the box that adds the field to page layouts. Maybe restrict field-level security until it’s ready. You can look at the field in the meantime using a report. And note that modifying an existing field is probably a category of its own, because that has potential spillover effects. Then again, if you find that a formula field is broken, I’d say working on a fix directly in production is fairly reasonable.]

Here are a bunch of purely metadata modifications that I’m going to argue are safe to do directly in production:

- Edit page layouts and compact layouts
- Modify Lightning Record Pages
- Make a new Quick Action

Those are all changes to the user interface but don’t impact data itself. Sure, depending on how extensive the page modifications are, you could confuse users a bit. Try to make things obvious or else give a heads-up. It’s OK to “move people’s cheese” as long as it’s still somewhere in the same fridge they last saw it. In the same vein, modifying a display-only screen flow should be safe. Or, for that matter, editing the display portions (only!) of a flow that also runs data operations. Go ahead and fix that typo you noticed.

Similar to my argument for formula fields, I think you could make a new rollup summary (traditional rollup summaries, DLRS rollups, or NPSP Customizable Rollups). You’re going to learn more testing the rollup against the messy real data that’s in production than against what you have to work with in a sandbox. As above: keep the field hidden from users until you've perfected it.
And here's a good tip from Stephanie Foerst for a good time to build directly in prod: “When there's a production issue and you can easily resolve it.”

Modifying Metadata in Production

So we’ve determined that the difference between “data stuff” and metadata is not going to provide a bright-line guideline. Sorry. If it’s a matter of degree, not type, I think we’re going to have to start considering the impact of your actions. I know: Bummer. Considering the consequences of our actions is sooooooo haaaardddddd. If you’re going to modify metadata directly in production—and I think we all know you are—the best advice I can give is to think about when it’s OK and when it’s not. Ask yourself:

1. Is it gonna break stuff? Things that can break stuff include:
- Validation rules. They don’t just stop users from getting things wrong–they stop integrations and Apex code, too.
- Automation. But here you really have to consider what the automation is doing.

2. Is it going to escape the environment?
- Anything that sends an email.
- Integrations (though sometimes you don’t have a sandbox option on the other end of the integration).

3. Do I need to do a lot of testing that will result in records users might stumble upon?
- New integrations.
- Automation (again).

4. Am I building new and/or self-contained functionality?
- New object? I would normally say that a new object is so self-contained that it makes perfect sense to build in a sandbox. But I asked around and others were pretty OK with building new objects in prod exactly because they are self-contained...
- New fields? Most of the time, work in sandbox. But just one field…? Go for it.

5. Experience Cloud. I feel like this is almost a topic of its own (that should be written by someone that knows Experience Cloud/Communities better than I do). It’s a pain to deploy a built-out Site from sandbox. Making changes in prod (but waiting to publish them) is pretty common.

6. [Special Case] Record Types. This is a very specific use case that Peter Churchill pointed out to me: If you create a record type in production, “they maintain their Ids in each environment, but Sandbox to Prod will create a new Id.” Sometimes it’s easier to build automation based on record type Ids, so this will save you some hassle. Personally, I prefer to have my automation look up record types by API name rather than rely on the Id being constant.

Don’t Punk Yourself!

Here’s the thing about making any modifications in production: They’re not likely to flow down into sandboxes—you’re going to forget to do that. In any environment where there is simultaneous development happening in sandboxes—even if you are the one doing the building in both environments—if you make a modification in production there’s a chance of undoing your work later when you deploy from the sandbox.

Say I have an active sandbox named “mbkdev” where I’m building a child object to contact. In that sandbox I modify the contact page layout to add the related list for this new object. I make no other changes to the contact layout. My work in this sandbox is taking some time, so I haven’t deployed my new object and contact page layout yet. Meanwhile, I get a request for a formula field on contact to translate from Birthdate to display the contact’s age. (I know, you probably forgot there isn’t a standard field for this!) This is an eminently reasonable request and one that I want to deliver quickly. So I throw together the formula field and I drop it onto the page layout. While I’m at it, I decide to move demographic fields together into their own tidy section. Three weeks later we finish testing the work in the mbkdev sandbox and we’re ready to deploy our new object. Since we modified the contact page to include the new object’s related list, we put that layout in our change set.
But the Age field doesn’t even exist in mbkdev, and the demographic fields are still in the (haphazard) positions they held when the sandbox was first created. As soon as our deployment goes live, the contact layout reverts to its state from three weeks back, with only the addition of our new related list. Oops. I’ve just undone my own work. How foolish do I feel? [This is a purely hypothetical situation. I have never allowed this to happen IRL.]

It’s a challenge to keep sandboxes in sync. If you have deployment procedures, use them.
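For the record, the age formula from that hypothetical request can be sketched roughly like this (using the standard Contact Birthdate field; this version doesn’t handle a blank Birthdate, so treat it as a starting point, not a finished field):

```
IF(
  MONTH(TODAY()) > MONTH(Birthdate) ||
  (MONTH(TODAY()) = MONTH(Birthdate) && DAY(TODAY()) >= DAY(Birthdate)),
  YEAR(TODAY()) - YEAR(Birthdate),
  YEAR(TODAY()) - YEAR(Birthdate) - 1
)
```

The IF checks whether this year’s birthday has happened yet, which a simple divide-by-365 approach gets wrong for a couple of days around each birthday.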

  • Setting Up the Free Integration Users

    A couple of weeks ago I was excited to write about the free integration users that Salesforce has granted to all organizations. As of that writing the licenses were still rolling out, so nobody had really had a chance to use them. Now that they’re available, there’s been a great deal of discussion/confusion about how to actually make them work. If you wade through the Trailblazer Community thread in that link, you can figure it out for yourself. But I thought it would probably be helpful if I wrote up clear instructions.

As it turns out, the free integration user licenses are free like a beer. But it’s like they’re a beer bought by your high-maintenance friend. Perhaps after you’ve helped him assemble Ikea furniture. Or helped him move. Or both. That is to say, it’s a bit of work to use these free licenses. The steps you have to take are:

1. Create a user with the Salesforce Integration user license and the only profile you will have an option for once you have selected that license: Salesforce API Only System Integrations. (Or modify an existing user to have this license and profile.)
2. In that user record, under Permission Set License Assignments, assign the Salesforce API Integration PSL. (I do not understand what a permission set license is or does. But without it you can’t assign a permission set in Step 3 that grants the access you are going to need. I know you have to do this because of the Help article about the integration user licenses.)
3. Create a permission set (or permission set group) that grants access to the right objects and fields.
4. Assign that permission set to the user.

That sounds pretty simple in black and white right there. But Step 3 is going to feel like assembling the worst Ikea project ever.

Permsets are the Future

What makes using the new Salesforce Integration user a challenge really comes down to the fact that integration user licenses come from a future in which permissions no longer reside on profiles.
So you have to assign a permission set in Step 3, or else the integration user can’t access any objects or fields. But you and I don’t yet live in Salesforce orgs where all permissions are granted via permission sets ("permsets"), so we don’t have a permission set ready.

In January 2023, Salesforce announced, via a blog post from Cheryl Feldman, that permissions on profiles will reach end of life at the Spring ‘26 release. We’ve known for a couple of years already that permission sets and permission set groups were the future of user management, but Cheryl’s announcement put a deadline on the transition to focus our minds. The simple summary of that transition is that in the future every user will have a profile that grants basic login rights and a small handful of deep system privileges, but all permissions related to object and field visibility will be layered onto users via permission sets. (Probably, users will get one or more permission set groups, which allow you to group permsets and then grant them all to a user at once. But it’s easier to discuss just in terms of permsets.) This is a better way to manage user access by the principle of least privilege, in which you only give people access to those parts of Salesforce they need to do their job.

Most organizations today, particularly the smaller nonprofits that I’ve worked with, have a couple of profiles that grant wide permissions. Even if they are given different profiles, program users and development users can see all the same objects and fields. The difference in the profiles may be that their page layouts or app defaults are different, but fundamental permissions are the same. Honestly, I suspect that most organizations probably use just a single profile (other than sysadmin). And for most of the rest, that have two or three profiles, a side-by-side comparison would show very little difference between them.
It’s just rarely worth the effort to make the profiles very different in a small org, as most people will need to see both program and development data. Since users can have only a single profile, what would you make someone that needs to see both? (Please don’t say, “We just make them a sysadmin.”) You would need either a “program and development” profile, or you would have to manage both the profiles and some permsets for granting the other set of permissions. There just isn’t enough time in the day to put a ton of effort into ensuring that we have profiles with minor differences in object and field permissions.

And as of right now, Salesforce is still mostly built to accommodate using profiles as the main differentiator. Permsets have existed for a while, but they’ve generally been secondary to profiles that hold the bulk of permissions. (For example, when you install from the AppExchange, the installer offers to "install for all profiles," with no options relating to permsets.) Even if you want to be forward-thinking about using permsets, it’s still a little harder to manage. That was my long way of saying that, in my experience, most orgs manage user permissions through profiles primarily, if not exclusively. Even those of us that are interested in moving toward the future have probably taken only baby steps along that path.

Licenses from The Future

So, back to the free Integration User licenses. It’s not just the licenses that appeared in orgs last month; there is also a profile to go with them, called Salesforce API Only System Integrations, which is the only profile you can assign to the integration user. And this is a profile from The Future: It can’t have object permissions. If you clone that profile and try to add, say, Read and Edit on Accounts, you’ll find that object settings for Account simply aren’t there.
The new Integration User license can only take the Salesforce API Only System Integrations profile, and that profile can’t be given access to any of the objects and fields you need it to see. That’s because, like I said, it’s from the future. (Just be glad it’s not here to assassinate us to prevent a future rebellion.) Fortunately, the user profile from the future can be granted permission sets. So all you need is a permission set that grants access to the objects and fields your integration user is going to need.

Depending on what that integration user does, that might be a short or a very long list. If I’m setting up the integration user for a form tool, for example, I expect that it’s eventually going to need access to most, if not all, of the same objects that power users need. Just think about it, I could make a form for:

- Donations, which, depending on the complexity of the need, could require access to Account, Contact, Opportunity, Payment, Campaign, Campaign Member, GAU, GAU Allocation, Product, Pricebook, Task, User (to assign the task), and probably several more.
- Program Registration, which would require access to our custom objects for Program, Enrollment, etc…
- Surveys, which would require access to Contact, Survey, and possibly several more.

So you can see that for some integration users you’re going to need a permset that grants a lot of access, as much as (or possibly more than) some human users need. Even if you are super-conscious of security and only add iteratively to the form tool integration user’s permissions as you build each form, the final result is going to be a pretty extensive permset. And it's probably not one you already have.

Prepare for the Future Today

So we have the reality that at least one of your integration users’ permission sets is going to be quite wide.
Let me add the other consideration that a whole lot of orgs today have integrations logging in as sysadmins (either an integration user on a sysadmin profile or—worse!—sharing the login of a person who is a sysadmin). I would, therefore, argue that anything we do to grant permissions to the integration user granularly is going to be a security upgrade, even if “granularly” still starts from a large pile of permissions. So as you set up your permission set to make the integration user work, think about it as preparing your org for the user management regime of Spring ‘26 and beyond. That means you’re going to make a permset for the integration user that can serve as the foundation of your permsets for human users.

Permission to Build in Production [Temporarily] Granted

I think I’m pretty consistent in reminding people to only build in sandboxes, never directly in production. (Though I’m also realistic and think there are certain changes that are perfectly reasonable to make directly in production. I should probably write a future post on that...) Unfortunately, you simply can’t realistically work in sandboxes for this purpose, because of the way profiles and permission sets deploy.

When a profile or permset is deployed via a change set (or other deployment management tools), the only parts of it that actually deploy are those that relate to the other metadata deploying with it. That’s pretty interesting, if you think about it, because it means that Salesforce doesn’t just deploy a file for the profile or permset; it actually compares what it’s uploading to what is already there and only edits or inserts the relevant parts, leaving the rest of the file alone. This functionality supports deployments coming from people working in different sandboxes. If it didn’t work that way, for example, then Jodi would deploy her new custom object, Cars, and a modification to the Program Manager profile granting access to Cars and its fields.
An hour later, when Aaron deploys the flow he’s been working on (in a different sandbox) that works with fields only on Contact and Account, his Program Manager profile is coming from an environment that doesn’t have Cars. Aaron’s deployment would overwrite what Jodi deployed, removing the Program Manager profile’s access to Cars.

So it’s usually quite handy that Salesforce deployments of permissions relate only to the metadata that comes along. But if you are trying to build a permission set that grants access to all objects, all fields, all tabs, and all record types, you would have to build up a change set that also includes all of those things. First, good luck ensuring that you get every relevant object and field into your change set without missing something. Second, the changes you send with all that metadata may overwrite or revert things that have changed in production and are out of sync with the sandbox you were working in. (I know you should have procedures for deployments to avoid that kind of overwrite, but it’s a lot harder to ensure it doesn’t happen when we’re talking about every object and field, which includes all descriptions, all help text, etc.)

Copado Essentials (formerly ClickDeploy), my deployment tool of choice, has a “profile only deployment” option. As I understand it, that means you add all the other metadata to your deployment to indicate the parts of the profile to send, but when it actually is sent, only the profile moves over. Interesting. But there is no such thing as a “permission set only” deployment. I hear that Gearset has the ability to do a permission-set-only deployment, but I couldn’t figure out how. I don’t think Salesforce’s native change sets allow for either of these options. Besides: Have I mentioned my skepticism that you would manage to add all the relevant related metadata to your change set without missing something?
Copado Essentials makes it pretty easy to Select All, and I’m still paranoid that something would be missed. Adding all the metadata into a change set via the native Salesforce change set tool is too painful to even contemplate. So you’re going to build your new permset in production.

Building the Standard User Permset

Now I have bad news for you: It’s going to take hundreds of clicks to build out your permission set. Maybe more than a thousand. If you work in just one org, at least you can take comfort in only clicking hundreds of times once. A solo consultant like me gets to do it for each of my client orgs. Ouch. Worse yet, I determined the hard way that you have to do this work directly in production. By “the hard way,” I mean that I did the hundreds of clicks in a client sandbox, with the intent of testing that the integration user had all the permissions it needed before I moved to production. Then I found that I couldn’t deploy the permset and had to rebuild it by hand in their production org! Double ouch.

Hopefully sometime in the next few years Salesforce will put out tooling that makes this easier. (I know that Cheryl Feldman and her team are already working on some of it.) But unless you want to wait for better tools before you use the integration licenses, you’re going to have to go through this pain now. (And having done so, you might not even need the better tools later. Womp womp. 😞)

As noted above, I think at least one of your integration users is going to need permissions similar to a standard user’s (or possibly a little more), so I’m going to write these instructions on the assumption that you are building a single base permset that has quite a lot of object and field permissions. If you have only integrations with limited permission needs, you should build them very limited permsets (again: the principle of least privilege).
But if you have at least one integration user (like your form tool) that needs a lot, this is the time to build a wide baseline permset. It’s easier to clone that permset and edit it down to make less-privileged versions later. Here’s what you need to do:

1. Make a new permission set. (Setup > Users > Permission Sets > New) You can call it something like “MyOrg Standard.” I always recommend a Description. (Help Future You remember how this permset is used.) Do not associate it with a license type—leave that picklist blank.

2. In the permission set, go to Object Settings. (I’m assuming you’re using the Enhanced Profile User Interface. If you are not, go immediately to Setup > Users > User Management Settings and move the slider to Enabled. I don’t know how anyone works with the classic profile/permset editor!) Here you will have a list of all the objects in your org. (You'll also see a bunch of things such as “App Analytics Query Requests” that are listed as objects but maybe aren’t quite? I don’t understand it. Just ignore those.)

3. Open one of the objects, perhaps in a new tab. Let's take Accounts as our example, since it's at or near the top. This permset is going to need at least Read access for every object the integration user might touch, including Accounts. Given that the integration probably inserts and/or updates data, I think you probably want to grant Create, Read, and Edit (“CRE”) for each of those objects. (Most integrations and most standard users probably don’t need to delete, so we’re not going to grant that permission.) You are also going to need to grant Edit for most fields on each of those objects. And for fields that aren’t editable (like formula fields) you need the integration user to have Read access. This is where the enormous amount of clicking comes in, as there is no Select All button. Sigh. [And to make things worse, the field-level security boxes are tiny and low contrast, so it’s hard to tell which Edit boxes are grayed out and which you want/need to click. It's hard enough for me. I have no idea how people with vision problems manage to use this interface.] I’m going to be honest here: I just checked every single box on every single object. In theory I should consider the purpose of each field and decide whether this permission set needs read or edit access to it. But that would take forever. It’s just not reasonable in this context. It’s one thing to carefully consider field-level security by user persona as you create a new field or three, but it’s exponentially more difficult when you are talking about all fields on all objects at once.

4. Don’t forget to also give visibility into the object’s tab, if applicable. Admittedly, the integration user probably doesn’t need the tab visibility, as it doesn’t use the UI, but I think it’s worth the click now, while you’re already here, in order to make this a permission set you can use for people in the future.

5. Similarly, assign all record types to this permset. Again, any given integration might need only one or two record types, but if this is going to be the basis for a permset used by humans later, they’re probably going to need them all. Save clicks later by making this The Big Full Access Permset. It will be easier to narrow things later.

6. When you’ve done this for every object your integration user (or future humans) might need, you can stretch out your mouse hand and reward yourself. 🧁

My method was to open the Object Settings and option-click a dozen or so objects into new tabs. Then I went through the tabs clicking Edit on each. Then I worked my way down the line of tabs doing all the clicking, saving, then closing that tab. When I ended up back at the objects list, I refreshed, then started again from the bottom-most object that needed access but didn’t have it yet.
You may want to listen to a podcast or music while you’re doing this mostly-mindless clicking work. 🎧

Pro Tip: Work on a permset that is not yet part of a permission set group. That will allow you to save much faster. Permsets that are in groups need extra time to process a save, because the permset group also recalculates. I found that if my permset was in a group I couldn’t really work in different tabs, because I was faster than the recalculation.

Further discussion: I just described creating a single master permission set. In theory it would be better to create permission sets either for single objects or at least for clusters of objects that go together for individual bits of functionality, and then to group those into a permission set group. But as I think I’ve said multiple times, “Who has the time?” Building permsets up like that is great in theory and may be workable as you are implementing a brand new org, but it’s unrealistic when you’re talking about an org that’s already in use and a small Salesforce admin team. If you want to just get integration user licenses working, this is the baseline for how you can do it.

It's Worth It

This is clearly a ton of work to get set up the first time. But remember that you're laying the groundwork for a user management and security transition you have to make in 2026 anyway. Plus you're getting to use free integration users and save your paid (or granted) licenses for people!
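If you’re curious what all that clicking produces under the hood, each box corresponds to an entry in the permission set’s metadata. Here is a tiny, hypothetical fragment of PermissionSet metadata (the label and field choices are made up; this is an illustration of the shape, not a deployment recipe, given the deployment caveats above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<PermissionSet xmlns="http://soap.sforce.com/2006/04/metadata">
    <label>MyOrg Standard</label>
    <!-- One of these blocks per object: CRE, no delete -->
    <objectPermissions>
        <object>Account</object>
        <allowCreate>true</allowCreate>
        <allowRead>true</allowRead>
        <allowEdit>true</allowEdit>
        <allowDelete>false</allowDelete>
        <viewAllRecords>false</viewAllRecords>
        <modifyAllRecords>false</modifyAllRecords>
    </objectPermissions>
    <!-- One of these blocks per field -->
    <fieldPermissions>
        <field>Account.Description</field>
        <readable>true</readable>
        <editable>true</editable>
    </fieldPermissions>
    <!-- Tab visibility, for the future human users -->
    <tabSettings>
        <tab>standard-Account</tab>
        <visibility>Visible</visibility>
    </tabSettings>
</PermissionSet>
```

Every object gets an objectPermissions block and every field gets a fieldPermissions block, which is exactly why the permset balloons to thousands of lines and why the clicking takes so long.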

  • Tools I Love: DLRS (the Declarative Lookup Rollup Summary tool)

    Let me introduce you to my friend, DLRS, pronounced either “dee-el-arr-ess” or, my preference, “Dolores.” That’s not its real name. It’s really the Declarative Lookup Rollup Summary tool. I think it’s more fun to call it Dolores because it helps me feel closer to this amazing, free, and powerful resource. DLRS was originally created by Andrew Fawcett to fill some gaps in Salesforce functionality.

🎂 DLRS is turning ten years old this month! That's something to celebrate!

DLRS gives you rollup summary superpowers that no nonprofit admin should be without. But before I get into the superpowers, let me help you with some DLRS context, like what its name means:

Declarative

This is the easy part. In fact, what it really means is “this tool is for everyone, it’s easy to use.” Declarative means that something can be done with clicks, not code. So from the start, we know that DLRS is built to allow you to do things that before would have required you to write code. (Quick side note: With the incredible growth in power and flexibility of Flow, you could accomplish most of the functionality of DLRS with Flow, which is technically a declarative tool. But that ability comes from the fact that Flow has grown to straddle the declarative vs. development line. I consider it more work to build and maintain rollups via Flow than with DLRS, so I wouldn’t recommend that route. More on that below.)

Lookup

Lookup refers to the fact that DLRS can perform rollup summary operations across a lookup relationship in Salesforce. This is a simple relationship between records, where one has a reference to the other.
Without getting too deep into it right now, there are times when a lookup makes more sense than the tighter relationship of master-detail, often known as “parent/child.” Master-detail relationships are not generally reparentable, usually involve cascade delete (if the parent is deleted, so are all the children), and also have implications for Salesforce record privacy (anyone that can see the parent can see the children). Standard Salesforce rollup summary fields (“RUS fields”) can only be built across a master-detail relationship. DLRS allows rollups across the less-close lookup relationship.

Rollup Summary

Out of the box, Salesforce can make rollup summary (“RUS”) fields that show summarized information about child records, such as counting them, finding the largest (MAX) or smallest (MIN), summing, or the like. RUS fields generate some of the most interesting insights about your data, such as number of people enrolled in a program, attendance percentages, number of people in a household, etc.

OK, that’s what the name means. But the most important thing to know about DLRS is that it gives you rollup summary superpowers. And those are powers you might want to use on master-detail relationships as well.

🔋 Powers DLRS Has that RUS Do Not

Whereas standard Salesforce RUS fields are limited to count, sum, min, and max, DLRS adds quite a few operations you might want, including the ability to concatenate text. Two of DLRS’s powers are worth calling out separately:

Relative Date Filters

One of the most common requests for a rollup summary is a relative date filter such as “this year.” Perhaps you want to count applications “this year” or purchases “last month.” You can do that in a Salesforce report easily. And you can include relative date filters in formulas as well. But you can’t filter the records you’re going to summarize in a standard RUS field with relative date language.
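In DLRS, that kind of filter goes into the rollup’s Relationship Criteria, which takes a SOQL-style WHERE fragment, so SOQL relative date literals work there. A hypothetical example for a rollup that counts this year’s closed-won opportunities might look something like:

```
StageName = 'Closed Won' AND CloseDate = THIS_YEAR
```

(The field names are illustrative; check the DLRS documentation for the exact criteria syntax for your rollup.)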
Averages

Standard RUS fields can count or sum, but average is not an option. That means the out-of-the-box way to calculate a rollup summary average in Salesforce requires the admin to build three fields: one to count child records, one to total the value in the child records, and a third formula field that divides the total by the count. Obviously this isn’t the end of the world, but DLRS allows you to accomplish an average with just one field, saving time and effort.

So even if you are dealing with rollups across a master-detail relationship, DLRS is a great addition to your toolbox.

Why DLRS and Not [fill in the blank]?

There are two other tools people often suggest to meet a similar need as DLRS.

Rollup Helper

Rollup Helper is a freemium AppExchange tool. Nonprofits can have up to three free rollups permanently with Rollup Helper. Also, being a commercial product, Rollup Helper has a user-friendly interface and, like DLRS, it can work across lookup relationships and do more kinds of calculations. Honestly, I have nothing bad to say about Rollup Helper. I just prefer to save money for nonprofits, so I’ll go with DLRS every time.

Flow

As I mentioned above, Salesforce Flow has become a very powerful tool, with most of the capabilities of code. It’s not that hard to build a record-triggered flow that loops through related records to summarize them. One major limitation here is that if you want results in real time you will have to build at least two flows: one on create or update of a “child” record, and one on delete. As of now (and I don’t think there is any roadmap for change any time soon), a record-triggered flow can have either a create/update context or a delete context, but not both. I also think that building and testing a flow (or modifying an existing flow) each time you find a new rollup summary need is significantly more cumbersome than building a DLRS rollup.
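To picture the three-field average workaround described above: the third field is just a division formula along these lines (field names here are hypothetical):

```
IF(Number_of_Gifts__c > 0,
   Total_Gifts__c / Number_of_Gifts__c,
   0)
```

A DLRS rollup with the average operation collapses those three fields (and the two rollup definitions behind them) into a single field.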
Things You Might Want To Roll Up 🗞️

If you aren't already brimming with ideas of things you want to roll up using DLRS or standard rollup summaries, let me try to jump-start some thoughts:

If you have students applying to colleges (or high schools) you'll probably want to know:
- How many applications does this student have?
- How many acceptances does this student have?
- How many students have applied to this school this year?
- How many applied last year?
- How many were accepted/waitlisted/rejected this year/last year?

If you rescue animals you might need to know:
- How many cats/dogs/rabbits are at each shelter?
- What is the average length of stay in shelter?
- What percentage of animals are awaiting urgent veterinary care?

If you train teachers, you might want to count:
- Enrollments in your mentorship program.
- Completed/incomplete mentorships.
- Teachers at each school that have completed your program.

If you raise money for environmental causes you might need to know:
- How many donations have gone to each nonprofit partner?
- How many unique donors have given to each partner?

If you run a women's shelter you'll probably need to know:
- How many times has a client been in shelter?
- What is her average length of stay?
- How many services did she access while in shelter?

This list barely scratches the surface, of course. Your own program is going to determine the fields you need. If you want even more inspiration, look at the DLRS Cookbook, which has example rollups and instructions for building them. [Full disclosure: I helped write that cookbook.]

One More Thing To Love: DLRS is now part of Open Source Commons

Andrew Fawcett and a handful of helpers kept DLRS up to date entirely on a volunteer basis for many years. We all owe them a huge debt of gratitude! "This tool was initially a solution looking for a problem and in part a technical demonstration of the Metadata API," says Andrew Fawcett.
"And wow did it ever find its place as a solution for many of its users! At its heart at the very start was a desire to continue to do what the platform does best: democratize capabilities—and that continues to make it so popular today."

In order to ensure the continued development and support of DLRS, in 2022 Andrew asked if people could step in to take over DLRS. Now it's become part of the Open Source Commons program. That means this app has a formal structure to support it, expand it, and ensure that it will forever be free for nonprofits and anyone else that wants to use it.

"Thank you everyone for your support and encouragement over the past 10 years, I cannot wait to see what it looks like in 10 more years!" - Andrew Fawcett

I'm proud to volunteer my time for the DLRS project, helping with writing and editing documentation. It's one of the ways I try to give back to the amazing community of Salesforce professionals that have helped me so much.

Go Out and Try It!

I hope I've convinced you that DLRS is a tool you want in your toolbox. Be warned: as with any tool, you're going to have to learn how to use it. (Even wielding a hammer is not as simple as just grabbing it and swinging!) I think it's beyond the scope of my blog to make this or a future post into a full-scale How To Build Your First DLRS. But don't worry—the DLRS documentation is extensive and the documentation team is constantly working to expand it! Need further help? Just ask in the Community Project: DLRS group on the Trailblazer Community.

  • The TL;DR on the “New Nonprofit Cloud”

⚠️ Warning: Cynicism ahead. I'm going to cut through marketing and spin to call it as I see it. (I hope you already were expecting that from me.)

On Tuesday, announced the "New Nonprofit Cloud." Lori Freeman also posted about the announcement in the Trailblazer Community. Here's the TL;DR, IMHO:

- Over the next few years is coming out with an entirely new suite of technology solutions for nonprofits. These collectively will be known as "Nonprofit Cloud."
- These will not be based on, nor compatible with, the current solutions, including NPSP.
- You need not concern yourself with "Nonprofit Cloud" for the time being.
- Nonprofit pricing does not change. (Ten free licenses, etc…) Except Nonprofit Cloud licenses cost more than standard licenses for your 11th and up.

There is quite a lot to unpack about this announcement, not all of it actually included in the announcement itself. I think it's worth making this a relatively long post to bring all the parts together.

Pricing

In my opinion, the two important things that make Salesforce valuable for nonprofits are the Power of Us program (donating the first ten licenses free to nonprofits) and the additional discounts that apply to all other Salesforce product SKUs. This is not changing. Though I had been assured this would not change before TrailblazerDX, I still felt it was important to hold Salesforce's feet to the fire. I asked a question during True To The Core, seeking commitment at the highest and most public level. (My question, followed by Parker Harris' answer, starts at 21:20.) Parker's answer, Lori's post about today's announcement, and other communications give me confidence that we can count on Salesforce's ongoing support for nonprofits. (As much as we can count on anything a business may say or do.) Pricing after your free licenses is also not necessarily going to be simple, which may partly have to do with's inclusion under Industries.
I don't even know if's people know all the ins and outs of how the licensing–and, therefore, the pricing–works yet. What I understand is that Nonprofit Cloud will be its own SKU that, essentially, bundles a Sales Cloud license, a Service Cloud license, and something-that-gives-access-to-the-Industries-based-functionality. [I don't know the right term for that thing. I don't think the name matters right now.] All ten licenses granted to new orgs on the new Nonprofit Cloud will have that entire bundle. So in that sense, the P10 License Grant has expanded with this new announcement, as that bundle is more expensive than what we have been getting to date and comes with extra tech. (Yay!)

For the 11th license and above, organizations will be able to choose what they want to purchase: Nonprofit Cloud licenses (the whole SKU bundle) will cost $720/year. A Lightning Enterprise Edition license (a "Sales Cloud license," like we usually purchase today) is $432/year. (Neither of these represents any change to current pricing, by the way. What the Nonprofit Cloud SKU encompasses is changing, but the name and price stay the same.)

The uncertainty that I mentioned is that I don't know what functionality users with "just" a Lightning Enterprise Edition license will be unable to access compared to those on a Nonprofit Cloud license. By extension, I know even less what users on Platform licenses would be able or unable to do. We don't know what "Industries" functions such users will lose, much less what parts of the as-yet-unseen Nonprofit Cloud application.

The Technology

News flash: What has just been announced is not yet ready for prime time. I know that you may be surprised to hear that an announcement of a new product from a tech company is not actually ready yet. 😜 You should not plan to migrate your current org to the new Nonprofit Cloud any time soon. There would be no immediate benefits and that migration is going to be a big undertaking.
I fully expect most organizations to put it off as long as possible, upwards of five years, probably more. If your system ain't broke, don't fix it.

If you are about to embark on a net new implementation, you have a tough choice to make:

1. You can implement using NPSP, which has committed to supporting, but which is not being further developed and will eventually be "the old version."
2. You can implement the new Nonprofit Cloud, which will be version 1.0 at best, and might feel more like beta 0.9 for some time. You'll be on the leading edge. (Or possibly the bleeding edge.)
3. Or you can split the difference, implementing on Nonprofit Cloud but building much or all of your functionality custom. That might make it easier to adopt features as Nonprofit Cloud gets more mature. For now you'd be out of step with almost all other nonprofits using Salesforce, who are on NPSP.

That's not an easy choice. I'd be torn between one and three. Not that you can really buy "it" yet; nobody's even seen what "it" is. And when it actually comes out, it's going to be a "minimum viable product." This is going to be the first release of a new product from a software company. Some cynics might even refer to it as a "paid beta." One can hope they'll release a "minimum loveable product," but I'm not going to hold my breath.

What initially rolls out will be program management, then impact measurement, with fundraising not even being released until the fall. I think it's safe to assume none of those releases will have full feature parity with their equivalent current products. (Though they should have some new cool features of their own.) And Salesforce has said that they are not planning a new payment processing platform (like Elevate). I honestly just don't know what to expect out of the actual Nonprofit Cloud product offering. It's barely even available for anyone to get their hands on. There's some way partners can get a learning org or… something… (I haven't even tried.)
Anyway, as noted above, what comes out in the next few releases is going to be minimum viable, so I feel no urgency around it. (I'm sure I would feel urgency if I worked for an independent software vendor (ISV) that had an AppExchange product that I'd need to rebuild to be compatible with the new model.)

The only thing we really know is that Nonprofit Cloud will use Person Accounts. That's a pretty big data model shift, but it doesn't bother me. I used person accounts for one project and thought they were fine. There's a bit we'll all have to learn about how they work (they always act like an account, sometimes like a contact) and they use a bit more storage (because every contact is also an account). But since storage was increased several years ago I don't think this is likely an issue for most organizations. Also, person accounts are apparently the state of the art for things like Financial Services Cloud and have gotten quite a bit more love since Industries became a thing.

I'm firmly planning to ignore Nonprofit Cloud for at least a year, probably two or three. Eventually I hope to look back and find that it's grown up a bit and then I'll start–only start!–thinking about adopting it.

Timing of Announcements (what we knew and when)

Full disclosure here: Lots of people–including me–did not just learn of this when it was announced on 3/14. held embargoed meetings with Salesforce MVPs, partners, and presumably some larger nonprofit customers starting in late fall 2022. (I assume there were people with better access than I have who learned things farther back than that.) made missteps in the communication that caused a great deal of panic. First there were questions about whether the Power of Us donation would go away, or whether some things would no longer be free, or would become more expensive, etc.
Then there was the technical question of what would be compatible, and worries over the cost that organizations will have to bear when/if they migrate to the new thing, not to mention uncertainty over whether or when they would be forced to do so. The layoffs at Salesforce (which at least seem to have impacted disproportionately, though no hard data is available on this) added significantly to the uncertainty and the anxiety.

And, as you may have noted if you listened to my question from True To The Core, the absorption of into Salesforce two years ago, the folding of the Power of Us Hub into the wider Trailblazer Community, the switch to the Industries core architecture, a complete changeover to a new product, and then the layoffs, definitely opened the question of whether nonprofits are being viewed as "just another industry vertical" by Salesforce. I hope that all of my discussion, above, makes clear that I think those fears/questions have been laid to rest. I did not come to this level of detachment right away, so I think it's OK if you don't either. This was one of the most difficult NDA'd pieces of information I've ever held, mainly because it wasn't so much "information" as the opening of a discussion that we were unable to actually discuss. I'm very glad to finally be able to talk about this openly!

What Does this Announcement Signal About

I have never thought that Salesforce–or even–was some kind of altruistic actor. Read Marc Benioff's books or listen to any one of his keynotes: That man is A Capitalist. I think he actually believes himself when he says, "business is the greatest force for good." Salesforce is a capitalist enterprise, a multi-billion dollar corporation that exists only to make money. Do not kid yourself that it is anything else. (Capitalism would not allow it to be anything else. That's what the ism part means.) I applaud the 1-1-1 model and appreciate all that Salesforce does to support nonprofits.
But I am under no illusion that this is anything more than noblesse oblige. I know that used to be a "social enterprise," but maybe I just came along too late in the evolution to be all that impressed by the term. At least by the time I was learning about it, seemed to be a strange kind of hybrid creature that granted some licenses but sold others, had a sales operation that acted exactly like car dealers, and supported interesting events like Open Source Sprints but was completely opaque when it came to organizational structure and goals. So when .org was reabsorbed into .com two years ago, I mostly just shrugged. And it's actually been quite interesting to see how being "internal" has clearly given better access to technology, decisionmakers, and even resources. (Sprints, for example, now take place in Salesforce's offices, usually known as "Towers." That ensures a much better wi-fi environment than hotel conference rooms could be counted on for, makes single-day events doable, and gives us access to great free snacks and beautiful city views.) built and supported some interesting technical products that have been good for nonprofits. They've also built some things my clients haven't bothered to pay for. I've heard criticism that .org builds for the market of larger and better-resourced nonprofits that can pay for things. Well, "Duh!" (It's amusing to me that some of the people leveling that particular criticism are consultants at medium or large consultancies that survive financially by working with organizations of those sizes. Consultancies can't make a profit working with the tiny orgs that are most of my clients, for example.)

Nobody expects Ford to donate a Fiesta to any nonprofit anywhere in the world just for asking. Salesforce does the equivalent of exactly that every single day! Apple donates some hardware to schools, but that's through a formal grant program; it's not available to every school every year.
Schools can buy through the Apple Store for Education at about a 10% discount. Salesforce gives ten licenses in perpetuity to any organization that can wave a 501(c)(3) (or national equivalent) and then discounts 50-80% beyond that for everything else. Neither the automobiles that nonprofits drive nor the computer hardware they use are customized for how those nonprofits do business, at least not unless they pay for such custom work. But gives nonprofits the NPSP/EDA/Nonprofit Cloud/Education Cloud to customize the platform for our needs. If only the Power of Us grant and the discounts remained, I would still consider that pretty great support of nonprofits. So long as in some form is alive and kicking, creating software, supporting Open Source Sprints, bringing nonprofits and educational organizations together, etc., that's a bonus that I am thrilled to benefit from.

  • Actual Free Stuff! Five No-cost Integration Users Announced at TDX23

Good news, nonprofit Salesforce users: We're getting more free stuff! Actually, everyone's getting this free stuff, not just nonprofits. 🎊 In response to a question at True to the Core during Dreamforce 2022, Salesforce has granted five free integration user licenses to every Enterprise Edition org. And just so nobody's confused, let me stress that these are free as in beer! 🍻 If you're integrating more than five systems and need additional licenses, they are very cheap, costing just $120/year (list price). Nonprofits will get the regular 75% nonprofit license discount, bringing these down to just $30/year.

Why is this important?

I mean, any time we're getting more for less, I would consider it important news, considering my personal inclinations. But this announcement in particular means that all orgs can improve security and auditability for integrations without cost being a factor. I think that's something to celebrate! 🎉

What is an Integration User?

An integration user is simply a dedicated login account for an external system that is going to integrate with ("talk to") your Salesforce instance. It's going to communicate only using the API, not the user interface for humans. Think:
- Fundraising platforms (GiveLively, Click&Pledge, Classy, etc…)
- Webform providers (FormAssembly, Jotform, GetFeedback, FormTitan, etc…)
- Middleware (Zapier, Workato, etc…)
- Email marketing platforms (Mailchimp, Constant Contact, etc…)
- Event registration platforms (EventBrite, etc…)

Let's use a fundraising platform like GiveLively as our example. Once you set up your GiveLively account, you are going to have it log into Salesforce to add new users and donations. I (and others) strongly encourage that you do not have GiveLively log into Salesforce as you. And I don't want GiveLively to log in as your development director either. I want GiveLively to log in using an account specifically dedicated to GiveLively.
(Usually First Name "GiveLively," Last Name "Integration," or something like that…) There are many benefits to this, among them:
- Don't make me guess whether that record created at 3:13am was actually the development manager working in the middle of the night! Probably not, but why work on assumptions? And if a record is created or modified during the workday, it's really hard to be sure it was the integration and not the user whose account it had logged in as!
- You can set up the integration user with the "principle of least privilege," meaning that the GiveLively integration user should only have access to the objects and data that it needs, and no more.
- When someone leaves the organization you won't be deactivating the account that makes the integration work.
- When you are no longer using the integration you can shut down its user access and know for certain that it won't be touching your data anymore.

I hope you're convinced that having an integration user is a best practice. And it should be equally clear why the true best practice is to have a single user for each integration. You want to be able to distinguish between the data changed by a person, by GiveLively, and by FormAssembly, even if they all made edits to the same contact around the same time!

Credit Where Credit Is Due

Kudos to Salesforce for their response to the question from Alon Waisman at True To The Core during Dreamforce 2022. Alon noted that having a dedicated integration user is a clear security best practice. But by requiring a full license for those users, Salesforce was making organizations balance security against cost. And when that's the situation, security doesn't win. Co-CEO at the time, Bret Taylor, agreed that this should change.

What's in it for me?

How much this is actually going to save you depends on the size of your Salesforce instance and on the timing of your contract.
Salesforce generally does not allow you to reduce your contract size mid-year, so if you are going to see savings, that won't happen until your contract renewal. And at that point you will have to directly ask to have the number of licenses reduced before the renewal date. By default Salesforce is going to renew all the licenses you currently have active (whether they are currently being used or not).

For Small Nonprofit Instances (P10 Licenses Only)

If you are not currently paying for any Salesforce licenses at all (you have just the ten licenses granted under the Power of Us program), this might not mean any dollar savings. But it does mean that you can now have more human users before you have to pay Salesforce any money. You'll be able to have up to ten human users and up to five integration users, for a total of 15 licenses, before you need to buy anything.

For Nonprofits Over the P10

If you are paying for a handful of licenses right now (your 11th and up), this should save you up to five licenses (that you're paying $432/year for). For most nonprofits I would expect a savings of two or three licenses. Yay!

For Organizations That Do Not Get License Grants

If you work for an organization that doesn't qualify for the Power of Us grant, this could save you quite a lot of money. Let's assume you have just three integrations and that you are paying list price ($1,800/year) for all of your licenses. That's $5,400 that just came off your Salesforce bill with no effort! Most organizations pay a bit below the list license price, but probably also have more than three integrations. The savings add up pretty quickly.

It's a Pi Day Gift! 🥧

The licenses are set to go live on 3/14, the same day I've set this blog to publish. Happy Pi Day everyone! Enjoy some of the delicious baked kind. 🥧
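As a small addendum, the savings arithmetic from the scenarios above can be sketched like this (prices are the list figures quoted in this post; your discounts and integration counts will differ):

```python
def annual_savings(num_integrations, license_price):
    """Yearly license cost freed up by the five no-cost integration users."""
    freed = min(num_integrations, 5)  # Salesforce grants five per org
    return freed * license_price

# The example from this post: three integrations at full list price.
print(annual_savings(3, 1800))  # → 5400

# A nonprofit paying $432/year for licenses past its free ten:
print(annual_savings(3, 432))   # → 1296

# More than five integrations: only five licenses come off the bill.
print(annual_savings(7, 1800))  # → 9000
```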

  • My First Einstein Prediction

[This turns out to be the third in a series of posts. Not that I meant to have a series in the first place. Or if I did, I hoped it would be about the cool things I learned from AI predictions, not a deep dive into installing and licensing... Anyway, this post should stand alone, but if you want the background, read "Cheap (or free?) AI for Nonprofits?" and "Still Trying to Try Einstein." I'll have more detail and some updates here, but I'm not going to rehash everything I wrote before!]

I finally got Einstein Prediction Builder (EPB)/Einstein for Nonprofits to work in a real client production Salesforce instance! Actually, true confession: I pulled that off at the end of November, three months ago. Then I promptly didn't do anything with it (or even write this blog post) because I was lazy. But that, I think, is part of the learning journey we're on together!

How to Install Einstein for Nonprofits

Though there is help documentation, I think it's worth listing out the steps to install while we're at it.
1. Go to Setup > Einstein Prediction Builder, review the terms and conditions, and turn it on. You need to already have Einstein Prediction Builder (EPB) turned on before you can install Einstein for Nonprofits (EFN).
2. Install Einstein for Nonprofits via MetaDeploy. More on this down below, but it's possible you won't be able to install if you don't have Nonprofit Cloud.
3. Navigate to the Einstein for Nonprofits app (see this help page) and click "Setup and Train Models."
4. Navigate back to Setup > Einstein Prediction Builder. You should see three predictions installed now. Now's your chance to [try to] activate one of them.

Quick Sidebar into My Experience Activating Predictions

As I noted in Still Trying to Try Einstein, the installed predictions refused to activate for me in one of my client orgs (the one that has interesting and plentiful data). They were stuck in Pending status.
Support eventually suggested that I clone them and activate the clones. I procrastinated on that for a while because it seemed like it would be a pain, but it actually turned out to be easier than I expected. Einstein takes care of creating a target field for holding your prediction results. You don't have to create a field before you start the prediction builder wizard. Nice! It was a bit of a "gotcha" that the Description field on a prediction can only hold 128 characters. (You don't find that out until you try to save.) Of course, if you have to clone a prediction that failed to activate, you're going to have the target field for the failed one cluttering up your org. Not a big deal, but this kind of thing annoys me. Support tells me that my cloned predictions will use the Einstein for Nonprofits backup models if there isn't enough data in my org. (I have absolutely no way to evaluate whether this is true.)

A Working Prediction!

With my cloned prediction active, let's look at the scorecard EPB provides: It appears that EPB thinks that having a value in Preferred Email (home, work, alternate email selection) is the single most relevant field for predicting if someone is likely to become a first-time donor. I have...opinions...on that. (But what do I know? Maybe the machine learning has spotted something interesting that we humans wouldn't assume. That's the whole point of this exercise, so I'll give it the benefit of the doubt.) I then enabled the prediction and went away for the weekend while it churned through the data. On Monday, a contact report showing my target field had results! In the image, I have grouped the report by the result values of the target field, so the Record Count is how many contacts have that number in the result field. What threw me at first was that a Yes/No prediction ("Is this contact likely to become a first time donor?") results in numbers, rather than true/false.
The numbers are the percentage likelihood (of becoming a first-time donor). That totally makes sense now, but it had been a while since I completed the Einstein Trailhead modules, so I needed a moment to understand it. The real problem is that I'm not sure what to do with this now that I have a distribution. I suppose we could target outreach and fundraising efforts to the 29,000 people that are at least 31% likely to become first-time donors, but that's a lot of people. Far too many for individual outreach efforts... This is the point at which I found myself really stuck. I'm still not sure it's worth showing this to the client in whose org I tried it out because I don't think they have the staff bandwidth to use the information. And I'm not sure how I feel about the quality of the prediction.

Other Predictions

With Try Einstein, the freemium trial, you can have only one prediction activated at a time. My understanding is that you could deactivate this one and activate another one and the values in the target field (what's showing on my report) will persist. So you can switch back and forth to learn from any or all of the predictions you have in your org (those installed by Einstein for Nonprofits, those you create by cloning, or those you build from scratch). However, the inactive predictions won't update as new data comes in. That's probably OK depending on what you might do with the predictions. I haven't actually tried activating the other two predictions that EFN installs. Partly that's because I would have to go through the rigamarole of cloning them first, due to the activation issue I experienced. But also, in that org I am unsure how to evaluate the quality of—or what we might do with—predictions about who is likely to become a Top Donor or a Recurring Donor.

Now Let's Get Into Licensing

Yes, we're going to have to get into the weeds of Salesforce licensing and pricing.
It turns out that Einstein for Nonprofits isn't exactly "free" and isn't exactly available to every nonprofit. Shocking, I know. First of all, according to the person I spoke to at, Einstein for Nonprofits should not install into an org that does not have the Nonprofit Cloud SKU. What constitutes that SKU is hard to pin down (and is probably going to change multiple times). But the quick-and-dirty is that organizations with just the P10 donated licenses do not have Nonprofit Cloud. A Nonprofit Cloud license is more expensive than a regular "11th" Sales Cloud license for a nonprofit, though I haven't been able to learn how much more. The Nonprofit Cloud SKU comes with Einstein for Nonprofits, Accounting Subledger, and Insights Platform Data Integrity, so doing the math from the crowdsourced pricing table, I think it's a good chunk of change.

One strategy might be to buy CRM Analytics+ (CRMA+) [formerly known as "Wave Analytics" and then "Tableau CRM"]. That bundle is reasonably priced, is discounted for nonprofits, and would give you Einstein Predictions and Einstein Discovery. (Einstein Predictions would allow you to build and activate more EPB predictions at a time.) But CRMA+ should not actually allow you to install the EFN predictions. I'm told that Discovery (which is to say, "Tableau") may actually be more insightful for nonprofits than EPB because it allows them to see more of the "why" behind information. Moving more toward selling CRMA+ and helping nonprofits do more with visualizing their data is likely part of's future direction.

Your Installation Mileage May Vary (Mine did.)

Finally, let me note that I was able to install EFN into two orgs that absolutely do not have the Nonprofit Cloud SKU. You might be able to as well... but Salesforce's intention is to fix that. (Maybe they already have.) Consider grabbing EFN while you can! You also might want to clone the predictions in case they disappear if/when Salesforce fixes the licensing tie-in.

  • Naming Convention Flows

Salesforce has two options for naming object records: they can either have a text Name field or be auto-numbered. Accounts, of course, have an Account Name field. Contacts, technically, only actually require LastName, which is obviously text. Opportunities are named. Payments (an NPSP object) are auto-numbered. Auto-numbering is nice because nobody has to think about it, but it's kinda' impersonal. Text fields, however, can be the Wild Wild West. If you've spent any time in Salesforce, I bet you've already encountered this with Opportunities. Some organizations train people to stick to a standard, but more often than not even adherence to a policy is spotty. You can end up with an opportunities list that is just impossible to sort through, whether it's opps that all have extremely similar names or each has a very specific name that somebody painstakingly typed in, whether or not it gives all the information you might want at a glance. (And these were examples from Opportunities, which already lend themselves to some structure around naming!)

We Need Naming Conventions

Sometimes we need records where the name tells us something but where it would be annoying to make your users type it in. We need naming conventions. For example, I have clients creating records of students applying to colleges. Wouldn't it be great to be able to glance at a list view and know most of the important information about that record? Something like:

Harry Potter - Wizarding U - SY24 - Waitlist
Hermione Granger - Wizarding U - SY24 - Accepted

It would be bad enough trying to get users to type all of that. They would never be consistent with school names or school years. And they would definitely not keep the stage up to date! But thanks to Flow, it's also not very hard to do it for them.

Build a Naming Convention Flow

These are some of the easiest flows you'll ever build.
Take the example of the college application tracker: You can actually just have one step. This is a Before Save flow (the setting in Flow Builder is "Fast Field Updates"), which means it runs before the record is even saved to the database and it runs very fast. In this case all you have to do is assign a new value to the Name field of the record. It's going to overwrite whatever value was there with the most up-to-date information. This kind of naming convention is really easy and helpful for all sorts of objects. I use it regularly for applications, report cards, test scores, financial aid awards, and the like. If you want to impose a naming convention for opportunities, you might want to build in more logic, perhaps distinguishing between naming for individual gifts, corporate gifts, recurring donations, etc. That takes a slightly more involved flow, but there's really nothing too complex here. It's just a handful of decisions to determine the type of gift and then apply the right naming based on that. So we're done! Build a quick flow that updates the name–probably in a before-save context–activate, and move on.

One Small Gotcha - The New Button

I do have one pet peeve related to this: The Name field is still going to be required when you click New. 🤦 A really conscientious user (my favorite kind!) is going to see the required New College App field and type out something that matches what they've seen in other records of this kind. And they're going to hate that they have to type so much. A not-so-conscientious user, of course, is just going to put the name of the school, or the student, or the year, but definitely not all three. You can teach people they can just put anything in the field and it will be overwritten, but that offends my sensibilities on more than one level. (Not how it works in other places. Not a good habit to get people into. Unnecessary extra typing...) And you can't even give any sort of on-screen hint about this.
(You can't put help text on a Name field. Probably nobody would read it if you did.) So you have a few choices here, none of which make me happy.

Training - Just tell people that for the object in question the name field is going to be overwritten and they should put any character in that field and move on. I don’t love this option because it relies on them knowing to ignore the field in this case, but you probably don’t want that kind of behavior in other places… You can make things slightly better by setting the record Name to be “_object name_ (filled by flow)” or the like. But then "(filled by flow)" is also going to show up in the headers of your reports and list views, which I kinda' hate.

A Quick Action - It’s very easy to make an object-specific Quick Action and then remove the standard New button on related lists. The benefit of an action is that you get a custom mini page layout for the action, so you can remove the Name field from that layout (which you can’t do from a regular page layout that would be used for the New button). You can also remove other fields that are perhaps not relevant at the moment of first record creation, and you can prefill fields to save your users time. My disappointment with this approach is that you can’t actually have this new action override the New button in related lists. The action will show up at the top of the page on the parent object, next to the Edit button and the like, and you have to remove/hide the New button on the related list. I think it’s confusing for users that some related lists have New right in context on the list and for others you have to look elsewhere on the page. Is this a huge deal? No. Is it poor user experience and user interface design? Yes! And by the way: your Quick Action will be used when creating a record from the related record’s page, but not if you click New on a list view…

A Screen Flow - If you want to get really fancy you could make a screen flow for creation of new records.
Like with a custom Action, you don’t have to ask users for the Name in your screen flow. It’s then possible to create a new list button and reference the flow’s API name as a URL. (/flow/FlowAPIName?InputVar1={!Mergefield1}&InputVar2={!Mergefield2}etc…) Then you can use that list button to override the New button both in list views and related lists.

But there are downsides: By default a screen flow isn’t going to redirect to the created record. (There are ways to fix this using UnofficialSF components, but still…) The screen flow is going to look different than other New button behavior. And the screen flow is also going to run more slowly. Let’s also not forget that building a screen flow takes more than a little work, even for a simple one, and leaves you with yet one more custom thing to maintain.

In real life I’ve usually gone with option #2, but sometimes just lived with #1. Option #3 just feels bad all around.
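At its heart, the assignment the Before Save flow performs is nothing but string concatenation. As a rough illustration only (the field names are hypothetical, and in Flow this would be an Assignment element rather than code), here is the same logic in Python:

```python
def application_name(student, school, school_year, stage):
    """Build the record Name the way the Before Save flow would:
    concatenate the key fields with ' - ' separators."""
    return " - ".join([student, school, school_year, stage])

# Because the flow overwrites Name on every save, the label stays
# current even when the stage changes from Waitlist to Accepted.
name = application_name("Harry Potter", "Wizarding U", "SY24", "Waitlist")
print(name)  # Harry Potter - Wizarding U - SY24 - Waitlist
```

The same pattern works for report cards, test scores, or any object where a handful of fields tells the whole story.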

  • One Form To Rule Them All

One of my favorite ways to bring the programmer’s mantra of Don’t Repeat Yourself! (DRY) into my Salesforce work is to build the One Form To Rule Them All. By playing with URL parameters you can take a simple form and use it more than once:

A single Field Trip Form that can be used for any and all field trips throughout the year.
A single Pre/Post Survey.
A multi-purpose Waiver form.
And so much more.

The trick here—the little mental shift that can really open up possibilities—is that “questions” on the form can be used as the text or content. Let me say that again: Questions on the form can be text, instructions, or content of the form, rather than something you are asking the user to fill out. Then you put the values you want in those questions into the URL that you send out and—BAM!—the form is customized. It saves time (and time is money) building and managing webforms.

Works with Any Form Tool

You should be able to do this with just about any form tool because all you’re using is URL parameters. My tool of choice is usually FormAssembly, but URL prefill is pretty standard functionality and should be available with most form tools. (I’ve written here about FormAssembly’s prefill connector and the amazing powers it can bring to bear. But that’s not what we’re doing here.)

The Master Field Trip Form

Let me use The Academy Group to show an example. They take their students on lots of field journeys and they want to get consent each time, notifying parents/guardians about where the kids will be and what they’ll be doing. It was such a pain to create a whole new form for each trip. But now they have the Master Field Trip form. All they have to do is customize the URL they’re sending out and the form works for every trip. I have a simplified example form for us to look at in my FormAssembly account. If you go to the form without anything added to the URL it’s not very useful.
But if you use this format to customize the URL you can do all sorts of things: [reason for trip]&tfa_23=[location]

Replace the bracketed sections with your desired text and it goes into the fields. (The tfa_# references tell the form which questions to put the text into. You’ll have a similar syntax for other form tools.) You can fill more than just two fields if you need to. (URLs do have a maximum length of 2,048 characters, so don’t write a novel in your prefill. But if you’re being reasonable you can do quite a lot.)

Here’s an awesome field trip that I would highly recommend based on my own recent travels. The URL is: see the ice cave, learn about the unique geology of Iceland, and get an introduction to the Northern Lights. We will eat lunch in the rotating cafe with a view overlooking the city.&tfa_23=Perlán Science Museum, Reykjavik, Iceland

But you can use the same form for a more realistic trip. This URL is: visit a local business.&tfa_23=Mike’s Sandwich Shop, 123 Main St.

You can create all sorts of waivers, permission slips, and notifications this way.

Pre/Post Survey In One

Now let’s look at one more example that can really save time: Use the same form for both a pre- and a post-program survey. To measure program effectiveness you usually want to ask program participants the same questions before and after, in order to measure the effect of your intervention. You might think you would want to build two different forms for this. But Don’t Repeat Yourself! If you have two forms and then you decide to add a question, you have to modify two forms. Instead, if at all possible, I want to use the same form for both pre and post. All that’s going to take is:

A dropdown menu question for Pre or Post. (You can probably hide this question.)
Setting the value of that question via the URL.
(Optional) Conditionally hiding or displaying different instructions, or even questions used only in the pre or post context, based on the value of Pre/Post.
See what I mean here.

Don’t Repeat Yourself in the Connector Either

I don’t want to go too deeply into the step of sending your form data to Salesforce. In fact, this trick doesn’t necessarily assume you’re doing anything more with your form data than having the responses sit on your form provider’s server. But if you are building a connector to send the data into Salesforce, keep the same DRY principles in mind. You can map the question that you prefilled into the created record, for example, as the name of the field trip this permission was for. Or if you need a more succinct name, you could use a hidden prefilled field (“Perlán Trip, Iceland”).
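One practical wrinkle with this trick: prefill text rides in the query string, so spaces and non-ASCII characters (like “Perlán”) need to be percent-encoded or the link can break in some email clients. A minimal sketch in Python, with a placeholder form address and hypothetical field IDs (your form tool assigns the real ones):

```python
from urllib.parse import urlencode

def prefill_url(base, **fields):
    """Append percent-encoded prefill parameters to a form URL."""
    return base + "?" + urlencode(fields)

url = prefill_url(
    "https://example.com/forms/field-trip",  # placeholder form address
    tfa_22="We will visit a local business.",
    tfa_23="Mike's Sandwich Shop, 123 Main St.",
)
print(url)
```

Most form tools, FormAssembly included, decode the parameters automatically on their end; the only thing left to watch is keeping the finished URL under the length limit.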

  • Sprinty's Community Resources

I'm excited to share with you that a new resource launched today, created and maintained by Open Source Commons volunteers. Sprinty's Community Resources is an online library of crowdsourced community content for nonprofits and educational organizations using Salesforce. It's a fast way to find articles and how-tos that others have found useful in the past. Resources submitted by the community are searchable and sortable so you never have to wonder "Is there a good article about the true cost of Salesforce for nonprofits? Where could I find that link?"

Of course, you can also just browse through the listings to learn all sorts of things. And best of all, you can contribute more resources to the list! Simply click on the submission tool and fill out a quick form with a link to whatever podcast, blog post, article, or video you've found helpful. Each resource is reviewed by a volunteer "curator" to ensure quality content is featured on the site.

Sprinty's Community Resources has a Trailblazer Community Group if you want to stay in the loop on updates and ask questions. This project wouldn't have been possible without the vision of Jodi Nemser-Abrahams and Rebecca Tasetano and the input of a team of dedicated volunteers I'm proud to call my colleagues and friends. I hope you, too, will add to the resource collection and even consider getting involved with the team!

  • i++ (or For Loops) in Flow

If you’ve learned any programming (in just about any language) you’ve learned about For Loops: For as long as (i is less than (the point I want you to stop at)) do something. It’s how you do something X times. (Really, i times, but you get me. 😉) Like print out your name 1,000 times so it fills the screen. Really common in programming and pretty easy. And perhaps not that useful, but it definitely has its moments.

But guess what’s not straightforward in Flow? Flow loves to loop over collection variables, where you already have the records you’re going to work through and do something about or to each of them. Most of the time that’s what you’d want to do in a flow, sure. But what if you want to do something a particular number of times? Super easy in programming, not straightforward in Flow. There’s no simple Flow equivalent of “i++”, the syntax for incrementing a counter to control your For Loop. (You could technically make a flow collection variable that holds numbers and loop over it, but you would have to add each number to that collection yourself, which is more effort than it's worth.)

Maybe when you get a new volunteer you want to automatically set up ten check-ins with them to see how their volunteer journey is progressing? Or a table sponsorship entitles the purchaser to 11 additional guests at the gala, for a whole 12-seat round table. There are all sorts of reasons you might need to either do something X times or create X of something. Let’s look at how you can actually accomplish this in Flow. We’re going to look at an example that does something ten times. You could easily replace ten with any other number.

1. The first thing we need is our counter. That’s going to be a number variable that we’re going to use to increment. I’ve called mine “i” for the purposes of this blog.
But I will note that my normal naming convention for variables in Flow is to start them with “var.” So in real life I would call this something like “varCounter” or “varIterationCounter.” But in programming a For Loop we would use just i, so that’s how I’m showing it here.

2. The first actual step in the flow (or this loop portion of the flow) is a Decision. You have to check whether your counter has reached the stopping point: is i < 10, or has it reached 10 or more? Note: I started my counter, above, at zero, so I have to stop one lower than my desired number of iterations. Computers like to count "0, 1, 2, 3, …" If you kept looping through i = 10 as well, you’d have done the thing eleven times; checking i < 10 gives you exactly ten passes. If this confuses you, you could start your counter at 1 instead of zero (set the default above to 1, rather than 0). You could also start a counter at the top and work your way down. That would be i-- rather than i++, which is totally acceptable though less commonly used. (Also, by the way, word processors really want to autocorrect “i minus minus” into “i em-dash.”)

3. If we are still in the loop (our counter has not reached 10), then we “do something” and iterate our counter. (This would be the code block within the For Loop.) The “do something” part of your flow, of course, depends on what you need to do. For our example we are making a flow that will create 10 accounts, so we have to assign a record variable for one of those accounts. To iterate the counter you use an Assignment element to add 1 to the variable i. (I know I said that Flow doesn't do i++, but that's exactly what this step is doing. It just doesn't read that way.)

4. Then we have to add the account record variable we just assigned to a collection variable (for insertion later). You can actually use a single Assignment step both to add the record variable to the collection and to increment the counter. But I wanted to show the entire idea diagrammed out.
Also: You cannot assign your record variable's fields and assign that record variable to the record collection variable in the same step. That will not work.

5. We loop back to the Decision element to see if we need to do this again.

6. If we have reached the limit, then we break out of our loop and use the Create Records element to actually put 10 new accounts into the database. Et voilà! Here are all the accounts that were in our collection variable, now available in the database.

Summing it all up, here’s the simplest representation of what we’re doing, though for it to work in Flow you need at least one additional Assignment element, as I noted above. Once you know how to do it, this is really not such a big deal! But if you're new to Flow or haven't needed to solve this particular challenge before, it takes a moment to puzzle it out.
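For anyone coming from code, the flow above is just a counter-controlled while loop. Here's a rough Python equivalent of steps 1 through 6, purely as a sketch (the dicts stand in for record variables, and returning the list stands in for the single Create Records call):

```python
def build_accounts(n=10):
    """Mimic the flow: a counter-controlled loop builds up a
    collection, and one bulk insert happens at the very end."""
    accounts = []  # the record collection variable
    i = 0          # step 1: the counter, starting at zero
    while i < n:   # step 2: the Decision element (keep going while i < 10)
        account = {"Name": f"Account {i + 1}"}  # step 3: assign the record variable
        accounts.append(account)                # step 4: add it to the collection
        i += 1     # step 3, continued: the "i++" Assignment element
    # steps 5-6: the while loop handles looping back and breaking out;
    # returning the full list stands in for one Create Records element.
    return accounts

records = build_accounts()
print(len(records))  # 10
```

Notice that Flow's bulkification lesson survives the translation: the insert happens once, after the loop, never once per iteration.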

  • Dynamic Gauge: The First “Dynamic” Feature I’m Using

Salesforce has come out with several features recently with “dynamic” in their names, including Dynamic Forms and Dynamic Actions. It’s clear the “dynamic feature” bandwagon is where a lot of future development effort is directed. But I’ll admit: I haven’t used them. I’ve followed the development of Dynamic Forms, but for the moment I’m still sticking with the “Related Record Hack.” And Dynamic Actions looks interesting, but I just haven’t taken the time to delve into it because I haven’t had a compelling need. Mostly, I’ve looked at those Lightning page advancements and thought, “They’re adding so many layers of [possible] complexity. How can admins ever find the time to sort through it all?”

But Dynamic Gauge Charts actually caught my eye because they solve a problem I’ve run up against in my work. (Classic WIIFM: “What’s In It For Me?”) The problem they solve is that often you want to put a component on your dashboard that compares a report result (such as a count of something) to a target number that isn’t itself derived from that report. The perfect example would be a chart of progress toward your fundraising goal. It’s simple to make a report of this year’s closed won opportunities, total up their Amount fields, and see how much you’ve raised. But if you want to compare that to your budgeted goal you have to put that goal into the dashboard gauge component manually. If the goal changes, you have to edit the dashboard to edit the component to change the breakpoints and target on the gauge. And on the first day of the fiscal year, when the report underlying the dashboard rolls itself over (Close Date = “This Year”), the dashboard is suddenly way off because it’s comparing the total raised this fiscal year with the target left over from last fiscal year!

Dynamic Gauge Charts allow you to make a gauge where you set the top end and breakpoints based on the value of a field on a particular record. Pretty cool!
Now you can have a record for “Current Fundraising Goal” and point the gauge at that so it stays current as long as that record is current. Set yourself a reminder to switch the number in the Current Fundraising Goal record on the first day of the fiscal year and all gauges on all dashboards are suddenly looking at the right new goal.

Which brings us to keeping those target records current. Some target records are inherently manual, like the “Current Fundraising Goal” example. That’s a number that the executive leadership picks sometime before the end of the fiscal year and it’s not going to shift based on other records in Salesforce. It could be updated if circumstances change, but that’s a decision made by people.

But I frequently get clients that want an organizational metric along the lines of 75% of all students will maintain a GPA of 3.5 or better, 100% of alumni will have full-time employment, or 90% of 8th graders will apply to competitive high schools. Those percentages are derived from an equation. In the first example, the equation is:

    Count of students with GPA > 3.5
    --------------------------------
    Count of Enrolled Students

The number of students, alumni, or 8th graders (the denominator in that equation) changes. It might change on any given day (if kids are accepted or removed from the program). The numerator of the equation takes care of itself: You get it by running a report, and reports are always up-to-date the moment they’re run. But we have to update the denominator whenever kids come or go.

On a dashboard without dynamic gauge charts we would have to edit the gauge target and breakpoints every time a student leaves the program! (Or at least every time we are going to refresh the dashboard to show executives how we are doing against our organizational goals.) So much for having the dashboard at your fingertips to impress execs. And let’s not forget that the three examples I listed all have different denominators!
It’s entirely reasonable for a single program to have organizational targets like those–and probably a lot more. That could become a lot of target records to maintain. In fact, my client The Academy Group asked me to help them put together a dashboard with exactly that kind of organizational metrics. They work with kids from 4th grade through graduation from college, so they had metrics about the % of elementary school kids, % of middle school, high school, college, plus some specific to 12th graders, etc. It became clear very quickly that there were going to be a lot of different denominators required.

I knew that just creating each of those metrics was going to be a bit of a pain in the neck:

Step 1. Create a report counting students in the category. (8th graders with GPA > 3.5.)
Step 2. Put that report onto the dashboard as a gauge component.
Step 3. Run a different report to figure out the denominator. (How many 8th graders are there?)
Step 4. Edit the gauge target and breakpoints.

Rinse and repeat. Manually updating each of those metrics even once a year would be no fun. And I knew that sometimes students are removed from the program mid-year and that Academy Group’s leadership is going to want to see the percentages based on the right denominator as soon as those kids leave.

It occurred to me that this might be the time to try out those Dynamic Gauge Charts. And that’s when I realized that there aren’t any records in Academy Group’s Salesforce that could serve as the targets! We have a report of, for example, enrolled students by grade, but no record holding those counts. It’s not even something you could make a DLRS rollup for. (There’s no parent/child relationship.) You could make some kind of flow to get each of those counts, but you’d basically be making a custom flow for each target denominator. That’s not sustainable!

Introducing DynamicGaugeTargets

I figured it was time to sit down and build a thing.
I knew it could be a thing that was reusable for some of my other clients. And then I realized I could release it as an unmanaged package so other organizations could also use it.

So I give you DynamicGaugeTargets, a completely free, unmanaged package that gives you the ability to keep those denominators up to date automagically. The package includes a custom object (called DashboardTarget) for storing all of your dynamic gauge chart target numbers. You can have DashboardTargets that you manually update when needed. But if you want auto-updated targets, all you need is a little bit of [easily Googled] SOQL know-how and flows will keep you up to date.

Do Try This At Home

I would love to hear your feedback! Please try DynamicGaugeTargets in your sandbox or dev org. Try out my installation and post-install instructions and let me know what you think. If you find this useful, go ahead and install it in your production instance and impress your colleagues with your dynamic dynamic gauge components. 🤯 If you find an issue or a problem, the GitHub Issues tab would be a great place to log that for me. For questions or comments, please post on the Trailblazer Community or in Ohana Slack so that perhaps others can join the discussion. Or, of course, you can email me directly. I particularly want to hear your successes!
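Conceptually, each auto-updated target boils down to "run a count query, stash the result on the target record." Here's a schematic sketch of that idea in Python; the field names, the stored-query approach, and the stubbed query function are illustrative stand-ins, not the package's actual API:

```python
def refresh_target(target, run_count_query):
    """Conceptual auto-updating target: run the count query stored on
    the record and save the result, so gauges pointing at this record
    always see a fresh denominator."""
    target["Value"] = run_count_query(target["Query"])
    return target

# Stub standing in for a real SOQL count, e.g.
# SELECT COUNT() FROM Contact WHERE Grade__c = '8' AND Enrolled__c = true
fake_org = {"count enrolled 8th graders": 42}

target = {"Name": "Enrolled 8th Graders", "Query": "count enrolled 8th graders", "Value": 0}
refresh_target(target, fake_org.get)
print(target["Value"])  # 42
```

Run that refresh on a schedule (or when students come and go) and the dashboard's denominators take care of themselves, just like the numerators already do.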

  • Simple, Readable, Fun 🥳 - 💦 Sprinkle Emoji in your Salesforce ☁️

Salesforce is serious. 😒 It’s where we do our work. 📤 But I don’t think there is an actual requirement that work be boring. Why not take the opportunity to make the system we work in a little more fun? Add joy (😊), affirmations (👍), and celebrations (🎉) anywhere you can!

Adding emoji is nearly as easy as typing a character. On a Mac you bring up the emoji keyboard (⌨️) with Ctrl-Command-Space. On PCs it's Windows-Period. And emoji are text as far as computers are concerned (they're part of the Unicode standard), so they work just about anywhere you can use text. Maybe it’s just decoration, but sometimes you get those proverbial "thousand words" by using a picture. That can mean pages that are more functional: readers generally take in a visual cue more quickly than another run of plain text. Besides, many emoji bring color as well as shape, so they brighten up your screen instantly! 🌈

Let’s look at some of the great places you can use emoji:

🗣 In Chatter
(Of course.) We’re hardly breaking any new ground here, since it’s similar to putting them in your texts.

📇 Record names
Now we’re having some fun! Could you put a stack of bills (💵) into your opportunity naming convention? [OK, that might not be serious enough.] Are you an animal shelter with records for cats (🐈), dogs (🐕), and rabbits (🐇)? Put the type right into the record name and your users will instantly know something about Muffin!

📘 Description fields
Any free text field is fair game!

⎶ As picklist values
Setting a record’s progress to a Red (🔴) / Yellow (⚠️) / Green (✅) status field? Why not include the color in your picklist? Or perhaps you have radio buttons on a quick form–particularly useful on mobile. Instead of a Yes/No or a Good/Poor binary, why not 👍/👎? Instantly recognizable!

🎛 Dashboard or report names
I hadn’t really thought of this before I started this blog post. But I’m definitely going to start renaming some more dashboards.
No more “Organizational Goals Dashboards.” They’re all going to be “🎯 Goals Dashboard” from now on!

⍯ In formula fields
On related lists: I’ve written elsewhere about making a custom formula to combine fields for display on a related list. Emoji here can make your list pop, allowing users to instantly distinguish different types of records in the list.
Visual flags on records: We often want image badges on records, and even the NPSP docs from years ago recommended a way to use static resources and formula fields. But for several years now I’ve preferred to make my image flags with emoji. Instantly readable on a record page and truly a lifesaver when you’re looking at a large report!

🏳️‍🌈 In banners
I already posted about banners on Lightning record pages and you can see that I use emoji there. There are all sorts of possibilities when it comes to banners on your pages!

🖥 Flow screen instruction headers and sections
Screen flows are a versatile tool (though sometimes quite time-consuming to build!) for building a custom interface in various parts of Salesforce. Whether the flow is a survey or call script, a custom New button in a specialized area, or just a way to display dynamically generated information in one place, I love to dress those screens up with emoji. When you put instructions on the page, start them off with a nice emoji to draw the eye. Differentiate sections with other emoji.

🛑 Error messages and validation rules
Nothing says “Stop!” better than a ⚠️ or a ⛔️, does it? Dress up your validation rules with a visual warning. (Or soften the blow with a smile. 😉)

⚡️ Lightning pages (like Home page rich text elements)
Emoji can be welcoming additions to an instruction section or draw the eye to actions you can take on the page. I know that lots of people just zip right past the home page as soon as they log into Salesforce.
But if you put some effort into it you can make Lightning App Pages functional and save your users time by allowing them to work right from the instant they log in.

🔘 Action buttons
This is probably my favorite! Why settle for boring buttons like New and Edit? If you’re going to the trouble of making an action or a button, add some pizazz!

I’m hardly the first to think of using emoji in Salesforce. Marc Baizman wrote a blog post back in early 2018. But I still don’t think it’s as common as it should be. 🙁

One final note: Emoji and screen readers don’t always play nicely together. Generally the screen reader is going to read out the emoji's alt text, which you can see by hovering your mouse over one on the emoji keyboard. Keep your user base in mind and make sure the meaning of the icon is clear in context so that even when rendered as voice it doesn't get in the way. I've also seen advice to lead with text, putting the emoji at the end. (Good advice, though I didn't quite follow it to the letter for this post.) And don't forget to keep contrast in mind for the visually impaired.

Now get to work, friends, and make those Salesforce pages colorful and fun! 🎈🎊
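The formula-field flag idea above is worth a tiny illustration: because emoji are ordinary Unicode text, the whole trick is a CASE-style lookup from a status value to a character. A sketch in Python with made-up status values (in Salesforce this would be a CASE() or IF() formula field, not code):

```python
def status_flag(status):
    """Map a status picklist value to an emoji flag, the way a
    CASE()-style formula field would. Unknown values get no flag."""
    flags = {"On Track": "✅", "At Risk": "⚠️", "Off Track": "🔴"}
    return flags.get(status, "")

print(status_flag("At Risk"))
```

Drop the result into a related-list display formula or a report column and the flag travels with the record everywhere it appears.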
