Posts Tagged ‘CRM’

Service Objects integrations can improve your contact data quality, assist with data validation, and enhance your business operations.

Salesforce Data Quality Tools Integration Series – Part 2 – Validation Plug-ins in Flows

Welcome back to our blog series where we demonstrate the various ways you can achieve high data quality in Salesforce through the use of Service Objects’ validation tools.

In the first part of this series, we showed how to create a trigger and an Apex future class to handle calls to our web service API. That blog described a common way, in code, to call our services in Salesforce. But there are simpler ways to do this.

In this blog, we are going to step back and demonstrate an integration that requires little more than drag and drop to achieve your goals by implementing a plug-in on a flow. Because – why write code if you don’t have to? When we are done you will be able to create a flow and drop our plug-in anywhere in your process and wire it up.

We are going to start by looking at some basic setup and then step through the code. What code? You are probably wondering, "Why do we need to look at the code if we don't need to write any?" Well, if you ever want to implement additional business logic or tailor the plug-in, you will be equipped to do just that.

After we review the code, we will jump into creating a sample flow that allows a user to enter either a US or a Canadian address, processes that data through our US or Canada validation API, and then displays the results back on the screen.

Before we get started, we need to do some setup: registering the Service Objects endpoint with Salesforce so that we can make web service calls to the address validation APIs. We went over this in the previous blog, but it is worth repeating here.

The page where we add the URL is called “Remote Site Settings” and can be found in Home->Settings->Security->Remote Site Settings. This is the line I added for this blog.

Be aware that this will only work for trial keys. With a live/production key, you will want to add URLs for ws.serviceobjects.com and wsbackup.serviceobjects.com. As a best practice, you'll want to add both endpoints with your live key to take advantage of our failover capabilities. We named this one ServiceObjectsAV3 because we are really just reusing it from the previous blog, but you can name it whatever you want.

You do not need custom fields to get this example to work, but you will likely want to create the same fields (or similar ones) as those in the previous blog, seen here.

This section shows the structure of the plug-in class for US address validation, which we are naming AddressValidationUSA3, and the signatures of the invoke and describe methods. The invoke method contains the business logic and the call to our web service API. The describe method sets up the input and output variables of the plug-in that end users will be able to connect to on a flow.

The describe method also allows you to supply definitions and descriptions of the variables on the plug-in itself that will appear in the flow interface. The definitions used here are important because they can save a lot of time for the end user who is developing a flow, so resist the temptation to skip them. The following is just a snippet of the code.
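
As a rough sketch, assuming illustrative parameter names and descriptions rather than the exact production source, the skeleton of such a plug-in looks like this:

    global class AddressValidationUSA3 implements Process.Plugin {

        // Tells the flow designer which inputs and outputs the plug-in exposes.
        global Process.PluginDescribeResult describe() {
            Process.PluginDescribeResult result = new Process.PluginDescribeResult();
            result.name = 'Address Validation USA';
            result.tag = 'Service Objects';
            result.inputParameters = new List<Process.PluginDescribeResult.InputParameter>{
                new Process.PluginDescribeResult.InputParameter('Address',
                    'The street address to validate',
                    Process.PluginDescribeResult.ParameterType.STRING, true),
                new Process.PluginDescribeResult.InputParameter('City',
                    'Required if no zip code is provided',
                    Process.PluginDescribeResult.ParameterType.STRING, false),
                new Process.PluginDescribeResult.InputParameter('State',
                    'Required if no zip code is provided',
                    Process.PluginDescribeResult.ParameterType.STRING, false),
                new Process.PluginDescribeResult.InputParameter('Zip',
                    'Required if no city/state is provided',
                    Process.PluginDescribeResult.ParameterType.STRING, false)
            };
            result.outputParameters = new List<Process.PluginDescribeResult.OutputParameter>{
                new Process.PluginDescribeResult.OutputParameter('DPV',
                    'Delivery Point Validation score for the address',
                    Process.PluginDescribeResult.ParameterType.STRING)
                // ...more outputs for the corrected address, notes and error fields
            };
            return result;
        }

        // Gathers the inputs, calls the web service and maps the response to outputs.
        global Process.PluginResult invoke(Process.PluginRequest request) {
            String address = (String) request.inputParameters.get('Address');
            // ...input clean-up, the API call and response handling go here
            Map<String, Object> result = new Map<String, Object>();
            return new Process.PluginResult(result);
        }
    }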

There really isn't much else to the describe method; most of the business logic happens in the invoke method. In the invoke method, we gather the inputs to the plug-in and do some initial formatting to make sure we have valid characters in the call to our API. In gathering the inputs, we make sure to use the names of the inputs that we used in the describe method.

Since we will be making a path-parameter call to the API, we want to account for anything that could break the URL, like missing parameters. A missing parameter will break the call to our API, but on other APIs it could simply change the meaning of the call and end up returning unexpected results. To make sure there are no issues with missing parameters, we simply replace any that are missing with a space character. Just as in the previous blog, there are minimum field requirements before it even makes sense to call the operation. The operation we are using is GetBestMatches, and these are the requirements.

  • Combination 1
    • Address
    • City
    • State
  • Combination 2
    • Address
    • Zip code

If you do not have some combination of these inputs, then it is best to add code to avoid the call completely, since there is no way to validate an address otherwise. By "avoid the call," we mean avoid even hitting the plug-in at all, since it would not be necessary. A decision component in a flow can help with this.

In an effort to simplify the code, I pushed the logic for calling our API into a method called CallServiceObjectsAPI, which should make things easier to read. We pass the cleaned-up input parameters into this method.

Below, I show how to set up the HttpRequest/HttpResponse and the request URL. After that, I add some basic error checking. First, I check whether there was an error with the API call and/or the result from it. If other errors happen outside of that, we catch the general exception, which at that point I would assume is a connectivity issue. You can try to catch more specific exceptions on your own, but what I have here will work in a general fashion. In the instance of an error or exception on the call to our API, we also demonstrate our standard failover best practice by adding code to call another endpoint. Since this blog walks you through a trial key scenario, the failover here is not true failover, as it fails over to the same trial.serviceobjects.com endpoint; with a live key, you would be encouraged to use ws.serviceobjects.com for the first call and wsbackup.serviceobjects.com for the failover call.
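
Here is a minimal sketch of that helper, assuming illustrative helper names and an illustrative path-parameter URL format; consult the developer guide for the exact GetBestMatches endpoint:

    // Builds the request, calls the primary endpoint and fails over on error.
    private static String CallServiceObjectsAPI(String address, String city,
            String state, String zip, String licenseKey) {
        // Both endpoints are trial.serviceobjects.com here; with a live key use
        // ws.serviceobjects.com as primary and wsbackup.serviceobjects.com as backup.
        String primaryUrl = 'https://trial.serviceobjects.com/AV3/api.svc/GetBestMatchesJson/';
        String backupUrl  = 'https://trial.serviceobjects.com/AV3/api.svc/GetBestMatchesJson/';
        String path = clean(address) + '/' + clean(city) + '/' + clean(state) + '/'
            + clean(zip) + '/' + licenseKey;
        try {
            HttpResponse response = makeRequest(primaryUrl + path);
            // Fail over when the call failed outright or returned a service error.
            if (response.getStatusCode() != 200 || response.getBody().contains('"Error"')) {
                response = makeRequest(backupUrl + path);
            }
            return response.getBody();
        } catch (Exception e) {
            // Most likely a connectivity issue; try the backup endpoint once.
            return makeRequest(backupUrl + path).getBody();
        }
    }

    private static HttpResponse makeRequest(String url) {
        HttpRequest request = new HttpRequest();
        request.setEndpoint(url);
        request.setMethod('GET');
        request.setTimeout(30000);
        return new Http().send(request);
    }

    private static String clean(String value) {
        // Replace a missing parameter with a space so the path stays well formed.
        String v = String.isBlank(value) ? ' ' : value;
        return EncodingUtil.urlEncode(v, 'UTF-8').replace('+', '%20');
    }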

Another thing you may have noticed in the code above is that the input parameters to the API call are URL encoded, with "+" symbols switched to "%20". The reason for this is that certain characters are not allowed in a URL or a path-parameter call, and the built-in Apex function EncodingUtil.urlEncode cleans that kind of thing up. One side effect of the encoding is that it replaces spaces with "+" symbols. Though "+" symbols are becoming the norm in URLs, path-parameter calls still have issues with them, so the proper way to make the call work is to replace each "+" with "%20", which will be correctly deciphered as a space in the end. The method returns a string response from the web service; more precisely, based on the call made, it returns a JSON string, which we save in the variable ServiceObjectsResult.
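
In isolation, that clean-up step amounts to a single line:

    // urlEncode escapes characters that are illegal in a URL, but it turns
    // spaces into '+', which path-parameter calls mishandle; use '%20' instead.
    String cleaned = EncodingUtil.urlEncode('123 Main St', 'UTF-8').replace('+', '%20');
    System.debug(cleaned); // prints 123%20Main%20St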

The first thing we do after we get the response from the method is deserialize it into a Map object so we can start processing the result. Here is the rest of the code.

This section of the code checks the type of response that was returned: an address, an error, or a network error. Based on those variations, we populate the corresponding output values in a Map variable called "result". In this Map, we map the outputs from the service to the expected outputs described in the describe method; those values are the output values of the plug-in and are directly interfaced with in the flow. You can add code anywhere in the method we just went through to suit your own specialized business logic.
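
A sketch of that response handling follows; the 'Addresses' and 'Error' keys reflect my assumption about the JSON shape of a GetBestMatches response, so verify them against the developer guide:

    Map<String, Object> deserialized =
        (Map<String, Object>) JSON.deserializeUntyped(ServiceObjectsResult);
    Map<String, Object> result = new Map<String, Object>();

    if (deserialized.containsKey('Addresses')) {
        // An address response: map the service outputs to the output
        // parameter names declared in the describe method.
        List<Object> addresses = (List<Object>) deserialized.get('Addresses');
        Map<String, Object> best = (Map<String, Object>) addresses[0];
        result.put('DPV', best.get('DPV'));
        result.put('Address1', best.get('Address1'));
        // ...remaining address, notes and residential fields
    } else if (deserialized.containsKey('Error')) {
        // A service error response.
        Map<String, Object> error = (Map<String, Object>) deserialized.get('Error');
        result.put('ErrorType', error.get('Type'));
        result.put('ErrorDesc', error.get('Desc'));
    } else {
        // A network error or an unexpected response.
        result.put('ErrorDesc', 'Unexpected response from the service');
    }
    return new Process.PluginResult(result);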

Now that we have gone over the code, we are ready to jump in and show an example of our plug-in in a flow. For this example, I also created a Canadian address validation plug-in to make it a little more interesting; that said, any of the services we offer would make for an appropriate and powerful example.

As I mentioned at the outset of this blog, I will show you a demonstration of a flow where the end user is presented with a data entry screen. They will have options for adding either a US address or a Canadian address. From there, we will wire it up to either the US or the Canadian address validation plug-in and then finally display the results to the screen. This flow is more an example of how to wire up the plug-ins than of creating input and output screens; though doing something with screens is not out of the question, it is more realistic to have a flow that manipulates Contact, Account or custom objects without an interface.

Start by creating a new flow in the Process Automation section. I am just going to open the one I already had saved. The first step is dragging the Screen object onto the canvas; be sure to set it as the starting element using the green down arrow on the object. A flow cannot be saved without one part of it being designated as the starting point.

Inside this interface, you will set up the input fields that you want to retrieve from the user. In this example, we made the Address 1 field required, and at the bottom we added a radio button selection for the desired country, defaulted to USA.

Once we have the inputs from the user, we need some way to route the variables to either US address validation or Canadian address validation. For that, we can use the Decision Logic tool, configured to look at the country field and decide which way to continue processing.

The actual logic simply goes down the US address validation path if USA is found; otherwise, it assumes the input is a Canadian address.

Now we are ready to drop our US and Canada address validation plug-ins onto the canvas. On the left, in the Tools area, you can find the plug-ins under the respective names you gave them when you created them.

You will need to drag them onto the flow canvas one by one and set them up individually. Inside, you will map the inputs from the user's data entry to the inputs of the plug-ins, and map the outputs to variables you create or to object variables in the system. This can take some time depending on how many of the address validation outputs you want or need to use.

When you are done with that part, you will wire them up in the flow to the decision tool you added earlier as shown below.

In this last part, we will set up two output screens: one for the US address validation results and one for the Canadian address validation results. This time, instead of adding text boxes to the interface, we just add a display object for each field we want to show.

After wiring the last screens, the completed flow will look like this.

From here you can save the flow and then add layout options on other screens that give you access to executing the flow, schedule the workflow to run at a specific time (not useful in our example, though), or run it directly from this interface by clicking Run. We will demonstrate by clicking Run, starting with a US address and then viewing the results on the output screen. In this example, you can see there are several issues with this address: the street name is incomplete, it uses unit instead of suite, the city name is misspelled, and even the postal code is wrong.

Upon validation, we see that the system was able to correct the address and returned a DPV score of 1, meaning the result is a perfect address. The DPV score is one of the most important fields to pay attention to in the output, as it indicates the validity level of the address. You'll see other information in the response that will give you an idea of what was changed or whether there were any errors. You'll also have access to the fragments of the address so you can process the response at a more granular level. More details about the fields can be found here.

In the last example, we will use a Canadian address. In this case the only thing wrong with the address is the postal code, so we’ll see how the system handles that.

Sure enough, the address validated and the postal code was corrected. In the US address validation service, the DPV score and error result indicated the validity of an address; in the Canadian validation service, we really only need to look at the error fields. Empty or blank error fields mean that the address is good and can receive mail.

In conclusion, we learned how to create a plug-in and then use it in a flow. Though you do not need to know how the plug-ins were made to be able to use them, it is very helpful to know the details in case your business logic requires a more tailored solution, and you can see from this demonstration that adding additional code does not take much effort. These plug-ins allow you to get up and running very quickly without needing to know how to code. As I mentioned earlier, the flow created here is definitely a valid use case, but more often than not I would imagine Salesforce administrators creating flows that work on their existing objects such as Contact, Account or some other custom object.

A quick note on the license key: you will want to add your own license key to the code. You can get one free here for US address validation and here for Canadian address validation (each service requires a different license key).

The last thing I want to discuss about the license key is that it is good practice not to hard-code the key into the code. I would suggest creating a custom object in Salesforce with a key field, and then restricting the permissions on that field so it is not viewable by just anyone. This will help protect your key from unwanted usage or theft. At this point, we have the code for these two address validation plug-ins, but Service Objects will continue to build out more for our other services and operations. With that said, if there is one you would like to request, please let us know by filling out the form here and describing the plug-in you are looking for.

A New Role: The Marketing Technologist

Once upon a time, life was simple. There was marketing, and there was IT. The former did creative work to drive the product creation and sales process, and the latter kept the computers, software and networks running. In large organizations, the former had a Chief Marketing Officer and the latter had a Chief Information Officer. And if the two departments talked, it was usually about things like software licenses or password resets.

Fast forward to 2017. Marketing is now a heavily data-driven field, where success involves things like marketing automation platforms, CRMs, big data analytics, social media analysis, content personalization, and data governance. Technology and automation software now play key strategic roles in the marketing process. Which leads to a new buzz phrase that is now here to stay in the industry: marketing technology.

Content management firm Docurated defines marketing technology as “tools and platforms used by sales and marketing organizations to effectively complete their duties.” These marketing/sales tools and platforms are becoming increasingly complicated to deploy and administer while new ones are being introduced at an exponential rate. To manage these technologies, many organizations now have a formal leadership role, embedded within the marketing organization, to oversee its use of technology: the marketing technologist. According to marketing blogger Scott Brinker, over 60% of firms have now restructured their marketing and/or IT departments to better leverage marketing technology, or plan to do so over the next 12 months.

According to McKinsey, marketing technologists are much more than IT people who have been moved to a new office: “They’re passionate about re-imagining what marketing can do in a digital world. They help nontechnical marketers craft better campaigns, programs, and customer experiences that effectively leverage software and data … They’re hybrids, who speak both marketing and IT, and naturally see the connections between them.” Whatever their formal title, they are part of a closer integration between IT and marketing, often reaching all the way up to the C-level suite.

It is important to know that the marketing technologist has emerged because of larger trends in the software industry. People didn’t just wake up one morning and decide to create this role – it evolved in response to the growth of inexpensive, scalable, cloud-based tools such as Salesforce and Marketo, as well as other trends leveraging big data and social media. In less than a decade, the automated marketing environment has gone from the province of expensive enterprise solutions to becoming a competitive necessity used by almost everyone.

One of the key roles of a marketing technologist is managing the data quality of an organization’s information assets – and where possible, creating automated processes to ensure this data quality. This dovetails with a broader portfolio of responsibilities integrating component technologies that form the basis for an organization’s marketing automation strategy.

At Service Objects, we help marketing technologists automate their data quality and leverage the maximum value from their data assets. We do this with tools ranging from our flagship address validation capabilities, which clean and validate your contact data against continually updated USPS, Canada Post and international address databases, all the way to lead validation and lead enhancement tools that make sure all your contacts work hard for you. And we make it easy to automate these capabilities, using either our enterprise-grade APIs or custom-built cloud connectors for the most popular CRMs and marketing automation platforms.

Are you a newly-minted marketing technologist? Talk with us and see how we can help build your success!

What Role Does Data Quality Play in the General Data Protection Regulation (GDPR)?

With the arrival of 2018, many businesses are realizing the deadline to comply with the General Data Protection Regulation (GDPR) is looming in the not-too-distant future. In fact, according to Forrester Research, 80% of firms affected by the GDPR will not be able to comply with the regulation by the May 2018 deadline. GDPR has many different components that a business will need to address to achieve compliance, one of which is making sure contact data is kept accurate and up-to-date.

Understanding the important role that data quality plays in complying with GDPR, Service Objects will host a webinar to provide information to those looking to get started. This live, informational webinar on the topic of “The Role of Data Quality in the General Data Protection Regulation (GDPR),” will take place on January 24, 2018 at 9:00am.

The interactive event will feature data quality expert Tom Redman, President of Data Quality Solutions, who will provide an overview of the regulation recently enacted by the EU, and why US businesses need to pay attention. He will also discuss how engaging in data quality best practices not only makes it easier to achieve compliance, but also makes good business sense.

According to Geoff Grow, Founder and CEO of Service Objects, businesses can incur fines of up to 20 million euros or 4% of a company's annual revenue for not complying with the GDPR. The recently enacted EU regulation requires that every effort be taken to keep contact data accurate and up-to-date. Service Objects' data quality solutions can help businesses meet this requirement, as well as identify which contacts reside in the EU and therefore fall under the GDPR.

To register for this complimentary webinar, visit https://www.serviceobjects.com/gdpr-webinar.  All those who register will also receive a link to the recording after the event.


Salesforce Data Quality Tools Integration Series – Part 1 – Apex Insert Trigger

With Salesforce being the dominant platform in the Customer Relationship Management (CRM) universe, we are excited to demonstrate how our tools can improve the quality of your contact data in this robust system.  Salesforce is highly customizable, and one of the great things about it is that you don't need to be a developer to create a rich user experience on the system, although being a developer will help with understanding some of the coding concepts involved in some of the components.  With all the data in your organization, don't you want it to be clean, accurate, and as monetizable as possible? The Service Objects suite of validation API tools is here to help you achieve those goals. Our tools make your data better so you can gain deeper insights, make better decisions and achieve better results.

This blog will demonstrate various ways that a Salesforce administrator or developer can boost the quality of their data.  We will look into flows, triggers, classes, Visualforce components, Lightning components, and more as we demonstrate the ease of integration and the power of our tools in this series.

To get started, we will focus on demonstrating an Apex trigger with a future class.  I will show you the code, part by part, and discuss how it all fits together and why.  You will also learn how to create an Apex trigger, if you didn’t already know.

Apex triggers allow you to react to changes on a Salesforce object.  For our purposes, those objects will typically be objects that have some kind of contact information like a name, address, email or phone, IP, etc.  Additionally, the objects can contain various combinations of these as well.  In this example, we will use the Contact Salesforce object and specifically the mailing address fields, though the Account object would have made just as much sense to demo.  Service Objects has services for validating United States, Canadian and international addresses.  In this example, I am going to use our US address validation API.

We are going to design this trigger to work with both a single Contact insert and bulk Contact inserts.  This trigger will have two parts: the Apex trigger and an Apex future class.  We will use the trigger to gather all of the Contact Ids being inserted and the future class will process them with a web service call to our address validation API.

It is important to note a couple of things about the future class method and why it is used.  First, using a future call tells the platform to wait and process the method at the next convenient time for the platform. Remember that Salesforce is a multi-tenanted platform, meaning organizations on the platform share processing resources, among other things.  With that in mind, the platform tries to govern processing so that everyone gets the same experience and no one organization can monopolize system resources. Typically, future calls get initiated very quickly, but there are no guarantees on timing; you can be sure, though, that the future method will be processed soon after it is called. Second, callouts to a web service cannot be executed within a trigger, so the future method acts as a proxy for the functionality.  There are plenty more details and ways that a web service can be called, and you can dive deep into the topic in the Salesforce documentation. If for no other reason, this forces you to separate the call to the web service from your trigger, in turn exposing your future call to other code you may want to write.

Once we are finished with the trigger and the future class, we will test out the functionality, and then you will have some working code ready to deploy from your sandbox or developer edition to your live org.  But wait, don't forget to write those unit tests and get over 75% coverage…shoot for 100%.  If you don't know what I am talking about with unit tests, I suggest you review the documentation on Salesforce.

The results of the future method will update mailing address fields and custom fields on the Contact object.  For this example, here are the fields you will need to create and their respective data types.  We will not need these until we get to the future method but it is a good idea to get them created and out of the way first.

  Field name                   | Internal Salesforce name   | Type                 | Service Objects field name
  dotsAddrVal_DPVScore         | AddrValDPVScore__c         | Number(1, 0)         | DPV
  dotsAddrVal_DPVDescription   | AddrValDPVDescription__c   | Text(110)            | DPVDesc
  dotsAddrVal_DPVNotes         | AddrValDPVNotes__c         | Long Text Area(3000) | DPVNotesDesc
  dotsAddrVal_IsResidential    | AddrValIsResidential__c    | Checkbox             | IsResidential
  dotsAddrVal_ErrorDescription | AddrValErrorDescription__c | Long Text Area(3000) | Desc
  dotsAddrVal_ErrorType        | AddrValErrorType__c        | Text(35)             | Type

The last thing we will need to setup before we get started is registering the Service Objects’ endpoint with Salesforce so that we can make the web service calls to the address validation API.  The page to add the URL to is called “Remote Site Settings” and can be found in Home->Settings->Security->Remote Site Settings. This is the line I added for this blog.


Be aware that this will only work for trial keys.  With a live/production key you will want to add one for ws.serviceobjects.com and wsbackup.serviceobjects.com.  You’ll want both endpoints with a live key and we’ll explain more about that later.  We named this one ServiceObjectsAV3 but you can name it whatever you want.

Let's get started with the trigger code.  The first thing needed is to set up the standard signature of the call.

The trigger will act on the Contact object and execute after an insert.  Next, we will loop through the Contact records that were inserted, pulling out all the associated Contact Ids.  Here you can add logic to filter out contacts or implement other business logic before adding the Contact Id to the list of contacts to update.

Once we have gathered all the Ids, we will send them to the future method, which is expecting a list of Ids.

As you can see, this will work on one-off inserts or bulk inserts.  Since there is not much code to this trigger, I’ll show you the entire sample code for it here.
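
As a sketch (the trigger name matches the downloadable TestTrigger.trigger sample; the future method name is illustrative), it looks like this:

    trigger TestTrigger on Contact (after insert) {
        // Collect the Id of every Contact in this insert, single or bulk.
        List<Id> contactIds = new List<Id>();
        for (Contact contact : Trigger.new) {
            // Filtering or other business logic can go here.
            contactIds.add(contact.Id);
        }
        // Hand the Ids to the future method, which makes the web service call.
        TestUtil.validateAddresses(contactIds);
    }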

So that was painless.  Let's move on to the future class and see how easy it is to make the call to the web service as well.

Future methods need to be static and return void, since they do not return any values.  They should also be decorated with the @future annotation and callout=true.

It will be more efficient to update the newly inserted records all at once instead of one at a time, and with that in mind, we will store the results from our address validation web service in a new list of Contacts.

Based on the results from the service, we will update the mailing address on the Contact record and/or the DPV note descriptions or errors, as well as the Is Residential flag.  Some of these fields are standard on the Contact object and some are the custom ones we created at the beginning of this project.  Here is a sample that initiates the loop through the Contact Ids we passed into this method from the trigger, followed by the SOQL call to retrieve the details.
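
In outline, assuming the class name from the downloadable TestUtil.cls sample and an illustrative method name, that looks like this:

    public class TestUtil {
        @future(callout=true)
        public static void validateAddresses(List<Id> contactIds) {
            // Collect the updated records so they can be committed in one DML call.
            List<Contact> contactsToUpdate = new List<Contact>();
            for (Contact contact : [SELECT Id, MailingStreet, MailingCity,
                                           MailingState, MailingPostalCode
                                    FROM Contact WHERE Id IN :contactIds]) {
                // ...call the API and process the response for each record
            }
        }
    }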

In case you are wondering why we didn't just create a list of Contacts and send those in from the trigger instead of building up the list of Contact Ids, the reason is that there is a limitation on @future calls: you can only pass in primitive types (or collections of primitives) such as Ids, strings, integers and so on.  So we went with a list of Ids, where in Salesforce, Id is its own primitive type.

Demonstrated in the code, shown in the next screen shot, are our best practices for service failover to help ensure 100% uptime when making calls to the API.  Note that with a live production key for our API, the URL for the first trial.serviceobjects.com call would need to be ws.serviceobjects.com, and the second one, the one inside the "if" statement, would need to be wsbackup.serviceobjects.com.

I left both of these as trial.serviceobjects.com because most people starting out will be testing their integration with a free trial key.  In the screen shot you will see that I have added the API license key to the call, "ws-72-XXXX-XXXX".  This is fictitious; you will need to replace it with your proper key based on the live/production or trial endpoint your key is associated with.  A best practice suggestion for the key is to "hide" it in a custom variable or custom configuration instead of exposing it here in this method.
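
The failover pattern itself boils down to something like the following sketch (URLs abbreviated; with a live key the primary becomes ws.serviceobjects.com and the backup wsbackup.serviceobjects.com):

    String licenseKey = 'ws-72-XXXX-XXXX'; // fictitious; substitute your own key
    String primaryUrl = 'https://trial.serviceobjects.com/...'; // ws.serviceobjects.com with a live key
    String backupUrl  = 'https://trial.serviceobjects.com/...'; // wsbackup.serviceobjects.com with a live key

    Http http = new Http();
    HttpRequest request = new HttpRequest();
    request.setMethod('GET');
    request.setEndpoint(primaryUrl);
    HttpResponse response;
    try {
        response = http.send(request);
    } catch (Exception e) {
        response = null; // connectivity problem; fall through to the backup
    }
    if (response == null || response.getStatusCode() != 200) {
        request.setEndpoint(backupUrl);
        response = http.send(request);
    }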

Once we get a response back from the call to the API and everything is okay, we set up some variables and start parsing the response.  There are several ways to parse JSON, and certainly better ways than the one described here, but this is not an exercise in parsing JSON; it is an example of how to integrate.  In this example, we loop through the JSON looking for the field names that we are interested in.  We are looking for:

  • Address1
  • City
  • State
  • Zip
  • DPV
  • DPVDesc
  • DPVNotesDesc
  • IsResidential
  • Type
  • Desc

But the service returns many other valuable fields, which you can learn about in our comprehensive developer guide, found here along with other helpful information.  Remember, if you do end up using more fields from the service and you want to display them or have them saved in your records, you will need to create corresponding custom fields for them.  The next screen shot is just a part of the code that pulls the data from the response.
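
A simplified version of that parsing loop, using Apex's streaming JSONParser and assuming responseBody holds the JSON string returned by the service, might look like this:

    JSONParser parser = JSON.createParser(responseBody);
    String address1, city, state, zip, dpv;
    while (parser.nextToken() != null) {
        if (parser.getCurrentToken() == JSONToken.FIELD_NAME) {
            String fieldName = parser.getText();
            parser.nextToken(); // advance from the field name to its value
            if (fieldName == 'Address1') { address1 = parser.getText(); }
            else if (fieldName == 'City') { city = parser.getText(); }
            else if (fieldName == 'State') { state = parser.getText(); }
            else if (fieldName == 'Zip') { zip = parser.getText(); }
            else if (fieldName == 'DPV') { dpv = parser.getText(); }
            // ...DPVDesc, DPVNotesDesc, IsResidential, Type and Desc likewise
        }
    }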

In practice, you may want to use more criteria when deciding whether to update the original address, but in this example we are basing that decision on the DPV score alone.  You can find out more about the different DPV codes in the documentation.  When the DPV value is 1, the service has returned a valid mailing address.  Corrections to the address may have occurred, so it is best to update the address fields on the Contact record, and that is what we are doing here just before adding the updated Contact to our new list of Contacts.

Once we have looped through the entire list of Ids that we sent into this method, we are ready to do the update in Salesforce.  Before this point, nothing yet would have been saved.
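
Those last two steps, sketched in code:

    // Only overwrite the mailing address when DPV indicates a perfect address.
    if (dpv == '1') {
        contact.MailingStreet = address1;
        contact.MailingCity = city;
        contact.MailingState = state;
        contact.MailingPostalCode = zip;
    }
    contactsToUpdate.add(contact);

    // After the loop completes, commit everything in a single DML statement.
    update contactsToUpdate;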

And there you have it.  Go ahead and add some new contacts with addresses and test it out.  Over at the Contacts tab, I added a new contact and then refreshed the page to see the results.  I purposely made an error in the address so we can see more of the validation results.

The address we added is for our office and there are several errors in the street name, city and zip code.  Let’s see if our system gets it right.

The address validation API fixed the address and returned that the fixed address is the correct address to use based on the inputs.  Next, we will demonstrate a bad, non-salvageable address. You will see more than a few things wrong here.

There were so many problems that the address was not salvageable at all.

Let's try one more, but this time, instead of a completely bad address, we will add a bad (though not completely bad) address that is missing key parts.

The input address still does not validate, but this time we got back details that can help with fixing the problems.

From the results, we can see that the address is there, but perhaps a unit number or something else key to the address is missing for full delivery by the USPS.

In conclusion, there are a few things to take into consideration when integrating data quality tools into Salesforce. First, you need to do more error checking.  These were simple examples to show how to integrate our service, and the error checking was the bare minimum, not something I would expect to see in a production environment. So, please…please, more error checking. Second, don't forget to write the unit tests and try to get 100% coverage.  Salesforce only requires 75% to be able to deploy your code, but we highly recommend striving for 100%; it is definitely attainable.  Salesforce has this requirement for several reasons, one being that when Salesforce makes updates to their platform, they can run all the unit tests in all the organizations on the platform and ensure they are not going to break anyone's system. It is just good practice to do so.  There is tons of documentation on Salesforce that will help you down the right path when it comes to testing. Lastly, we didn't make any considerations in the code for the situation where a contact being inserted doesn't have an address to validate, or enough address components.  Clearly, you would want to add a check to the code to see if you have a valid combination of data that will allow for an address validation against our API.  You will want to see if any of these combinations exist; they represent the minimum required fields.

  • Combination 1
    • Address
    • City
    • State
  • Combination 2
    • Address
    • Zip code

You can find the sample code for this blog on our web site with the file names TestTrigger.trigger and TestUtil.cls at this link.

What Has Changed in Customer Service?

Every week, I'm asked, "What is changing in customer service?" The expected answer is that I'll talk about all the new ways customer service and support is conducted – and I do. There are self-service solutions that include robust frequently asked questions and video. There's social media customer service with multiple channels like Facebook and Twitter. And there's AI (Artificial Intelligence) that the experts – myself included – say will potentially change everything.

Yes, there is a lot that is changing about how we deliver customer service, so I'm about to make a bold statement. If you look at what customer service is, it is the same as it was fifty years ago. And it will be the same fifty years from now. Customer service is just a customer needing help, having a question answered or a problem resolved. And, in the end, the customer is happy. That's it. When it comes to the customer's expectations, they are the same. In other words:

Nothing has changed in customer service!

Okay, maybe it’s better said a different way. When it comes to the outcome of a customer service experience, the customer’s expectations haven’t changed. They just want to be taken care of.

That said, there are different ways to reach the outcome. What has changed is the way we go about delivering service. We've figured out how to do it faster – and even better. Back "in the day," which wasn't that long ago – maybe just twenty or so years ago – there were typically just two ways that customer service was provided: in person and over the phone. Then technology kicked in and we started making service and support better and more efficient.

For example, for those choosing to focus on the phone for support, there is now a solution that lets customers know how long they have to wait on hold. And sometimes customers are given the option of being called back at a time that is more convenient if they don't have time to wait. We now have many other channels through which our customers can connect with us. Beyond the phone, there is email, chat, social media channels and more.

So, as you are thinking about implementing a new customer service solution, adding AI to support your customers and agents, or deciding which tools you want to use, remember this:

The customer’s expectations haven’t changed. They just want to be taken care of, regardless of how you go about it. It starts with someone needing help, dealing with a problem, upset about something or just wanting to have a question answered. It ends with that person walking away knowing they made the right decision to do business with you. How you get from the beginning to the end is not nearly as important as how they feel when they walk away, hang up the phone or turn off their computer.

It’s really the same as it’s always been.

Reprinted from LinkedIn with permission from the author. View original post here.

Shep Hyken is a customer service expert, keynote speaker and New York Times bestselling business author. For information, go to www.hyken.com. For information on The Customer Focus™ customer service training programs, go to www.thecustomerfocus.com. Follow on Twitter: @Hyken

New CRM or ERP? Reduce Your Migration Risk

Birds and data have one thing in common: migration is one of the biggest dangers they face. In the case of our feathered friends, their annual migration subjects them to risks ranging from exhaustion to unfamiliar predators. In the case of your data, moving it to a new CRM or ERP system carries serious risks as well. But with the right steps, you can mitigate these risks, and preserve the asset value of your contact database as it moves to a new system.

In general, there are two key flavors of data migration, each with its own unique challenges:

The Big Bang Approach. This involves conducting data migration within a small, defined processing window during a period when employees are not actively using the system – for example, over a long weekend or holiday break.

This approach sounds appealing for many sites, because it is the quickest way to complete the data migration process. However, its biggest challenge involves data verification and sign-off. Businesses seldom conduct a dry run before going live with migration, resulting in the quality of migrated data often being compromised.

One particular issue is the interface between a new enterprise system and internal corporate systems. According to TechRepublic, enterprise software vendors still suffer from a lack of standardization across their APIs, with the result that every integration requires at least some custom configuration, leading to concerns about both data integrity and follow-on maintenance.

The Trickle Approach. In this approach, old and new data systems run in parallel using real-time processes, and data is migrated in phases. Its key advantage is that it requires zero downtime.

The biggest challenge with this approach revolves around what happens when data changes, and how to track and maintain these changes across two systems. When changes occur, they must be re-migrated between the two systems, particularly if both systems are in use. This means that it is imperative for the process to be overseen by an operator from start to finish, around the clock.

Beyond these two strategies, there is the question of metadata-driven migration versus content-driven migration – another major hurdle in the quest to migrate genuine, accurate, and up to date data. IT might be more focused on the location of the source and the characteristics of each column, whereas marketing depends upon the accuracy of the content within each field. According to Oracle, this often leads to content that does not match up with its description, and underscores the need for close inter-departmental coordination.

Above all, it is critical that a data validation and verification system be in place before moving forward with or signing-off on any data migration process. The common denominator here is that you must conduct data validation and verification BEFORE, DURING, and AFTER the migration process. This is where Service Objects comes into play.

Service Objects offers a complete suite of validation solutions that provide real-time data synchronization and verification, running behind the scenes and keeping your data genuine, accurate, and up to date. These tools range from address, phone and email verification to lead validation.

One particular capability that is useful for data migration is our Address Detective service, which uses fuzzy logic to fill in the gaps of missing address data in your contact records, validates the result against current USPS data, and returns a confidence score – perfect for cleaning contact records that may have been modified or lost field data during the migration process.

Taking steps to validate all data sources will save your company time and extra money. With Service Objects data validation services, we’ll help you avoid the costs associated with running manual verifications, retesting, and re-migration. And then, like the birds, it will be much easier for you and your data to fly through a major migration effort.

ERP: Data Quality and the Data-Driven Enterprise

Enterprise resource planning, or ERP for short, integrates the functions of an organization around a common database and applications suite. A brainchild of the late 20th century – the term was coined by the Gartner Group in the 1990s – the concept of ERP has grown to become ubiquitous for organizations of all sizes, in what is now an industry worth over US $25 billion annually.

ERP systems often encompass areas such as human resources, manufacturing, finance, supply chain management, marketing and customer relationships. Their integration not only automates many of the operations of these functions, but also provides a new level of strategic visibility into your business. In the ERP era, we can now explore questions like:

  • Where your most productive facilities are
  • How much it costs to manufacture a part
  • How best to optimize your delivery routes
  • The costs of your back office operations
  • And many more

Its functions often interface with customer relationship management or CRM (discussed in a previous blog post), which provides visibility on post-sale customer interactions. CRM is often integrated within ERP product suites, adding market intelligence to the business intelligence of ERP.

ERP data generally falls into one of three categories:

Organizational data, which describes the infrastructure of the organization, such as its divisions and facilities. For most firms, this data changes very slowly over time.

Master data, which encompasses entities associated with the organization such as customers, employees and suppliers. This data changes periodically with the normal flow of business.

Transactional data, based on sales and customer interactions. This data, which is the lifeblood of your revenue pipeline, is constantly changing.

Note that two out of three of these key areas involve contact information, which in turn can come into the system from a variety of sources – each of which is a potential source of error. Causes of these errors can range from incorrect data entry to intentional fraud, not to mention the natural process of changing addresses, phone numbers and email addresses. And this bad data can propagate throughout the system, causing consequences that can include wasted manpower, incorrect shipments, missed sales and marketing opportunities, and more.

According to one research paper, data quality issues are often a key driver for moving to ERP, and yet remain a concern following ERP implementation as well. This leads to a key concept for making ERP work for you: automated systems require automated solutions for data quality. Solutions such as Service Objects’ data verification tools ensure that good data comes into the system in the first place, leveraging constantly updated databases from sources such as the USPS and others. The end result is contact data quality that doesn’t depend on human efforts, in a chain that has many human touch points.

ERP is part of a much larger trend in business computing, towards centralized databases that streamline information flow, automate critical operations, and more importantly have strategic value for business intelligence. With the advent of inexpensive, cloud-based software, the use of these systems is spreading rapidly to businesses of all sizes. The result is a world that depends more than ever on good data quality – and the need to use tools that ensure this quality automatically.

What Is Data Onboarding – And Why Is It Important?

What is the best marketing database of all?

Statistically, it is your own customers. It has long been common wisdom that existing customers are much easier to sell to than new prospects – but what you may not know is how valuable this market is. According to the Online Marketing Institute, repeat customers represent over 40 percent of online revenue in the United States, while being much less price-sensitive and much less costly to market to. Moreover, they are often your strongest brand advocates.

So how do you tap into these customers in your online marketing? They didn’t share their eBay account or their Facebook page with you – just their contact information. But the science of data onboarding helps you turn your offline data into online data for marketing. And then you can do content channel or social media marketing to people who are not just like your customers, but are your customers.

According to Wikipedia, data onboarding is the process of transferring offline data to an online environment for marketing purposes. It generally involves taking this offline data, anonymizing it to protect individual privacy, and matching components of it to online data sources such as social media or content providers. Beyond monetizing customer information such as your CRM data, it has a host of other applications, including:

  • Look-alike marketing, where you grow your business by marketing to people who behave like your customers
  • Marketing channel assessment, where you determine whether your ads were seen and led to increased sales across multiple channels
  • Personalization, where you target your marketing content to specific customer attributes
  • Benchmarking against customer behavior, where you test the effectiveness of your marketing efforts against actual customer purchasing trends

This leads directly to the question of data quality. The promise of marketing technologies such as data onboarding pivots around having accurate, up-to-date and verified data. Bad data always has a cost in time and resources for your marketing efforts, but this problem is magnified with identity-based marketing: you lose control of who you are marketing to and risk delivering inappropriate or confusing brand messages. Worse, you lose the benefits of customizing your message to your target market.

This means that data validation tools that verify customer data, such as email addresses, help preserve and enhance the asset value of your offline databases. Moreover, you can predictively assess the value of marketing leads through cross-validating data such as name, street address, phone number, email address and IP address, getting a composite score that lets you identify promising or high-value customer data at the point-of-entry.

Online marketers have always had many options for targeting people, based on their demographics, activity, or many other criteria. Now your existing customer database is part of this mix as well. As your data becomes an increasingly valuable marketing asset, taking a proactive approach to data quality is a simple and cost-effective way to guard the value of this information to your marketing – and ultimately, your bottom line.

The Difference Between Webhooks and APIs

The word “webhook” sounds really cool, and it is, but what exactly is it? What does a webhook do and how is it different from an API? What do webhooks have to do with Service Objects’ APIs?

What is an API?

In order to better understand webhooks, let's first define API, or "application programming interface." APIs are pieces of code developed to execute some sort of logic. They are building blocks used to create other software, which can include creating other APIs. They serve as a common interface between two separate applications, allowing them to interact by sending and receiving data.

APIs can be used internally to support other processes within a company or they can be provided to the public as a paid or unpaid service. Google Maps API, for example, makes it possible to embed a Google map on a webpage. By using the Google Maps API, a web developer has a simple way to include an official Google map pinpointing their business’s exact location. There’s no need for any special coding or reinventing of the proverbial wheel.

Another example would be Service Objects’ address validation API, which makes it possible for our customers to input an address and receive standardized and corrected address information from our USPS CASS Certified database engine.

What is a Webhook?

Webhooks, for the most part, use APIs. Applications that have a mechanism for webhooks will often use a webhook when an event requiring custom logic has been triggered. Events are typically triggered by a workflow or some type of data entry, but other event types exist.

Webhooks give external developers opportunities to implement custom logic that either executes and returns results to the given application, executes logic for processes or purposes outside of it, or both.

Using Marketo marketing automation software and Service Objects' data validation APIs as an example: when an address is saved or added to a contact in Marketo, a webhook can be used to automatically validate the address using one of our address validation APIs, such as our DOTS Address Validation – US 3 API. When the data is available, it is sent to an API call; there is no polling by an external application to check back periodically for data to validate.

So, from the standpoint of providing a webhook to a third party such as Service Objects, the intent is to enable that third party to push data back into your Marketo instance or trigger other operations outside of it. In our example, Marketo can send address data to Service Objects and trigger the address validation API thanks to the webhooks that provide this ability. Learn more about Service Objects for Marketo here.

In contrast, from the standpoint of providing an API without webhooks, the intent is to enable others to trigger responses from your API to use strictly in their own applications.

There is no purpose to a webhook without an API, but the reverse is not true.

What Does All of This Mean to You?

It means that data validation is easy! Sign up for a free trial and find out just how easy and effective our webhook-powered APIs are.

The Importance of Data Quality for CRM

What are your customers telling you about your business?

This question has always been the key argument for customer relationship management, or CRM for short. Capturing data about your customers can tell you how many people eat steak at your restaurant on Thursdays, or who buys polo shirts at your clothing store. It provides visibility about who is calling with customer issues, so you can improve your products and service delivery. And in an increasingly interconnected world, related tools such as identity graphs can now track customer behaviors across different vendors and channels – for example, who bought a product, with what credit card, after seeing it on a specific social media platform.

Perhaps most importantly, CRM does what its name implies: it helps you understand and manage the relationship between you and your customers. Good CRM also benefits the customer as well as your business. Done correctly, it represents a single view of the customer across all departments in the organization, to build a cohesive experience for these customers. Knowing who your customers are is strategic and personal at the same time, and its impact ranges from remembering their birthdays to driving customer growth and retention.

So how important is CRM data quality? Bad data isn't just an annoyance – it is a real, make-or-break cost for many companies. According to a Gartner survey, users of one major CRM system disclosed that poor data quality cost their companies over US $8 million per year on average, with some respondents citing costs of over $100 million annually! These costs range everywhere from time and money spent catching and managing these mistakes, all the way to losing customers through poor service or missed opportunities. For companies of all sizes, the amount of inaccurate or outdated CRM data ranges from 10% to 40% of their data per year.

Where does bad CRM data come from? A number of sources. For example:

  • Fraudulently entered data: for example, customers who enter “Donald Duck” or key in a phony phone number to get a customer perk or avoid customer registration
  • Errors at the data entry level
  • Duplicate information
  • The natural moves and changes that take place in business every year

Whatever the sources of bad data, simply waiting and letting it accumulate can quickly degrade the value of your CRM database, along with concomitant costs in human intervention as a result of invalid or incorrect customer records. And without a database management plan in place, and specific stakeholders taking ownership of it, the economic value of this data will continue to degrade over time.

While ensuring data accuracy is important, it is also one of the least favorite tasks for busy people – particularly for information such as CRM data. This is where companies like Service Objects come in: our focus is on automated validation and verification tools that can run through an integrated API, a batch process or a web-based lookup. These tools range from simple address and phone verification all the way to lead and order validation, a sophisticated multi-factor process that ranks a customer's contact information with a validity score from zero to 100. All of these tools validate and cross-verify a contact's name, location, phone, email address, and device against hundreds of authoritative data sources.

CRM data truly represents the voice of your customer, and it can serve as a valuable asset for strategic planning, sales growth, service quality, and everything in between. Using the right tools, you can painlessly make sure that this data asset maintains its economic value, now and in the future. In the process, you can leverage technology to get closer to your customers than ever.

Name Deduplication Techniques

Identifying Duplicate Records

The bane of any Database Administrator is duplicate records. They take up unnecessary space and generally do not provide any added value to contact records. An even more challenging task for Database Administrators is identifying and merging records which might be duplicates, and in particular, duplicate names.

There may be variants of a given name which are not easily identified in a query but are nevertheless linked. A common example is Joe Smith vs. Joseph Smith: both could refer to the same person, depending on how the user entered their name.

Name Variants, Finding the Common Name

A particularly useful feature of the Name Validation 2 service is the Related Names output field. This field provides a comma-separated list of first-name variants for a provided name. For example, using the given name Joe, the related names returned include Joel, Joeseph, Joey, Josef, Joseph, and José.

With this information, it becomes easier to identify names which are related but in a different form. There may be cases, however, where names cannot be identified as related but can be linked from similarity. Some examples include names that are misspelled or alternate names which are not related but similar. These names can still be identified through the Similar Names output fields of the Name Validation 2 service.

Similar Sounding Names

DOTS Name Validation 2 employs sophisticated similar-name matching algorithms, drawing from a database of international names with up to 1.4 million first names and 2.75 million last names. Similar first and last names are returned in a comma-separated list which can be compared against names that already exist in the database.

For example, a similar-name result for the given name Robert Smith would return the similar first names Rhobert, Róbert, Robertt, Roebert, Roibert, Rubert, and Robbert, and the similar last names Smyth, Smithe, Smiith, and Smiyth. Of the similar names that are found, names are returned in order of most common to least common.
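
To make the comparison concrete, here is a hypothetical sketch in Apex that takes the comma-separated Related Names output for "Joe" and looks for existing contact records whose first name matches any variant:

    // Hypothetical: relatedNames holds the Related Names output returned by
    // Name Validation 2 for the given name 'Joe'.
    String relatedNames = 'Joel,Joeseph,Joey,Josef,Joseph,José';
    List<String> variants = relatedNames.split(',');
    variants.add('Joe'); // include the original form itself

    // Candidate duplicates: same last name, first name among the variants.
    List<Contact> candidates = [SELECT Id, FirstName, LastName, Email
                                FROM Contact
                                WHERE LastName = 'Smith'
                                  AND FirstName IN :variants];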

Merge and Promote the Winning Record

Using these results, a query can potentially link similar or related names and identify records which are duplicates. Once duplicate records are identified, the question becomes: which should be promoted as the winning record? This decision can depend on business logic; perhaps the record that contains other vital contact points, such as an address or phone number, or the record with the earliest entry date is chosen as the winner. Once a winning record is chosen, a merge process combines contact fields from the identified duplicates to build one complete record.

Conclusion

Ridding your database of duplicate contact records can be an arduous task, but with the help of Name Validation 2, it doesn’t have to be. Leveraging the vast quantity of names that Name Validation 2 draws upon yields a top-quality solution for identifying duplicates through related and similar names.

For more information about the Name Validation 2 service, or to receive a free trial key, click here.

For developers, our Name Validation 2 documentation can be found here.

Impacts of Bad Data Lead to Negative Consequences

Marketing Automation and Your Contact Records: A Five Part Series That Every Marketer Can’t Miss (Part 2)

Data integrity is beyond a doubt “the” cornerstone of your marketing campaigns. Think carefully about this. ALL company decisions, both operational and strategic, are based on the information you collect from your data. And we’ve already acknowledged that bad data exists in your marketing automation platform. So just how deep do these impacts go? Let’s dive right in, looking closely at the top two offenders: Mistrust and Costs.

Mistrust:

“If you can’t trust the data, what else will you base your decisions on? If you can’t trust the data, how much effort will be wasted checking and rechecking certain results to figure out which is correct? And how much effort will be spent in investigations, root cause analysis and corrective actions required to fix data problems?” — Carol Newcomb, Consultant with SAS Data Management Consulting

It’s no joke. Mistrust happens both internally and externally. Remember in our last post we stressed the importance of establishing trust between your marketing and sales teams (internal)? Well, that same nurturing needs to happen with your vendors and customers (external). Especially since their data is tied to everything in your organization. I would venture to guess that your CRM is connected to accounting. What if, after all your hard work to gain your customer, their invoice was sent to the wrong address? It’s going to get back to you and YOU are going to need to fix it. But here’s the bigger problem: when mistakes like this become repetitive, your reps stop believing that the data in their contact records is genuine, accurate, and up to date. They oftentimes feel compelled to update their own records. What’s worse, your customers and vendors will become annoyed, being asked over and over again, from different sources, for their contact info. Savvy marketers must realize that small missteps like this can lead to bigger issues, putting your company’s reputation and brand directly in harm’s way. Left unaccounted for, mistrust will weaken your company’s foundation in much the same way that a trickle of water slowly erodes bedrock, causing irreversible damage.

This leads us to Costs:

“Most organizations overestimate the quality of their data and underestimate the impact that errors and inconsistencies can have on their bottom line.” — The Data Warehouse Institute

Brace yourself…you ARE wasting money. That is, bad data is causing a loss of revenue. How many emails, direct mail pieces, and phone calls go out on any given day? Knowing that 25% of your customer data is inaccurate due to duplicate accounts, intentional and unintentional data entry errors, lost contacts, and aging contact records, there is a serious loss of productivity happening here. To demonstrate the cost of quality, let’s apply the 1-10-100 Quality Management rule by George Labovitz and Yu Sang Chang: it costs $1 to verify a record as it is entered, $10 to fix it later, and $100 if NOTHING is done, which leads to loss upon loss upon loss. And we know that corporate data is growing at an average rate of 60% per year and climbing, so it’s all the more important to screen your data going in and then maintain and manage it over time. The takeaway: quality IN equates to quality OUT, saving you time, resources, and money all the way around.
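To make the 1-10-100 rule concrete, here is a back-of-the-envelope calculation using the 25% inaccuracy figure above; the database size is an arbitrary illustration:

```python
# 1-10-100 rule: cost per record to verify at entry, fix later,
# or absorb the downstream consequences of doing nothing.
VERIFY, FIX, IGNORE = 1, 10, 100

records = 100_000          # illustrative database size
bad = int(records * 0.25)  # ~25% of customer data is inaccurate

print(f"Verify at entry: ${bad * VERIFY:,}")   # $25,000
print(f"Fix later:       ${bad * FIX:,}")      # $250,000
print(f"Do nothing:      ${bad * IGNORE:,}")   # $2,500,000
```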

If you’re like me and enjoy a detailed checklist, you’ll appreciate the following. Business2Community has compiled a spot-on list highlighting many of the consequences of bad data:

  • Lower customer satisfaction and retention
  • Loss of revenue
  • Misinformed OR under-informed decisions
  • Dissatisfied sales and distribution channels
  • Lower productivity
  • Higher consumption of resources
  • Invalid reports
  • Failure of your marketing automation initiatives
  • Higher maintenance costs
  • Errors in product/mail deliveries
  • Increased churn rate
  • Distorted campaign success metrics
  • Higher spam counts and unsubscribes

Hungry for more? Great! Let’s begin focusing on where to find the problems and inconsistencies so we can clean up the mess. In Part 3 of our series, we’ll get up close and personal with lead forms, how they can go wrong, and how to correct them.

If you are interested, Service Objects provides a free Data Quality Scan that will give you insight into the quality and accuracy of your contact data. Click here to get your free scan.

Up Next: Part 3: Data Breakdown Happens. Know the Reasons “Why” and Protect it from the Start

Make Marketing Automation Work for You

Marketing Automation and Your Contact Records: A Five Part Series That Every Marketer Can’t Miss (Part 1)

Are you keeping up with the competition? If yes, then you’re likely using one of the many excellent marketing automation platforms available to run your campaigns.

No doubt, the planning, coordinating, and executing of your marketing efforts has never been easier. But here’s the kicker: these platforms are only as good as the quality of your contact data.

There’s a common misconception that the quality of contact record data is automatically maintained within marketing automation platforms. In reality, these platforms are not built to correct contact records for accuracy or genuineness.

Without accurate, genuine, and up-to-date contact records, even the most sophisticated and expensive platforms cannot perform at their best. According to SiriusDecisions, over 25% of contact records contain inaccuracies or critical errors. In addition, 70% of your contact records will change or become obsolete within a year. Over time, and quite systematically, the integrity of your contact records will become suspect, translating to missed lead opportunities across multiple marketing channels. So how can you decipher whether the quality of your contact records is good or bad? This is something we will look at in this five-part series.

Reaping The Benefits

Switching gears, let’s imagine that your marketing automation is working at its full potential. You’re in a position to see some serious return on investment, along with reaching other important marketing and sales goals. We’ve listed three key benefits below to demonstrate why good data is at the heart of maximizing your marketing automation performance:

  • Cutting expenses—Some marketing platforms charge by the contact record. Simply correcting or eliminating bad records will result in cost savings. And this is just the tip of the iceberg. We will cover this in more detail as part of the series.
  • Increasing revenue—Handing off accurate contact records to your sales team means better contact rates and more sales, not to mention a happy sales team.
  • Accurately tracking and monitoring marketing campaigns—Good decision-making is based on good data. Accurate, measurable data is ESSENTIAL in order for these campaigns to be successful. Data needs to be precise, starting with your contact records.

Sounds so simple right? It absolutely can be, again, as long as your data is accurate, genuine, and up-to-date. And, we’ll repeat this point as many times as it takes: in doing so, you WILL save time and money!

Working Smarter, Not Harder

Think about it: managing contact records is labor-intensive, especially in a large company. Marketing and sales teams need to establish a good workflow to efficiently turn leads into profits. And as we’ve discussed, even the best resources might need a little support themselves in order to get top results. How frustrating is it to hear complaints from your sales team, time and time again, about bad contact information, incomplete profiles, or lead rejection? As the size, speed, and value of information increase, you might soon be spiraling out of control. The last thing you want is to contribute to the poor quality of customer data, which costs U.S. businesses a staggering $611 BILLION annually. Bottom line: there is no room for error. Teams must be able to trust each other and work efficiently together. Period.

So, consider this your wake-up call as we embark on a 5-part series into the realm of contact data integrity and the role it plays in your company. It’s imperative that marketing teams understand what “bad data” is and how it affects their bottom line. We’ll wrap up the series with some solutions about how you can take action and rid your marketing automation platform of bad contact data. Before signing off, here’s something to chew on until our next post:

It’s not a question of “IF” you have bad contact data, it’s a question of “HOW MUCH”.

If you are interested, Service Objects provides a free Data Quality Scan that will give you insight into the quality and accuracy of your contact data. Click here to get your free scan.

Up Next: Part 2: Impacts of Bad Data Lead to Negative Consequences

Increase Your Company’s Worth With Friendly APIs

Customer relationship management (CRM) software has become a powerful business tool. With a CRM tool such as Salesforce, Microsoft Dynamics CRM, Infusionsoft, or Oracle CRM On Demand, all customer data — including interactions and insights — is centralized for easy access and management.

As powerful as CRM applications may be, they are built for a single purpose: customer relationship management. Integrations, such as a data validation plug-in, can extend the functionality of CRMs, but only if APIs are created and, more importantly, well documented, so that third-party developers can actually understand them.

CRM Basics

CRMs provide a centralized location for storing customer data and interactions. Early CRM applications, such as ACT! and Telemagic, were basically databases that stored customer phone and address data along with notes about the customer, recent orders, and so forth. Typically shared over a network, these programs allowed other employees to review contact notes as needed. They were also commonly used to create mail merge documents.

Today’s CRMs are hosted in the cloud and loaded with robust contact relationship management, marketing automation, and social media features. These sophisticated applications are tightly focused on managing the customer’s journey. That’s what they’re designed for, and that’s what they do best. They are purposefully built to this end.

It’s a huge undertaking to create a piece of software to manage both the customer data (names, addresses, and contact information) and every single interaction across a multitude of channels. Small or large, CRM developers maintain a laser focus on their core product and its purpose. They’re concerned about making sure their software lives up to its core promise. They’re not necessarily concerned about extending their software to accommodate various users’ wish lists.

How User-Friendly APIs Ultimately Improve CRMs

While a CRM may have a plethora of tools built into it, the possibilities become endless when the CRM has an API that can be used by third party developers. For example, when Service Objects is able to integrate with a CRM’s API, we are able to create a data validation plug-in to clean up, standardize, and validate the data contained within the CRM.

This is valuable to everyone involved including:

  • The CRM developer — They don’t necessarily have the time or desire to add functions like data validation because their priorities are focused on the core product. With an API, valuable functions can be added without the developer having to expend resources on them.
  • The third party developer — Third party developers benefit by being exposed to the CRM developer’s customer base.
  • The end user — End users are happy to have external tools available through their company’s CRM platform where they can easily add the unique functions they want.

Creating an API for developers opens the door to new possibilities. A company like Service Objects can use the API to access the client’s data within the CRM, validate it, and then push it back in. With data validation plug-ins, the process is seamless for end users, the data quality improves, the business can operate more efficiently with less waste, and operating costs go down.
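Conceptually, such a plug-in is a read-validate-write loop. The sketch below shows the shape of that loop; both endpoints and all field names are placeholders, not real interfaces:

```python
import requests

# Placeholder endpoints -- a real integration would use the CRM's
# documented REST API and a Service Objects validation endpoint.
CRM_API = "https://crm.example.com/api/contacts"
VALIDATE_API = "https://validation.example.com/validate-address"

def clean_crm_contacts(api_token: str) -> None:
    headers = {"Authorization": f"Bearer {api_token}"}

    # 1. Pull contact records out of the CRM through its API.
    contacts = requests.get(CRM_API, headers=headers, timeout=10).json()

    for contact in contacts:
        # 2. Run each address through the validation service.
        result = requests.get(
            VALIDATE_API, params={"address": contact["address"]}, timeout=10
        ).json()

        # 3. Push corrected, standardized data back into the CRM.
        if result.get("is_valid") and result.get("standardized"):
            requests.patch(
                f"{CRM_API}/{contact['id']}",
                json={"address": result["standardized"]},
                headers=headers,
                timeout=10,
            )
```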

But there’s a catch: an API has to be available for a developer to use — complete with meaningful, current documentation. Integrations, such as a data validation plug-in for a CRM, are orders of magnitude easier if the API documentation is up to date and organized.

We implore API creators to work hard on making good APIs and supporting documentation. Doing so helps us all and, most importantly, it helps all of our clients.

Moving To A New CRM? Clean Your Data FIRST!

Are you moving to a new customer relationship management (CRM) solution? While you’ve likely included various IT costs and end-user training in your migration budget, have you considered cleaning your data before you move it into your new CRM? Cleaning your data before a CRM migration is a best practice, one that will solve data quality problems, reduce costs, and position your new CRM for a successful implementation.

The Pitfalls of Migrating Dirty Data

Bad data, whether it’s a wrong address, a misspelled name, a duplicate or redundant record, or flat-out fraud, is costly — and chances are, you have a lot of it. In fact, an analysis by DataBluePrint, The Cost of Bad Data by the Numbers, estimates that anywhere from 10 to 25 percent of data is inaccurate. This can seriously impact your bottom line, time and time again.

For example, let’s say you have a mailing list of 100,000 and that 20 percent of your addresses are bad. That’s 20,000 mailers that disappear into the void, wasting printing and material costs, postage, and manpower. Not only that, this waste recurs every time you run a direct mail campaign. Meanwhile, inaccurate customer data can result in lost or delayed orders, unhappy customers, and bad PR.

Why Fix the Data Problem During the Data Migration?

First, it’s much cheaper to fix data quality issues as part of the migration than to let them fester. Rather than continually sending mail to bad addresses, cleaning the data solves the problem (and many others) immediately. Data hygiene quickly pays for itself whenever you do it.

When migrating data, it makes sense to fix the data problem as PART of the migration project. This is the perfect opportunity to focus on data hygiene. Just as most people clean out their junk before moving to a new home, cleaning bad data before you move creates a fresh start.

Cleaning data during migration is much easier than doing it after your new CRM goes online. After all, your old system is familiar, making it much easier to find and resolve data quality issues. If you wait until the data is in your new CRM, you’ll probably have a much harder time because everything is new and different.

It’s generally easier to approve a data hygiene project as part of the CRM upgrade than as an after-the-fact add-on. When included as part of the upgrade, the data hygiene element is understood to be a contributing — and necessary — success factor. If you tack it on after the migration is complete, you’ll probably encounter pushback and repercussions for lacking the foresight to clean the data before the move.

Fixing Bad Data During Data Migration is Easier Than You May Think with the Right Tools

As part of the data migration, you will need to export your data from your legacy system and then import it into your new CRM. However, before you import it, you have the opportunity to clean it using an external data validation tool such as Service Objects’ data verification API. Service Objects’ CASS-certified data hygiene tools quickly identify and clean bad data by comparing records against a series of massive databases containing millions of accurate, standardized contact records.
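That intermediary cleaning pass can be a small script that sits between the export and the import. Here is a minimal sketch; the file names and the validate_address() helper are hypothetical stand-ins for a real call to a verification API:

```python
import csv

def validate_address(raw: str) -> str | None:
    """Stand-in for a call to an address verification API: it would
    return the standardized address, or None if undeliverable."""
    return raw.strip() or None  # placeholder; no real validation here

# Read the legacy export, keep only rows that validate, and write the
# cleaned file that will be imported into the new CRM.
with open("legacy_export.csv", newline="") as src, \
     open("clean_import.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        standardized = validate_address(row["address"])
        if standardized is not None:
            row["address"] = standardized  # import verified data only
            writer.writerow(row)
```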

This intermediary step is well worth doing as it instantly weeds out the bad data, leaving only accurate, verified data to import into your new CRM. If you’re planning a move, don’t forget to do a deep cleaning — of your data.

Source: DataBluePrint, The Cost of Bad Data by the Numbers

Why ‘Address Line 2’ Should Never Be Offered In Address Forms

You see address line 2 all the time. Your own web forms probably even have a field for it. However, did you know that address line 2 doesn’t really exist — at least in the U.S. Postal Service’s eyes? Not only does the USPS not require an address line 2, it doesn’t even acknowledge its existence.

USPS Addressing Standards

According to the USPS’s postal addressing standards, a complete address consists of just three lines:

Recipient Line
Delivery Address Line
Last Line

An example of a complete address using the three-line standard is:

John Doe
123 Main Street, Unit 21
New York City, NY 10001

Note that placing “Unit 21” on its own line, commonly referred to as “address line 2,” would result in a non-standardized address. While a human should be able to figure out that John Doe lives or works in unit 21, automated processing systems could have trouble.


Though address line 2 does not technically exist, the USPS does allow for additional information in a secondary address line (such as “deliver to dock 23”). However, that information should be treated as a comment area; it should not contain any deliverable address information. Our address validation software does scan address line 2 for this type of information, but there’s no guarantee the software will know what to do with it.

Suites and apartment numbers should be placed at the end of address line 1, while recipient details like name and company should go above the address.

What’s Wrong with Including an Address Line 2 Field on Your Online Forms?

Businesses commonly include an address line 2 field on their online forms, inviting end users to split address information as they see fit. When presented with two address lines, it’s only natural for users to separate floor, suite, and unit numbers into two separate lines. Some users will use address line 2 to add additional information such as “ATTN: John” or “Cross street: 2nd Avenue.”

In short, too much information can be mixed up in address line 2, making parsing out important information difficult and inconsistent. For example, if the recipient’s name is mixed into address line 2 along with an apartment number or letter, it may not be entirely clear to the address validation system what the intention of the address is since the name should have been the first line (above the address) and the apartment number should be placed in the address line itself. Situations like this can often be fixed with address validation software, but the likelihood of getting a perfect address match is reduced since there are so many ways address line 2 can be filled in.

Another issue with presenting an address line 2 for end users to complete is it invites them to mistakenly enter an alternative address line 1 (for example, their home and work addresses if both are in the same city). If both address lines 1 and 2 contain complete, proper addresses, the address validation system cannot determine the originally intended destination.

As the saying goes, garbage in, garbage out. The closer to USPS standards you can get initially, the more likely it is for an address to be cleanly validated, and the more likely it is for your mail to arrive at its proper destination. Even though our software is constantly updated and improved to handle and fix improperly structured addresses, it’s always best to strive for clean input data when possible.

Should You Eliminate Address Line 2 from Your Online Forms?

If you want to invite garbage in, by all means keep asking for an address line 2. If you’d rather cut the confusion and get cleaner data from the start, stop using address line 2. USPS doesn’t require it — and doesn’t necessarily know what to do with it.

Some end users don’t know that they need to enter apartment or suite numbers on the main address line 1. You can make address input more obvious by adding an optional field to the web form labeled “unit number,” then appending the unit number to address line 1. The end result: less confusion, more consistent address validation, and better deliveries.
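Handling that on the server side is a one-line join. A small sketch, with hypothetical form field names:

```python
def build_address_line(form: dict) -> str:
    """Combine a street address and an optional unit number into a
    single USPS-friendly delivery address line. The field names are
    hypothetical examples of web-form inputs."""
    line = form["street_address"].strip()
    unit = form.get("unit_number", "").strip()
    if unit:
        # Append the secondary designator to line 1, per USPS style.
        # ("Unit" is assumed here; a form could also offer Apt/Ste.)
        line = f"{line}, Unit {unit}"
    return line

print(build_address_line({"street_address": "123 Main Street",
                          "unit_number": "21"}))
# -> 123 Main Street, Unit 21
```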

24 Hours to Improving Your CRM

So you’ve invested in customer relationship management (CRM) software and implemented it across your business organization. Like any investment, a CRM deployment takes time to yield a positive return. What’s within your control, however, is the speed at which you reach that sweet ROI point. Here are four effective tips to instantly boost your CRM initiative.

1. CRM Education and Training for Employees

CRM software enables business organizations to systematize and synchronize customer data, manage targeted customer interactions, and give employees access to customer data through numerous platforms and devices. According to Gartner, “CRM will be at the heart of digital initiatives in the coming years.” In addition, “Hot areas for CRM investment include mobility, social media and technologies, Web analytics and e-commerce.”1

However, you won’t reap the benefits of a CRM deployment unless your employees–the very people who will be using the system to interact with your customers and persuade them to buy your product or sign up for your service offering–understand how your chosen CRM solution works. Employee training is crucial. A fast way to educate your employees is to first identify the tech-savvy ones and have them make up the pioneering batch of CRM trainees. These formally trained employees can then teach their colleagues.

2. “Social” CRM

Make sure to integrate your company’s CRM with the different social media platforms. By doing this, your employees can easily append a new customer’s email address and automatically uncover social media profiles related to that email address. If your sales reps, for example, can view what your customers frequently post on their social media accounts, they can better position their offers.

3. Lead Follow Up

Housing and maintaining customer data in your CRM costs your company a lot of money. So, ensure that your employees are reminded to follow up on solid leads and to nurture those showing promise. A CRM makes this easy because an employee has access to timely customer information, history, and preferences, as well as current price lists and other product data. Your employees also have quick and easy access to email templates that can be touched up and sent as needed. So there’s really no excuse not to follow up regularly, because a CRM makes touching base with leads an effortless task.

In addition, a CRM builds accountability among your employees. Their user logins are reflected on their every customer interaction, enabling them to see which ones still need a slight nudge toward a sale.

4. Standardized and Verified Customer Information

The success of your CRM initiative is largely dependent on the quality of your customer data. You cannot formulate good business decisions, let alone foster meaningful interactions with your customers, if you are working with dirty data replete with misspelled customer names and incorrectly entered addresses. Imagine having your employees send email messages containing salutations that refer to customers by the wrong gender. Also, think about the time and money wasted in sending messages to email addresses that were inadvertently mistyped.

Keep in mind that a CRM is a digital tool–albeit a high-performing one–and you have to wield it properly to get the most out of it. Your CRM is only as good as the customer information it contains and the employees leveraging such information. The latter you can address via appropriate CRM training sessions. As for contact verification, you have data quality services to help you. Because customer data is constantly changing (people move to new addresses, etc.), hiring someone to manually clean, verify, and update your customer information database may not be a cost-effective move. Contact verification can be automated via a suitable data validation API. Data quality services like Service Objects’ name, phone number, address, email address, and IP address validation solutions keep the information in your CRM always current and reliable.
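As a simple illustration of catching mistyped data at the moment of entry, the sketch below runs a cheap local syntax check on an email address before a contact is saved; a validation service would go further and verify that the mailbox actually exists. The function names and record fields are hypothetical:

```python
import re

def plausible_email(address: str) -> bool:
    """Cheap local syntax check before spending an API call.
    A validation service would additionally verify the mailbox."""
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", address) is not None

def on_contact_save(contact: dict) -> dict:
    # Flag suspect emails at entry time instead of letting them
    # sit in the CRM until a campaign bounces.
    contact["email_suspect"] = not plausible_email(contact.get("email", ""))
    return contact

print(on_contact_save({"name": "Jane Doe", "email": "jane@@example..com"}))
# -> {'name': 'Jane Doe', 'email': 'jane@@example..com', 'email_suspect': True}
```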

1 Gartner Press Release, Gartner Says CRM Will Be at the Heart of Digital Initiatives for Years to Come, published February 12, 2014, http://www.gartner.com/newsroom/id/2665215

Why Your CRM is Ineffective

Ever wonder why your CRM isn’t as effective as you had hoped it would be? Here’s the reason: Bad data! According to a recent article published on Health IT Analytics, the following takes place about every 30 minutes:  

  • 120 business addresses change 
  • 75 phone numbers change 
  • 30 new businesses are formed
  • 20 CEOs leave their jobs 

With data changing this quickly, it’s no wonder your CRM database is loaded with bad data. 

How Does Bad Data Affect Your CRM?

For starters, bad data is unnecessarily wasteful. Let’s say that you have a sales pipeline filled with hot B2B C-level leads along with a direct mail nurturing campaign. Some of those CEOs will leave their jobs while others will move to new addresses. If you don’t update your CRM database accordingly, your mailings will be a waste of paper, postage, and manpower. As time goes by, more CEOs will leave or move, rendering your hot lead database even less effective.

It’s not just marketing that suffers. Customer service can suffer too. For example, it’s not uncommon for customers to provide online retailers with incorrect or misspelled addresses. While the original error may have been the customer’s, who do you think they will blame when their package never arrives? 

Unnecessary waste, increased costs, and reduced customer satisfaction erode operational efficiencies and adversely impact the bottom line — and it all begins with bad data.

How to Improve Your CRM’s Effectiveness 

If bad data is responsible for your CRM’s ineffectiveness, the solution is simple: improve your data quality. But how? Few companies have the luxury of hiring a full-time staff member to confirm and correct contact data. Not only would that be costly, it would be a difficult-to-fill, tedious, and never-ending job. Fortunately, it’s a job perfectly suited to automation.

By integrating a data validation API from Service Objects into your point-of-sale or CRM software, it becomes possible to validate, standardize, correct, and append contact information in real time. If a contact moves to a new address but doesn’t notify your company, your database will not degrade: the data validation API corrects the record by comparing the existing contact information against a massive USPS database containing the latest address changes.
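One way to keep a database from degrading over time is a scheduled job that re-validates any record that hasn’t been checked recently. A sketch, with a stand-in for the actual API call and an arbitrary staleness threshold:

```python
from datetime import date, timedelta

def revalidate(address: str) -> str:
    """Stand-in for a data validation API call that would return the
    current, standardized address (e.g. after a USPS-recorded move)."""
    return address  # a real call would correct and standardize here

STALE_AFTER = timedelta(days=90)  # illustrative threshold

def refresh_stale_contacts(contacts: list[dict]) -> None:
    today = date.today()
    for contact in contacts:
        if today - contact["last_checked"] > STALE_AFTER:
            contact["address"] = revalidate(contact["address"])
            contact["last_checked"] = today

contacts = [{"name": "Jane Doe", "address": "123 Main St",
             "last_checked": date(2000, 1, 1)}]
refresh_stale_contacts(contacts)
print(contacts[0]["last_checked"])  # updated to today
```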

Service Objects offers several data validation APIs covering everything from address, phone number, and email address validation and geocoding to lead validation, order validation, BIN validation, IP address validation, and demographics.

Data changes at a rapid pace. Is your CRM able to keep up? Improve its effectiveness by combating bad data at the source.

Source: Health IT Analytics, Battling Bad Data to Grow Market Share with Big Data

New Contact Validation Design References for Microsoft CRM, Microsoft SQL & Oracle DB

Our development team has been working hard on some new design references in an effort to make our contact validation services easier to integrate. Many services can be strengthened with Service Objects products for address validation, email validation, NCOA service, phone number validation, and much more. Our design references are intended to make the integration process easier for web developers and to remove the guesswork about what additional downloads and preparations may be needed for success.

Our NEW additions to our design reference library are:

  • Microsoft CRM
  • Microsoft SQL
  • Oracle DB

Visit the above links to learn more and to download the Design References for your project.

You can use a FREE Trial Key for any of our products to test them out on your system.

Our goal is to help you prevent fraud and mistakes before, and even after, they enter your system.