Service Objects’ Blog

Thoughts on Data Quality and Contact Validation


Posts Tagged ‘Tech Support’

Follow This Checklist to Ensure a Smooth API Integration

There can be a lot of “i’s” to dot and “t’s” to cross when integrating with any API. Here at Service Objects, we recognize there can be a lot on the to-do list when starting an integration project. Integrating with our APIs is straightforward, but we have developed a quick checklist to make following our best practices as easy as possible.

Failover, Failover, Failover
Service Objects prides itself on 99.999% server uptime. However, in the unlikely event that we do experience an issue with one of our servers, implementing a failover configuration is arguably the most important aspect of integrating with any of our APIs. Proper failover configuration ensures that your application continues to operate unhindered in the event that the primary Service Objects web server is unavailable or not responding as expected. Below is an example (using C# syntax) of proper failover configuration.
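The sketch below illustrates the pattern using HttpClient against our DOTS Address Validation 3 – US endpoints; the query path and the JSON error check are placeholders, so consult the developer guide for the exact GetBestMatches request and response format.

using System;
using System.Net.Http;
using System.Threading.Tasks;

public class AddressValidationClient
{
    // Primary and backup hosts; only the host changes between the two calls.
    private const string PrimaryHost = "https://ws.serviceobjects.com";
    private const string BackupHost = "https://wsbackup.serviceobjects.com";
    private static readonly HttpClient Http = new HttpClient { Timeout = TimeSpan.FromSeconds(10) };

    // queryPath is the service/operation path plus the encoded inputs and license key
    // (placeholder; see the DOTS Address Validation 3 - US developer guide for the exact URL).
    public static async Task<string> GetBestMatchesAsync(string queryPath)
    {
        try
        {
            string body = await Http.GetStringAsync(PrimaryHost + queryPath);

            // An Error object with TypeCode 3 means a fatal service error,
            // so treat it the same as an unreachable primary and fail over.
            if (body.Contains("\"TypeCode\":\"3\""))
                throw new Exception("Fatal error returned by the primary endpoint.");

            return body;
        }
        catch
        {
            // Primary unreachable, timed out, or fatal: retry against the backup host.
            return await Http.GetStringAsync(BackupHost + queryPath);
        }
    }
}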

The example above is for our DOTS Address Validation 3 – US service, but the scenario is similar for our other services. The main thing to note is that the primary call points to ws.serviceobjects.com and the backup call within the catch statement points to wsbackup.serviceobjects.com. In the event that the primary web server is unresponsive, producing strange errors or behaving abnormally, the backup URL is called and your application continues to function as expected. Another important item to note is that proper failover checks the web service response for an error with a TypeCode of 3. This indicates that a fatal error has occurred in the web service and that the backup URL should be called. If you are using one of our older services, the Error object that service returns may be different (only “Number” and “Desc” fields will be present), and you will need to check for a Number value of 4 to indicate a fatal error.

URL Encoding

Properly encoding the URL you are using is also an item you will want to place on your integration to-do list. If you are using a path parameter to access our services, you’ll need to use RFC 3986 encoding for your URLs. If you are using query string parameters, you can use either RFC 3986 encoding or the older RFC 2396 encoding. What do these two RFC standards mean in practice? In short, if you are using a query string URL, spaces can acceptably be replaced with “+” in the URL. If you’re using a path parameter URL, spaces have to be encoded with their hex equivalent, %20. Using RFC 3986 encoding is generally the safer bet, since it is the newer and more widely accepted standard.
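In C#, for example, the two encodings can be produced as shown below: Uri.EscapeDataString follows RFC 3986 (spaces become %20), while the older HttpUtility.UrlEncode uses the plus-sign convention that is only acceptable in query strings.

using System;
using System.Web; // HttpUtility requires a reference to System.Web

class EncodingExample
{
    static void Main()
    {
        string address = "27 E Cota St Suite 500";

        // RFC 3986: safe for both path parameters and query strings.
        Console.WriteLine(Uri.EscapeDataString(address));  // 27%20E%20Cota%20St%20Suite%20500

        // Older query-string style: spaces become '+', so use it only in query strings.
        Console.WriteLine(HttpUtility.UrlEncode(address)); // 27+E+Cota+St+Suite+500
    }
}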

Logging (we’re not talking cutting down trees)

We also highly recommend implementing some code that logs the requests and responses that our services provide. For the sake of customer privacy, we do not log this information on our end. Logging can be a big help when troubleshooting your code, and it can also be a big help to us if you ever need technical support. Since we do not log customer requests, it is very helpful for us to have the exact inputs or URL used when contacting our services. This allows us to provide the stellar customer support that comes with integrating a Service Objects API. If you do run into any issues, please send us your request and response, as this will help us get to the bottom of whatever issue you are encountering as quickly as possible.
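A logging hook can be as simple as the sketch below, which appends each request URL and raw response to a text file; the file name and format are just illustrative, so adapt them (and mask any sensitive fields) to suit your application.

using System;
using System.IO;

public static class ServiceCallLog
{
    private const string LogPath = "service-objects-calls.log";

    public static void Write(string requestUrl, string responseBody)
    {
        // Capture the exact URL sent and the raw response received, with a timestamp.
        string entry = DateTime.UtcNow.ToString("o") + Environment.NewLine +
                       "REQUEST:  " + requestUrl + Environment.NewLine +
                       "RESPONSE: " + responseBody + Environment.NewLine;
        File.AppendAllText(LogPath, entry);
    }
}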

Null or Nothing Strings
Our output structure for most of our services is consistent and won’t change without notice. For the most part, the structure stays the same and we return the elements in our output structure as blank strings rather than null objects. That said, we still highly recommend that your application perform null checks before using the response or any of its nested elements. Even though the output structure for our services is very consistent, appropriately null checking our response can save you and your application a lot of headaches if something unexpected occurs.
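A sketch of such a check is shown below; the response, Addresses and Error members stand in for whatever your generated client types actually expose, so adjust the names to match your integration.

// Defensive checks before touching the response or its nested elements.
if (response == null)
{
    // Treat as a failed call: log it, then retry or fail over to the backup URL.
}
else if (response.Error != null)
{
    Console.WriteLine("Service error: " + (response.Error.Desc ?? "unknown"));
}
else if (response.Addresses != null && response.Addresses.Length > 0)
{
    // Fields come back as blank strings rather than nulls, but guard anyway.
    string city = response.Addresses[0].City ?? string.Empty;
}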

URLs, IP Addresses and Whitelisting
Some clients or prospects ask whether they can access our web services by IP address or whitelist our IP addresses in their firewall. You certainly can, but we highly recommend whitelisting and calling our web services by domain name. Most modern firewalls support whitelisting domains by name, and we highly recommend using this capability, because IP addresses can and will change. We also have a lot of backup and redundancy built into our web services that works seamlessly behind the scenes when you access the service via its domain. If you absolutely need to reach our service by IP address or whitelist by IP, please reach out to our support team and we will be happy to make recommendations on best practices and provide the information you will need.

Use Cases
If you want to confirm that you are interpreting the results correctly, or that you are using the right operation for the functionality you need, our developer guides provide more clarity about the inputs and outputs of your service of choice.

Summary
As discussed, integrating with any API can bring up a lot of questions. If this list didn’t cover your questions or particular use case, please feel free to send your requests or questions to support@serviceobjects.com and we will be happy to help you get up and running with any of our 24 data validation APIs.

Service Objects’ Average Response Time Ranks Higher Than Google…

When comparing SaaS providers, one of the key metrics often measured is service response time. That response time, or latency, is a succinct measurement of the approximate time it takes for a response to be returned for a given query. Often, the major challenge in reducing response time is determining which service component is adding latency. At Service Objects, we continually scrutinize application optimization and network congestion, and monitor real-time, real-world API calls to ensure our SaaS response times are second to none.

Our goal as a SaaS provider is to exceed the availability and response times found in our industry. We’ve invested in bank-grade infrastructure and security, with data centers operating throughout the US. All of our databases run on the latest flash storage technology, returning query responses in less than 0.1s, and we are constantly enhancing and expanding our web server application pools. Bundle all of that with robust VMware clusters, multiple layers of network redundancy, and one of the industry’s only financially backed service level agreements of 99.999% uptime, and the result is that we don’t just achieve industry-standard availability and response times, we’ve raised the bar.

Third-party monitoring providers have ranked many of our DOTS Web Services’ average response times within the same echelon as leading tech companies such as Apple and Google. In many cases, our response times are better than those of some of the biggest and best-known technology companies. Just how fast are we? If you are connecting from Los Angeles, our DOTS Address Validation service hosted in San Jose, CA boasts an incredible 0.089s response time. If your business is connecting from New York, we have you covered, with a lightning-fast average response time of 0.27s from our New Jersey data center.

Service Objects recognizes how important it is to our customers to have little to no downtime. We are so committed to achieving this goal that we made Outstanding Network Performance one of our Core Values. We are continually monitoring our servers and measuring our response times, and as the graphic below illustrates, the results speak for themselves.

 

What Has Changed in Customer Service?

Every week, I’m asked, “What is changing in customer service?” The expected answer is that I’ll talk about all the new ways customer service and support are conducted – and I do. There are self-service solutions that include robust frequently asked questions and video. There is social media customer service with multiple channels like Facebook and Twitter. And there is AI (Artificial Intelligence) that the experts – myself included – say will potentially change everything. Yes, there is a lot that is changing about how we deliver customer service, so I’m about to make a bold statement. If you look at what customer service is, it is the same as it was fifty years ago. And it will be the same fifty years from now. Customer service is just a customer needing help, having a question answered or a problem resolved. And, in the end, the customer is happy. That’s it. When it comes to the customer’s expectations, they are the same. In other words:

Nothing has changed in customer service!

Okay, maybe it’s better said a different way. When it comes to the outcome of a customer service experience, the customer’s expectations haven’t changed. They just want to be taken care of.

That said, there are different ways to reach that outcome. What has changed is the way we go about delivering service. We’ve figured out how to do it faster – and even better. Back “in the day,” which wasn’t that long ago – maybe just twenty or so years ago – there were typically just two ways that customer service was provided: in person and over the phone. Then technology kicked in and we started making service and support better and more efficient.

For example, for those choosing to focus on the phone for support, there is now a solution that lets customers know how long they will have to wait on hold. And sometimes customers are given the option of being called back at a more convenient time if they don’t have time to wait. We now have many other channels through which our customers can connect with us. Beyond the phone, there is email, chat, social media channels and more.

So, as you are thinking about implementing a new customer service solution, adding AI to support your customers and agents, or deciding which tools you want to use, remember this:

The customer’s expectations haven’t changed. They just want to be taken care of, regardless of how you go about it. It starts with someone needing help, dealing with a problem, upset about something or just wanting to have a question answered. It ends with that person walking away knowing they made the right decision to do business with you. How you get from the beginning to the end is not nearly as important as how they feel when they walk away, hang up the phone or turn off their computer.

It’s really the same as it’s always been.

Reprinted from LinkedIn with permission from the author. View original post here.

Shep Hyken is a customer service expert, keynote speaker and New York Times bestselling business author. For more information, go to www.hyken.com. For information on The Customer Focus™ customer service training programs, go to www.thecustomerfocus.com. Follow on Twitter: @Hyken

C# Integration Tutorial Using DOTS Email Validation

Watch this video and hear Service Objects’ Application Engineer, Dylan, present a 22-minute step-by-step tutorial on how to integrate an API using C#. In order to participate in this tutorial, you will need the following:

  1. A basic knowledge of C# and object-oriented programming.
  2. Visual Studio or some other IDE.
  3. Any DOTS Validation Product Key. You can get free trial keys at www.serviceobjects.com.

In this tutorial, we have selected the DOTS Email Validation web service.  This service performs real-time checks on email addresses to determine if they are genuine, accurate and up-to-date. The service performs over 50 tests on an email address to determine whether or not it can receive email.  If you are interested in a different service, you can still follow along in this tutorial with your service of choice. The process will be the same, but the outputs, inputs, and objects that we’ll be dealing with in the integration video will be slightly modified.

Enjoy.

Why Providing Feedback is Important to Improving Software

Developing user-friendly software and an amazing user experience requires listening to users. As software developers, we rely on user feedback to continuously improve our data validation APIs. As a user, you may not feel compelled to provide feedback to software developers, but if you value a great experience, your role is an essential one.

Examples of User Feedback Collection Mechanisms

You’ve likely encountered user feedback collection features in various forms. For example, Microsoft Office prompts you to click on a happy or sad face to share your impressions. This simple menu is available from the File tab, allowing you to tell Microsoft what you like, what you dislike, and what you would suggest.

You may also find that user feedback is naturally baked into installed software, accessible via the Help menu.

If you’re a .NET developer, you’re probably familiar with the Visual Studio Experience Improvement Program. The little helper icon in the top right of the program has become the ubiquitous symbol for feedback, client support, and general help desk tasks. With just a click of a button, users can instantly share their experiences with the software.

What about when an application crashes? You’ll often be prompted to send a crash/bug report to the developer. These reports may even contain hardware/software configurations, usage patterns, and diagnostic information helpful to developers — and all you need to do is click a button.

These are but a few of the many ways that modern applications send information back to the software company. Obtaining user feedback as well as any crash/bug report information is crucial to the development of a piece of software or service. This information helps software developers isolate where and why problems occurred, leading to product updates.

User Feedback Challenges: Privacy Concerns

But what about “Big Brother” or other potential snoops? With the various means of providing feedback and the different collection schemes (opt in / automatic), privacy concerns are valid. With these data collection tools baked into the software, it is hard to know how much information is actually being sent back to the company. It could range from the harmless crash/bug report from a software crash or diagnostic information to controversial GPS breadcrumb data.

However, many people don’t want other entities collecting data on them or analyzing their usage patterns. While not all software is intentionally spying on you, it would be nice to know what exactly is collected. More often than not, it’s unclear what’s collected and how it’s used. This lack of transparency concerning data collection inevitably leads to unease, which is why many users opt not to participate in “Experience Improvement Programs” and other data collection schemes.

 

Another challenge for developers is that not all companies have software installed on clients’ devices, making data collection challenging even when users are willing to opt in. For example, the normal avenues for collecting data such as hardware/software configurations from users are not seamlessly integrated with web-based technologies such as web services or certain SaaS products. Many companies struggle with this and must use other means of getting user feedback.

 

Despite privacy concerns and a lack of openness, the bottom line is that user feedback is valuable. When utilized properly, the information can be used to fix existing problems in the software as well as lead to new features. The reason why subsequent versions of software are so much better than 1.0 is directly related to user feedback.

How Service Objects Gathers User Feedback

Service Objects does not collect data on clients, so the privacy concerns discussed above do not apply. Potentially sensitive data processed through our services is not monitored or collected. This is a highly sought-after data validation “feature” for our clients, but at the same time, it presents a challenge for us in gathering detailed user feedback.

We offer several ways for our customers to provide feedback: you can connect with us by phone (805-963-1700), email, or support tickets.

Any user feedback we receive is taken very seriously and can lead to bug fixes, updates, and even new services and operations. A great example of this is the FindAddressLines operation in DOTS Address Validation 3. The operation was originally created to help a particular client and has since been used to great effect to clean up messy address data.

If you have any feedback you would like to share to help us improve our data validation services, we encourage you to reach out to us at any time.

Demonstrating JavaScript and NuGet with DOTS Address Validation 3

In this demonstration I am going to show you how to implement our DOTS Address Validation 3 web service with JavaScript and no ASP.NET markup tags, using Visual Studio. We are going to create a form for address inputs, similar to what you would see on a checkout page, which will call our API to validate the address. Although we could call the web service directly from JavaScript, we would then be forced to expose our license key. So we will use an aspx.cs page to handle our call to the service API, keeping our license key safe from prying eyes. In this demonstration, I will also show you how to add the NuGet version of DOTS Address Validation 3 to the project so that you are always using our best practices and speeding up your integration.

First, create a new empty web site, which I am going to call cSharpJavascriptAjax. Then I am going to add a new HTML page called Addressvalidation.html.

Next, we are going to add the markup for the form. Add the following code between the body tags.

This form was designed using a table structure for the layout. In practice you will likely want to create the layout using divs, which is more appropriate. Here is what the page looks like in the browser.

To speed things up a bit, I am going to include jQuery in the header so we can use some of its shortcut functions. Of course, all of this can be done in pure JavaScript.

For the action we take when the Complete button is clicked, we will create the submit function. This will take in all the values from the form, call the web service and then display the response on the screen. I added an alert and left the URL blank; this will allow for a quick test to make sure what we have is working. Later we will update the failure and success sections as well. For now, they will simply display an alert for testing.

First, let’s make sure we are getting all the inputs into the submit function. I am going to display an alert with the DataValue variable so we can see what we have.


Good, now that we know that is working, we will need to make the call to the web service. As I mentioned before, we could call the service directly, but doing so from JavaScript would make the license key visible. So we are going to leverage ASP.NET for security by creating an empty aspx page and adding a method to the aspx.cs page to handle the call. Let’s go ahead and add the aspx/aspx.cs page to the project and call it ServiceObjectsAPIHandler.

Here is our empty aspx.cs page, where we will be adding our function. It will go below the Page_Load method.

Let’s call the method CallDOTSAddressValidation3_GBM. There are two things to notice about the signature for this method. We decorate it with the WebMethod attribute so that our new method can be exposed as a web service method. And we declare it as static because page methods are stateless: an instance of the class does not need to be created.
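A rough skeleton of that signature might look like the following (the single parameter shown is just a placeholder; we will flesh out the body shortly):

// Decorated with WebMethod and declared static so the Ajax call can reach it as a page method.
[System.Web.Services.WebMethod]
public static string CallDOTSAddressValidation3_GBM(string Address1)
{
    // For the first test we simply echo the input back to the Ajax caller.
    return "You entered: " + Address1;
}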

To test things out, for now we will just take one string input and return it to the Ajax call, where we will display it. In order to do this, we will need to adjust our DataValue call in the script on the HTML page.

We will also need to add the URL and method to the Ajax call.

I also adjusted the alerts so we can see the success or fail.

Great! That worked! Now, let’s write the code in the aspx.cs page to make the call to the service. We could grab some sample code from the web site and walk through it, but we already have documentation and tutorials on how to do that. Instead, I am going to take a shortcut and grab what I need from NuGet. We offer NuGet packages for most of our services, which include best practices such as failover configuration and help speed up integration time.

To do this, open the NuGet Package Manager in Visual Studio under the Tools menu. When browsing for our services, you can type “Serviceobjects”. You should see a list of available Service Objects NuGet packages to choose from.

Select DOTSAddressValidation3US, and you will be prompted to select the project you want to install it under. Select the project and then click Install.

Once the installation is complete, a ReadMe file will automatically open that will contain a lot of information about the service. You will also notice a few things were added to your project. First, a DLL for DOTSAddressValidation3US was added to the bin folder and was referenced in your project.

Next, an AV3LicenseKey app setting was added to your web.config file with the value “WSXX-XXXX-XXXX”. You will need to replace this value with the key you received from Service Objects.

Also in the web.config you will see several endpoints were added. These will be used in the pre-packaged code to determine how to call the service based on your key.

Now that we have the NuGet package added to the project, we can use it. Next is the code that will use the DOTSAddressValidation3US DLL by loading it up with the input parameters, making the call to the service, then working with the response and finally sending the data back out to our JavaScript.

First we get the license key from the web.config file.

Then we make the call to the service. You will notice that we throw in a Boolean parameter at the end of the call to GetBestMatches for live or trial. This is an indicator that will tell the underlying process which endpoints to use. A mismatch between your license key and this Boolean flag will cause the call to the service to fail.

After we make the call to the service, we will process the response and send the data back to the JavaScript. If there is an error, we will return the error details. Otherwise, we will return a serialized version of the response object. Note that you can also just deal with the response completely here or completely in JavaScript. I mixed it up so you can see a bit of both.
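Putting those steps together, the finished web method has roughly the shape sketched below. The GetBestMatches call and the response members come from the DOTSAddressValidation3US NuGet package, so treat the exact class, method and property names here as placeholders and check the package’s ReadMe for the real API.

[System.Web.Services.WebMethod]
public static string CallDOTSAddressValidation3_GBM(string BusinessName, string Address1,
    string Address2, string City, string State, string Zip)
{
    // 1. Read the license key the NuGet package added to web.config.
    string licenseKey =
        System.Configuration.ConfigurationManager.AppSettings["AV3LicenseKey"];

    // 2. Call the service; the final Boolean flags a live (true) or trial (false) key
    //    and must match the type of key you are using. (Placeholder class/method names.)
    var response = DOTSAddressValidation3US.AV3LibraryClient.GetBestMatches(
        BusinessName, Address1, Address2, City, State, Zip, licenseKey, false);

    // 3. Send either the error details or the serialized response back to the JavaScript.
    var serializer = new System.Web.Script.Serialization.JavaScriptSerializer();
    return response.Error != null
        ? serializer.Serialize(response.Error)
        : serializer.Serialize(response);
}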

Now we will turn our attention back to the JavaScript and update our submit function. All we really have to deal with now is the response from the aspx method call. Here is the portion of the Ajax call implementing that.

Success or failure here does not mean that the address is good or not. It is really just a success or failure of the call to the aspx web method. On failure, we simply make an alert stating something went wrong. On success, we examine the response value, determine if the address was good or not and display the appropriate response. Here is the whole submit function.

 

Here is a good address.

 

And here is a bad address.

 

In the submit function, we did the most basic check to see if an address was good or not.

But many more data points from the response can be used to make any number of conclusions about an address. One thing you will notice is that AddressResponse.Addresses is an array. A check to see if multiple addresses are returned can be valuable because in certain situations more than one address may be returned. One example is when an East and a West street address are equally valid responses to an input address. In this case you may want to display both addresses and let the user determine which to use. You may want to evaluate the individual address fragments or the corrections that were made to the address.

The data points associated with the DOTS Address Validation 3 – US service can be found on our developer guide page for the service.

The following is commented out code that I added to the submit method for easy access to the various output data points.

And here is the same thing but for the Error object in the aspx page.

I hope this walk-through was helpful in demonstrating how to implement our DOTS Address Validation 3 web service with JavaScript and no ASP.NET markup tags using Visual Studio. If you have any questions, please feel free to contact Service Objects for support.

Best Practices for List Processing

List processing is one of the many options Service Objects offers for validating your data. This option is ideal for validating large sets of existing data when you’d rather not set up an API call or would simply prefer us to process the data quickly and securely. There is good reason to have us process your list: we have high standards for security and will treat a file with the utmost care.

As part of our list processing service, we offer PGP encryption for files, SFTP file transfers, and encryption to keep your data private and secure. We also have internal applications that allow us to process large lists of data quickly and easily. We have processed lists ranging from tens of thousands of records to upwards of 15 million records. Simply put, we consider ourselves experts at processing lists, and we’ll help ensure that your data gets the best possible return available from our services.

That said, a few steps can help guarantee that your data is processed efficiently. For the best list processing experience – and the best data available – we recommend following these best practices for list processing.

CSV Preparation

Our system processes CSV files. We will convert any file to the CSV format prior to list processing. If you want to deliver a CSV file to us directly, keep the following CSV preparation best practices in mind:

Processing international data – If you have a list of international data that needs to be processed, make sure the file has the right encoding. For example, if the original set of data is in an Excel spreadsheet, converting it to a CSV format can destroy foreign characters in your file. When processing a list of US addresses this may not be an issue, but if you are processing an international set of addresses through our DOTS Address Validation International service, an encoding problem like this could significantly impact your file. One workaround is to save the file as Unicode text through Excel and then set the encoding to UTF-8 with BOM through a text editor. Another option is to send us the Excel file with the foreign characters preserved and we will convert it to CSV with the proper encoding.

Preventing commas from creating unwanted columns – Encapsulating a field containing commas inside quotation marks will prevent any stray commas from offsetting the columns in your CSV file. This ensures that the right data is processed when our applications parse through the CSV file.

Use Multiple Files for Large Lists

When processing a list with multiple millions of records, breaking the file into multiple files of about 1 million records each helps our system more easily process the list while also allowing for a faster review of the results.

Including a unique ID for each of the records in your list helps when updating your business application with the validated data.

Configure the Inputs for the Service of Choice

Matching your input data to ours can speed up list processing time. For example, some lists parse address line 1 data into separate fields (e.g., 123 N Main St W would have separate columns for 123, N, Main, St, and W). DOTS Address Validation 3 currently has inputs for BusinessName, Address1, Address2, City, State and Zip. While we can certainly manipulate the data as needed, preformatting the data for our validation service can improve both list processing time and the turnaround time for updating your system with freshly validated data.

These best practices will help ensure a fast and smooth list processing experience. If you have a file you need cleansed, validated or enhanced, feel free to upload it here.

NCOA Integration Tutorial

The reality of any set of residential customer data is that, given enough time, addresses and the people living at them are bound to change. Occasionally, businesses and organizations can rely on the customer to notify them of a change of address, but when people move, this often falls by the wayside on their list of priorities.

For cases like these, accessing the USPS National Change of Address database can provide a helpful solution to ensure that mail gets delivered to the correct person. The USPS maintains a large data set of address forwarding notifications, and with the DOTS NCOA Live service, this information is right at your fingertips.

Our DOTS NCOA Live service works a bit differently than the rest of our products. Most of our other products process validation requests one at a time. With NCOA, a minimum list of 100 addresses must be sent to the service in order to start a request, and from there anywhere from 1 to 500 records can be processed at a time. To show you how it works, we’ve put together a quick step-by-step tutorial.

What You Will Need

  • C# experience
  • Visual Studio (VS 2015 is used for this example)
  • A DOTS NCOA Live license key. A free trial key can be obtained here.

Project Creation and Setup

To get started, launch Visual Studio and select File->New->Project. From here you can choose whatever project will meet your needs. For this example, we’ll be creating a very basic console application, so if you want to follow along step by step, you can choose the same project details as shown below.

Select OK and wait for Visual Studio to finish creating the project. Once that is done, right-click the project, select Add and then select Service Reference. Here, we’ll enter the URL to the WSDL so that Visual Studio will create all the necessary classes, objects and methods that we’ll use when interacting with the DOTS NCOA Live service. To do this successfully, add the necessary information into the pop-up screen as shown in the screenshot below.


Select OK. Now that the service reference is successfully set, open up the App.Config. Below is a screenshot of the App.Config that has been modified.

We’ve added the appSettings section and, within it, two key-value pairs. The first is the license key field, where you will enter your key. Storing the license key in the app or web config file can be helpful when transitioning from a trial to a live environment in your application. When you are ready to use a production key, changing it in the App.config is easier than changing a hard-coded license key.

We’ve also added the path to a CSV file that will contain the address and name information we will be sending to the NCOA service. You may not want to read in a CSV for your application, but the process of building the input elements for the service will be relatively similar. For this example, we’re just going to put the file in the bin folder of our project, but you can use any path you want for the file.

We’ve also increased the maximum buffer size in the http Binding. Since we’ll be sending a list of 100 addresses to the DOTS NCOA Live service, we’ll indeed need to increase the buffer size.

Lastly, we’ve changed the name of the original endpoint to “PrimaryClient,” made a copy of the endpoint and changed that copy’s name to “BackupClient.” Currently, both of these endpoints point to the trial Service Objects environment, but when a production key is purchased, the PrimaryClient URL should point to http://ws.serviceobjects.com/nl/ncoalive.asmx and the BackupClient should point to http://wsbackup.serviceobjects.com/nl/ncoalive.asmx.

Calling the DOTS NCOA Live web service

The first thing we’ll do is declare two static strings outside of the Main method: one for the input file path and one for the license key that we’ve placed in the App.config. Inside the Main method, we’ll instantiate an NCOAAddressResponse object called response and set it to null. We’ll also create a string called jobID and set it to null as well. This jobID will be passed as a parameter to our NCOA service call; a JobID can be seen as a unique identifier for all the records that are run against the service.
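A sketch of that setup is shown below; the appSettings key names are whatever you chose in your App.config, and NCOAAddressResponse is the type generated by the service reference.

// Requires a reference to System.Configuration for ConfigurationManager.
static string LicenseKey = ConfigurationManager.AppSettings["LicenseKey"];
static string InputFile = ConfigurationManager.AppSettings["InputFilePath"];

static void Main(string[] args)
{
    NCOAAddressResponse response = null;
    string jobID = null; // a unique identifier for all the records run against the service

    // ...reading the file, building the job and calling the service follow below...
}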

Now we’ll create the following method that will read our input file.

This method returns a List of NCOAAddress objects containing all the inputs we need to send to the service. In my particular file the fields are as follows: Name, Address, Address2, City, State, Zip. Your code will need to be modified to read the specific structure of your input file. This code reads in the file, loops through each of the lines and adds the appropriate values to the fields of the NCOAAddress object. After each line is read, we add the individual NCOAAddress object to the list called inputAddresses, and once the code has finished looping through the file we return that list.
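The sketch below shows one way to write readInputFile, assuming the column order Name, Address, Address2, City, State, Zip; the NCOAAddress property names are placeholders for the type generated by the service reference, and a simple Split will not handle quoted commas.

// Requires System.Collections.Generic and System.IO.
static List<NCOAAddress> readInputFile(string path)
{
    var inputAddresses = new List<NCOAAddress>();

    foreach (string line in File.ReadLines(path))
    {
        string[] fields = line.Split(',');
        if (fields.Length < 6) continue; // skip blank or malformed rows

        inputAddresses.Add(new NCOAAddress
        {
            Name = fields[0],
            Address = fields[1],
            Address2 = fields[2],
            City = fields[3],
            State = fields[4],
            Zip = fields[5]
        });
    }

    return inputAddresses;
}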

Now we’ll insert a try-catch block into the Main method. Within this block we’ll create a List of NCOAAddress objects and call the readInputFile method to fill it. We’ll also make a JobID with today’s date appended to the end of it. You will likely want to customize your job ID to fit your business application. Jobs close on Sunday at 11:55 PM, so that is also something to take into consideration when designing your code.

Failover Configuration

Now that we have all our inputs set up, we are able to call the NCOA web service. First, we’ll create another try-catch block to make the web service calls in. We will also create an instance of the DOTSNCOALibraryClient and pass in the string “PrimaryClient” as a parameter. This ensures that our first call to the NCOA service points to the primary URL we defined in the App.config. Next we’ll make the call to the web service using the library client and set the result equal to our response object.

After we get a response back from the service, we’ll perform our failover check. We check whether the response is null or whether an error TypeCode of 3 was returned from the service; a TypeCode of 3 indicates that a fatal error occurred in the service. If either of these criteria is met, we’ll throw an exception that will be caught by the catch block we created. Within this catch block we set the library client to a new instance with the “BackupClient” string passed to it, to ensure that we call the backup endpoint. The code should look like the following.
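Sketched out, that failover logic looks like the following; DOTSNCOALibraryClient and the endpoint names come from the earlier setup, while the operation name and parameter list shown here are placeholders for the actual NCOA Live call generated from the WSDL.

var client = new DOTSNCOALibraryClient("PrimaryClient");
try
{
    // Placeholder operation name; use the method generated from the NCOA Live WSDL.
    response = client.RunNCOALive(inputAddresses.ToArray(), jobID, LicenseKey);

    // A null response or an Error TypeCode of 3 means a fatal service error, so fail over.
    if (response == null || (response.Error != null && response.Error.TypeCode == "3"))
        throw new Exception("Fatal error returned from the primary endpoint.");
}
catch
{
    client = new DOTSNCOALibraryClient("BackupClient");
    response = client.RunNCOALive(inputAddresses.ToArray(), jobID, LicenseKey);
}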

This failover logic will ensure that your application stays up and running in the event that a Service Objects web service is unresponsive or in the event that it is not behaving as expected.

Processing the Response

Now that we have successfully called the service, we’ll need to do something with the response. In this tutorial, we’ll take the results from the service and save them as a CSV in our bin folder.

To do this, we will create and call a method named processResponse that takes an NCOAAddressResponse as an input. This method takes the outputs from the service and builds a DataTable that we will eventually turn into a CSV. This can be done as shown in the following screenshot.

 

 

Now that our output data table has been created, we’ll create and call two methods that will loop through our DataTable, convert it to a CSV string, and then write that CSV to the bin folder. The code to perform this is shown below.
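A generic sketch of those two helpers is shown below: one flattens the DataTable into a CSV string, and the other writes that string into the bin folder next to the executable (it uses System.Data, System.Linq, System.Text and System.IO).

static string DataTableToCsv(DataTable table)
{
    var sb = new StringBuilder();

    // Header row from the column names.
    sb.AppendLine(string.Join(",", table.Columns.Cast<DataColumn>().Select(c => c.ColumnName)));

    // Data rows, quoting each field so embedded commas do not offset the columns.
    foreach (DataRow row in table.Rows)
        sb.AppendLine(string.Join(",", row.ItemArray.Select(v => "\"" + v + "\"")));

    return sb.ToString();
}

static void WriteCsvToBin(string csv, string fileName)
{
    File.WriteAllText(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, fileName), csv);
}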

More information on all the elements in the response object can be found on our developer guide for the NCOA service.


Now our code is ready to run and the service is ready to test.

We’re always available for assistance, so be sure to contact us with any questions about integration for NCOA or any of our other services.

Taking Service Objects for a Test Drive

You’ve found Service Objects and you’ve read about our services, so now you want to take a test drive. There are several ways to do just that, and as I go through the options you will also get a pretty good picture of how you can integrate our services into your applications or processes. Testing can be a tedious task, delving into the unknown of a third-party process, but we make it easy for you to jump right in by giving you several ways to test, such as our Quick Lookup tool, the DataTumbler, batch processing and our real-time API services.

Quick Lookup Tool
The Quick Lookup tool is used to test one-off examples on our website. It is as simple as navigating to the Quick Lookup page, selecting the particular service you are interested in, filling out the fields and clicking submit. You’ll receive a real-time response from the service containing the results of your test.

Since our services often offer multiple operations, the Quick Lookup pages will tell you which operation is being used for the particular service in the form. If there are other operations you are interested in testing, we have you covered there as well with links to the other operations.

DataTumbler
The DataTumbler is a PC-based desktop application you can download from our site to run tests on multiple records. If you have ever used Excel then this application will be easy for you to drive. It works like a spreadsheet where you can paste multiple records for processing, in real-time.


Here are the basic steps: Choose your service, choose your operation, drop your data in and click validate. Choosing the service and desired operation is important because often it will change the input columns needed for the records to process properly. In the screenshot above you can see that there are 5 input columns designated by the yellowish cell background. Here we have inputs for Address, Address2, City, State and Zip. If your particular purposes do not require Address2, for instance, then that column can be removed by simply clicking on the “Customize Input Columns” button and removing it from the input.  You can do the same thing for the output columns as well but in that case you would need to access the “Customize Output Columns” popup.  The output columns are designated by the cells with the greenish background.

You can also add additional columns that are not predefined by the application by right clicking a column and selecting “Insert Column”.  This is handy for situations where you want additional data to stay together with your test like a unique identifier or other data unrelated to testing the service.

Once one of the validation buttons at the bottom is pressed, the DataTumbler will make requests to our services in real-time and populate the output columns as the data is being processed.

To get the application please call Customer Support at 1.800.694.6269 or access Live Chat and we will get you started.

Batch Processing
Batch processing is another way you can test drive our services.  When you have a file ready to go for testing you can simply hand it over to us.  We will process it and send back the results to you along with a summary.

This is one of the more preferred ways to test drive our services for several reasons:

  • We can see the data you send us first hand and give you feedback about what the file looks like with respect to items like formatting.
  • By seeing all the fields of data we can quickly recommend the most beneficial service.
  • It gives us an opportunity to see the results with you and go over any interesting data points from a data validation/cleansing expert point of view.

All that is needed for this is a test file containing the records you want to try out. The file can come in several formats, including txt, csv and xls, to name a few. You can send the file directly to us by email, or for a more secure option we can provide a secure FTP site for the file transfer. We can also handle working with encrypted data when an extra security layer is needed. An additional way to get us your test file is through our web site itself: you can drag and drop a file and be on your way. Once we have the file, we will process it against the service you are testing and return the results along with a summary breakdown of the processing.

If your test run is a success and you’re ready to move forward with a larger file, we can also run a one-time paid batch. Clients often use this as an initial data scrub before switching to our real-time API or automated batch system, which will run batches virtually on demand.

Integrating the API
The last way you can test our services is by implementing our API in your code. Most clients use the API when they integrate with us so testing this way gives you the closest representation of how your production process will work with our services.

When it comes to doing a direct software integration test we have you covered. We make it easy to integrate and get testing quickly by means of sample code, code snippets, step-by-step integration walk-through blogs, developer guides and NuGet for Visual Studio.

We have sample code for C#, Java, PHP, Rails, Python, NodeJS, VB.Net, Classic ASP, Microsoft SQL Server, APEX and Cold Fusion. That list does not mean we stop there: additional sample code can be requested, and our team will review the request to find the best solution for you. When applicable, our sample code is available in both REST and SOAP. All of our examples implement best practices and demonstrate failover.

If you are a C# developer and use Visual Studio, you will have us right at your fingertips. Using the NuGet Package Manager in Visual Studio, you can have our API injected into your code and ready to go.

All of our walk-through tutorial blogs and documentation are presented in a straightforward and easy-to-understand format, and as always the Service Objects team is here to assist with any questions or help you may need.

When it comes to test driving our services we give you options to make it easy. A trial key will give you access to all the options I mentioned. Going through these options also gave me a chance to show you how you can integrate with our services.  The beauty behind the way we set this system up is that you can become fully integrated and tested before you even purchase a live production key.  In all of these cases, usually only two things need to be updated when switching from trial testing to live production use.  In the Quick Lookup Tool you will need to switch the “Select Key Type” to “Customer Production Key” and then use your live production key instead of your trial key.  In the DataTumbler you will be similarly swapping those fields out as well.  When it comes to doing a code integration you just need to update your endpoints from trial.serviceobjects.com to ws.serviceobjects.com and the trial key for a live production key.

Whether you want to test one or two records or a whole file, simply put, our team has you covered.

Service Objects is the industry leader in real-time contact validation services.

Service Objects has verified over 2.5 billion contact records for clients from various industries including retail, technology, government, communications, leisure, utilities, and finance. Since 2001, thousands of businesses and developers have used our APIs to validate transactions to reduce fraud, increase conversions, and enhance incoming leads, Web orders, and customer lists. READ MORE