
Tuning your Implementation: What Impacts Speed and Throughput

How fast is fast? How fast should things be? And more importantly, what can you control to make the response time of using our services as fast as possible?

In this blog we will break down the key components of response time, to shed some light on what you can optimize to make your implementation respond faster. Whether it is a call to a Service Objects web service or an API integration in general, there are specific phases that take place: input data preparation and transmission, data processing, and results transmission and processing. For example, a simplified view of a single data validation request is as follows:

  • Your data is securely sent to the Service Objects web server to be processed
  • The data is received by our server and processed/validated
  • The validated data is then securely returned to you

Most people think of response time as the round-trip time to get your data validated – and this is indeed the primary concern – but total throughput deserves attention as well. Once a single validation runs as optimally as possible, expanding the workflow to handle simultaneous requests usually requires only modest changes.

To understand what is going on under the hood – and in particular, understand what factors may be slowing down your speed – let’s compare this process to a trip to the grocery store. It has similar phases to a data validation process: you leave your house, travel down the road, stop at the market, purchase your groceries, and then drive home. Here is how this process breaks down:

Step 1: Data preparation. This is analogous to the steps you take before leaving your home. Did you turn off the lights? Are all of the doors shut? Is the security system armed? Do you have everything on your grocery list?

Each of these steps is similar to the checks that your application goes through in order to leave your company’s internal network. The application has to gather the information to be processed, dot all of the i’s, cross all the t’s, make sure it knows where it is going, and make it through any layers of security your company has in place. Each layer takes time to complete and detracts from your overall speed.

Step 2: Input data transmission. This is like traveling down the road to the supermarket. How much traffic people encounter depends on factors such as how many lanes the road has, how many people are in each car, and how many cars are on the road. From an engineer’s perspective, this is like the concept of single threading and multithreading: if the road that you are traveling down has multiple lanes, the number of cars (e.g. API requests) can be larger before running into traffic.

By creating a multithreaded application, you can effectively widen the road and allow for greater throughput. Your choices include single car + single lane (single value request), single car with multiple people in it + single lane (smartly using one validation client to do multiple requests), multiple cars with single passengers on multiple lanes (semi-smart multithreaded validations), and multiple cars with multiple passengers on multiple lanes (really smart multithreaded apps).
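To make the analogy concrete, here is a minimal multithreading sketch in Python. The `validate_address` stub stands in for a real web service call; in production it would issue an HTTPS request to your validation endpoint and parse the response.

```python
from concurrent.futures import ThreadPoolExecutor

def validate_address(address):
    # Placeholder for a real call: a production version would send
    # an HTTPS request to the validation service and parse the
    # JSON/XML response. Here we just echo a result.
    return {"input": address, "valid": True}

def validate_batch(addresses, max_workers=8):
    # Each worker thread is one "lane": requests travel in parallel
    # instead of queuing single file behind one another.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(validate_address, addresses))

results = validate_batch([
    "27 E Cota St STE 500, Santa Barbara, CA 93101",
    "123 Main Street, New York, NY",
])
```

Tuning `max_workers` is the equivalent of choosing how many lanes to open; too many threads can overwhelm your own network or exceed service rate limits, so start modestly and measure.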

Step 3: Data processing. Once you reach the store and make your purchases, it is time to pick a checkout aisle. The number of open aisles acts much like the lanes on the road: more aisles allow a larger number of people to be processed before a queue builds up.

This part of the process is handled by Service Objects. We have spent the last 15+ years not only making sure to stock our shelves with the best quality products, but also ensuring that there are enough checkout aisles to meet the needs of our customers. In our case, we have multiple data centers (supermarkets) with load-balanced web servers (smartly managed checkout aisles) to keep all of our transactions moving through smoothly.

Step 4: Results data transmission. The return trip from the supermarket is much the same as the trip there, just in reverse: your newly purchased items (validated data) travel back through the same network layers to your application. This stage also contributes to the round-trip time required to process a request.

Step 5: Unpacking the results. When you return from the store, you still have to unpack your groceries. Likewise, once you get the data back from the service, you still need code and business logic to unpack the results and store them in a form that meets your needs. (Or, to extend our analogy further, create a delicious meal from the quality ingredients that Service Objects supplies you!)

So total processing time is a careful balance of getting through your front door, navigating down the single or multi lane highway, checking out with your groceries, and then making the return trip home. From an API-call perspective, improvements to your speed start with your network, smart choices when writing your integration code, and juggling the requirements put in place by your business logic. And if you are looking to increase your throughput, a multithreaded application or distributed system will often increase your capacity.

Finally, one more analogy (we promise!). Good grocery stores have staff who can help you make the most of your shopping trip. Likewise, Service Objects has an industry-leading team of technical support experts who know their stuff, and are always happy to help you make the most of your implementation. How can we help you? Contact us any time to discuss your own specific needs.

Why Data Quality Isn’t a Do-It-Yourself Project

Here is a question that potential customers often ask: is it easier or more cost-effective to use data validation services such as ours, versus building and maintaining in-house capabilities?

We are a little biased, of course – but with good reason. Let’s look at why our services are a lot easier to use versus in-house coding, and save you money in the process.

Collecting and using information is a crucial part of most business workflows. This could be as simple as taking a name for an order of a burger and fries, or as complex as gathering every available data point from a client. Either way, companies often find themselves with databases comprised of client info. This data is a valuable business asset, and at the same time often the subject of controversy, misuse, or even data privacy laws.

The collection and use of client information carries an important responsibility when it comes to security, utility, and accuracy. If you find yourself tasked with keeping your data genuine, accurate, and up-to-date, let’s compare the resources you would need for creating your own data validation solution versus using a quality third-party service.

Using your resources

First, let’s look at some of the resources you would need for your own solution:

Human. The development, testing, deployment, and maintenance of an in-house solution requires multiple teams to do properly, including software engineers, quality assurance engineers, database engineers, and system administrators. As the number of data points you collect increases, so does the complexity of these human resources.

Financial. In addition to the inherent cost of employing a team of data validation experts, there is the cost of acquiring access to verified and continually updated lists of data (address, phone, name, etc.) that you can cross-validate your data against.

Proficiency. Consider how long it would take to develop necessary expertise in the data validation field. (At Service Objects, for example, our engineers have spent 15+ years increasing their proficiency in this space, and there is still more to learn!) There is a constant need to gain more knowledge and translate that into new and improved data validation services.

Security. Keeping your sensitive data secure is important, and often requires the services of a dedicated team. Worse still is the cost of failing to take the necessary steps to ensure privacy, which can lead to legal trouble. Some environments need as much as bank-grade security to protect their information.

Using our resources

The points above are meant to show that any data validation solution, even an in-house one, carries costs with it. Now, let’s look at what you get in return for the relatively small cost of a third-party solution such as ours for data validation:

Done-for-you capabilities. When you use services such as our flagship Address Validation, Lead Validation, geocoding, or others, you leverage all the capabilities we have in place: CASS Certified™ validation against continually updated USPS data, lead quality scores computed from over 100 data points, cross-correlation against geographic or demographic data, global reach, and much, much more.

Easy interfacing. We have multiple levels of interfacing, ranging from easy to really easy. For small or well-defined jobs, our batch list processing capabilities can clean or process your data with no programming required. Or integrate our capabilities directly into your platform using enterprise-grade RESTful API calls, or cloud connectors interfacing with popular marketing, sales, and e-commerce platforms. You can also spot-check specific data online – and sample it right now if you wish, with no registration required!

Support and documentation. Our experts are at your service anytime, including 24/7/365 access in an emergency. And we combine this with extensive developer guides and documentation. We are proud of our support, and equally proud of all the customers who never even need to contact us.

Quality. Our services come with a 99.999% uptime guarantee – if you’re counting, that means just over five minutes of downtime per year. We employ multiple failover servers to make sure we are here when you need us.

We aren’t trying to say that creating your own in-house data validation solution can’t be done. But for most people, it is a huge job that comes at the expense of multiple company resources. This is where Service Objects comes in, serving over 2,500 companies – including some of the nation’s biggest brands, like Amazon, American Express, and Microsoft.

The combination of smart data collection/storage choices on your end and the expert knowledge we’ve gained over 15 years in the data validation space can help to ensure your data is accurate, genuine and up-to-date. Be sure to look at the real costs of ensuring your data quality, talk with us, and then leave the fuss of maintaining software and security updates and researching and developing new data validation techniques to us.

Data Collection – Getting It Right

As a data validation company, we think a lot about how information is collected and stored – and we know that the right approach to data collection can ensure that you start off on the right foot. This blog shares some expert tips for channeling your data into your company’s data stores, in ways that give you the best chance to have this data corrected, standardized, and validated.

Let’s imagine a few of the most common data entry points. At the top of the list we find web forms, compiled datasets, and manual entry from locations such as Point-of-Sale (POS), phone calls, and written transcriptions. Each type of data entry is unique from an interface perspective, but there are still two key rules you can use to limit the chance for errors or garbage. Here they are:

  1. Divide and conquer

    First, break down your data into its simplest components. By collecting data in its simplest form you can ensure there is no confusion between what the data is supposed to represent and other fields. This is a principle we call separation of data, where each field only contains the most relevant data for its type.

    In the best case, you have a collection of very specific fields with unique data types – as opposed to the worst case, where you have a group of identical fields collecting your data. Could a user enter, say, his grandmother’s name in the ZIP code field? If so, you still have some refinement to do here.

  2. Set data constraints

    Next, determine what characters are relevant to each specific field. For example, if you are collecting “Age” data, you know that the data field should be a number. It doesn’t make sense to allow for anything other than the numbers 0-9 (and please don’t get cute and allow words such as “thirty-five”). Extending your requirements further, you could even limit the “Age” field to positive numbers to prevent someone from claiming they are -5 years old.

Now, let’s put these two rules to work for two of the most common types of contact data: delivery addresses and email addresses.

United States Delivery Addresses

This is a case where a basic understanding of the data you are collecting will take you a long way. Here we are going to examine the anatomy of a typical US address, as shown in Wikipedia, and then split the individual components into different fields.

https://en.wikipedia.org/wiki/Address_(geography)#United_States

From this US address example, we can see the recommended USPS format. Within the address, we have the house number and street name (Address1), the name of town (City), the state abbreviation (State), and the ZIP+4 (ZIP/PostalCode).

Here are some good and bad examples of address data:

Ideal Input

Field | Content | Description
Address1 | 27 E Cota St STE 500 | Contains a mix of alphanumeric data pertaining to the mailbox # and street
Address2 | c/o John Smith | Non-critical deliverability information
City | Santa Barbara | Contains only alpha characters
State | CA | Standard 2-character representation of a state, chosen from a list of acceptable values
ZIP | 93101 | 5-digit number without extraneous characters

Address 2 Used Incorrectly

Field | Content | Description
Address1 | 27 E Cota St STE 500 | No issues
Address2 | Santa Barbara, CA 93101 | City, state, and ZIP are not separated

Allowing Abbreviations and Local Language

Field | Content | Description
Address1 | 27 E Cota St | No issues
City | SB | Abbreviating the city name is not ideal. Spell out the full name or use the ZIP code to determine and populate this field
State | Cali | Informal spelling instead of California or CA. The state should be selected from a data set
ZIP | 93101abc | Should be 5 digits with no alpha characters
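These per-field constraints can be sketched in a few lines. The checks and the truncated state list below are illustrative only:

```python
import re

US_STATES = {"CA", "NY", "TX"}  # truncated for illustration; a real list has all state codes

def check_address_fields(city, state, zip_code):
    """Return a list of constraint violations for the cases shown above."""
    problems = []
    # City should contain only alpha characters (plus spaces and a few marks).
    if not re.fullmatch(r"[A-Za-z .'-]+", city):
        problems.append("City should contain only alpha characters")
    # State must come from a fixed list of 2-character codes.
    if state not in US_STATES:
        problems.append("State must be a standard 2-character code from a list")
    # ZIP should be exactly 5 digits with no alpha characters.
    if not re.fullmatch(r"\d{5}", zip_code):
        problems.append("ZIP should be 5 digits with no alpha characters")
    return problems
```

Note that a constraint check cannot catch everything: “SB” passes the alpha-only city rule even though it is an abbreviation, which is exactly where a downstream validation service earns its keep.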

Email addresses

Now let’s turn to the format of an email address, again from Wikipedia:

https://en.wikipedia.org/wiki/Email_address

Emails are a bit trickier, but constraints can still be placed at the point of entry. The set of allowed characters in an email is far more expansive than that of, say, postal codes, but there are still limitations. By putting these restrictions in place you can cut out some of the garbage that would otherwise be submitted.

In this case, you could examine the individual elements of an email and place restrictions on each. Since an email must consist of a local-part, “@” symbol, and a domain, it is safe to restrict your collected data to only email inputs that conform to the anatomy of an email. For example, an address such as “Input 1: 123ABCSt.foobar@domain.com Thirty-Five” does not end with a legal domain identifier, and also contains illegal spaces within the address and after the closing period.
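A deliberately simplified structural check is sketched below. It only enforces the local-part/@/domain anatomy and rejects spaces; a full RFC 5322 validator is far more involved, and real-time email verification services go further by checking actual deliverability:

```python
import re

# Simplified shape check: non-space local part, "@", a domain with at
# least one dot, ending in a letters-only label. This is a coarse
# filter, not a complete email validator.
EMAIL_SHAPE = re.compile(r"[^\s@]+@[^\s@]+\.[A-Za-z]{2,}")

def looks_like_email(value):
    return bool(EMAIL_SHAPE.fullmatch(value))
```

This rejects inputs like the garbled example above (which contains spaces and does not end in a legal domain label) while letting well-formed addresses through for deeper validation.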

General Contact Record Data Collection

The same techniques can be applied to all of the data that you collect. Some fields may be easier to apply constraints to, but any attempt to filter acceptable input and organize the data into manageable sections will benefit you down the line. We specialize in data validation and have spent over 15 years refining our methods for interpreting, standardizing, and validating data. If data is gathered in component parts we can at least gain context and, even if the data isn’t correct, apply our specialized knowledge to try and validate this information.

Finally, give some thought to how your data will ultimately be stored in a data store or database. Since this is the bottom level, it is crucial for database administrators to make educated choices about these data fields. If constraints aren’t placed at the database level, the information you have collected could be rendered useless. Smart choices in data type, accepted data length, and constraints can help ensure that your data is stored in its most sensible form.
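To illustrate constraints at the database level, here is a small SQLite sketch; the table and column definitions are illustrative, not a recommended schema:

```python
import sqlite3

# Enforce constraints at the bottom level, so garbage cannot be
# stored even if it slips past the collection layer.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE contacts (
        city  TEXT NOT NULL,
        state TEXT NOT NULL CHECK (length(state) = 2),
        zip   TEXT NOT NULL CHECK (zip GLOB '[0-9][0-9][0-9][0-9][0-9]')
    )
""")

# A clean record is accepted...
conn.execute("INSERT INTO contacts VALUES ('Santa Barbara', 'CA', '93101')")

# ...while a record violating the CHECK constraints is rejected.
try:
    conn.execute("INSERT INTO contacts VALUES ('Santa Barbara', 'Cali', '93101abc')")
except sqlite3.IntegrityError:
    pass  # the bad row never reaches the table
```

Whatever database you use, the principle is the same: pick the narrowest data type, length, and constraint the field allows.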

As with most things in life, an ounce of prevention is worth a pound of cure when it comes to collecting and maintaining contact data accuracy. The techniques described above, in conjunction with Service Objects web services, will help provide you with the most genuine, accurate, and up-to-date data possible.

Best Practices for FastTax

Service Objects’ DOTS FastTax service is now more powerful than ever – particularly for areas that have multiple tax jurisdictions. This means that a few tweaks to your implementation can do a lot more of the sales and use tax calculation work for you. So here are some pro tips to elevate your tax game:

Determine where you have “nexus”

In order to hit the ground running, you will first want to look into your own business’ tax requirements. Often this is done by determining states where your business has sufficient physical presence to require the collection of sales tax, also known as “nexus”, and then charging the appropriate tax based on that information. After determining where you will need to charge tax, and what types of taxes are involved (such as sales or use tax), it is time to head over to our FastTax API.

Search for tax rates by address

FastTax gives you the ability to search tax rates by city and state; city, county, and state; ZIP code; or (for Canadian addresses) by province. In addition, the current version gives you the ability to search tax rates by address, using an operation called GetBestMatch.

To get the most out of your FastTax subscription, we recommend using this latest operation, which will take your address and return the best available tax rate match. This service will even go as far as returning a ZIP code level match when the specific address level information is not available. In addition to the city/state/county/special tax rates that are returned, we will provide you with information about the location’s unincorporated status, discussed next.

Account for unincorporated areas

If you are interested in properly accounting for areas within a city that are unincorporated, we have you covered. We offer a flag called IsUncorporated, which will have either a true or false value. If the flag is set to true, all you have to do is take the total tax rate and subtract any existing city and city district rates. You will be left with the unincorporated address’ accurate tax rate.
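The adjustment described above amounts to one subtraction. The sketch below uses hypothetical parameter names for the rates involved, not the service’s actual output field names:

```python
# Sketch of the unincorporated-area adjustment. Rates are expressed
# as decimal fractions (e.g. 0.0875 for 8.75%).
def unincorporated_rate(total_rate, city_rate, city_district_rate, is_unincorporated):
    if is_unincorporated:
        # Unincorporated areas fall outside city limits, so the city
        # and city district portions of the total do not apply.
        return round(total_rate - city_rate - city_district_rate, 6)
    return total_rate

# e.g. an 8.75% total with a 1.0% city rate and 0.25% city district rate
# yields a 7.5% rate for an unincorporated address.
rate = unincorporated_rate(0.0875, 0.01, 0.0025, True)
```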

With all of these pieces of information in hand, you will now be able to select the rates that are relevant to your business and apply the specific sales and use tax rates that fit your needs.

Let us do the work for you

With the latest version of FastTax, you can harness a real-time tax rate lookup service based on contact addresses that are validated, geocoded, and then matched with a corresponding tax rate. This means that the quickest way to get accurate tax information is now to look up tax rates by address.

Compared with searching for tax rates by city, county or ZIP code, searching by address takes advantage of the deep-rooted integration between FastTax and our address validation and geocoding engines. This is particularly important because tax rates can vary within a municipality or ZIP code, based on factors such as incorporation status, so letting us validate and geocode your contact addresses helps us provide you with the most accurate tax rate data.

While the saying “garbage in, garbage out” still applies, this service will also do its best to correct your input to its proper form. Minor spelling mistakes, standardizations, or even non-deliverable addresses can sometimes be geocoded or corrected to provide you an accurate tax rate for your input address. In total, the combination of your predetermined nexus information and our tax rate lookup service allows for a quick and seamless workflow for your business.

As always, we are here to answer any questions that you may have. If you would like clarification on any aspect of our FastTax web service, please feel free to reach out to us. Our sales and support teams are here to find a solution that works for your business. And if you are a current customer and find a rate that is incorrect, shoot us an email and we will get the discrepancy addressed immediately.

NCOA Live Best Practices for Contact Address Validation


If you want to use our National Change-of-Address web service, DOTS NCOA Live, for contact address validation but are hesitant to dive in due to the complex nature of the service, this article is meant to set your worries aside. This blog will serve as a comprehensive guide to getting the most out of your NCOA Live subscription while addressing common questions, pitfalls, and recommended workflows. Additional information can also be found in our NCOA Live developer guide. With NCOA Live, businesses can easily update address information to maintain accurate and up-to-date contact records by accessing the USPS dataset of mail forwarding notifications.

Filling Out the Processing Acknowledgement Form

Before you can begin using NCOA Live, the USPS requires you to complete their simple Processing Acknowledgement Form (PAF) to access change-of-address data. Most of the fields in this form will be straightforward. You can look up your North American Industry Classification System (NAICS) code and business address here. To ensure correct PAF filing, we recommend using the USPS lookup tool to confirm your address and some of the additional details that the PAF requires.


Ensuring that your address has a ZIP+4 and a DPV Confirmation Indicator of “Y” will prevent any issues in the filing process.

Getting Your License Key and Service Endpoints

After you successfully file the PAF, we will provide a license key and the service endpoint. These items will enable requests to the NCOA Live web service to check for change-of-address records. Due to the flexible nature of our services, NCOA Live is accessible from almost any tool or programming language that can make a web service call. Specific coding examples for the service can be found in our developer guide’s sample code section.

We have sample code in most of the popular programming languages, including PHP, Java, Ruby, Python, ColdFusion, and C#, just to name a few. We can also provide customized code if needed, and our Application Engineering team would be happy to answer any questions you may have about integration and programming language-specific concerns.

Handling JobID Creation

Arguably the most challenging aspect of the NCOA Live web service is the USPS requirement that submissions for change-of-address lookups include an open JobID. The JobID links to your account and keeps track of the transactions you run. Each new JobID remains valid for one week, expiring at 11:50 pm Sunday evening. Opening a new JobID requires the following:

  1. Building an array of 100-500 addresses (100 minimum to create a job)
  2. Creating a personalized JobID (alpha-numeric string of fewer than 50 characters)
  3. Submitting the addresses, JobID, and license key to the “RunNCOA Live” operation

After submitting the initial 100-500 records for the current week’s JobID, anywhere from 1-500 records can be processed per batch. Every transaction run during that week will operate under this new JobID. At the end of the week, the JobID closes, and we update the internal change-of-address data that powers our service. The following week, another NCOA Live operation can be initiated with a new JobID following the steps listed above.
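The weekly workflow above can be sketched as follows. The `make_job_id` and batching helpers are illustrative, and the actual submission call to the “RunNCOA Live” operation is omitted:

```python
import uuid

def make_job_id():
    # An alphanumeric string of fewer than 50 characters; using a
    # UUID hex string keeps weekly JobIDs unique. (35 chars total.)
    return "job" + uuid.uuid4().hex

def batches(addresses, first_min=100, batch_max=500):
    """Yield batches of at most 500 addresses for one week's JobID."""
    # The first batch of a new job must contain at least 100 records.
    if len(addresses) < first_min:
        raise ValueError("first batch of a new job needs at least 100 addresses")
    for i in range(0, len(addresses), batch_max):
        yield addresses[i:i + batch_max]

job_id = make_job_id()
chunks = list(batches([f"address {n}" for n in range(1200)]))
# 1200 addresses -> batches of 500, 500, and 200, all run under job_id
# until the job closes Sunday evening.
```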

Checking for Errors and Parsing the Response

The first step to safely parsing the response is to check for any root level errors. Root level errors are largely uncommon and generally related to issues with the service or license key. If root errors appear, please don’t hesitate to contact Service Objects and we will work with you to resolve them. If there are no root level errors, you can start working with the valid response data.

The NCOA Live response returns a result with multiple nested fields. See table below for the response fields and a brief description.

RunNCOA Live Outputs

Parent Object | Child | Values | Description
 | NameIn | Varies | The raw input name.
RawInputAddress | Address | Varies | The raw input address line.
 | Address2 | Varies | The raw input address2 line.
 | City | Varies | The raw input city.
 | State | Varies | The raw input ZIP code's state.
 | Zip | Varies | The raw input ZIP code.
CASSInputAddress | Address | Varies | The standardized address line.
 | Address2 | Varies | The standardized secondary line.
 | City | Varies | The standardized city name.
 | State | Varies | The standardized state.
 | Zip | Varies | The standardized ZIP+4.
 | USPSFootnotes | Varies | A concatenated string of relevant 2-digit USPS "Footnote" codes that give additional information about the input address.
NCOAMatch | NameMatch | Varies | The name that matched the COA record.
 | Address | Varies | The primary address line that the resident moved to.
 | Address2 | Varies | The secondary address line.
 | City | Varies | The city name.
 | State | Varies | The state abbreviation.
 | Zip | Varies | The ZIP+4.
 | CarrierRoute | Varies | The Carrier Route code for the COA address.
 | BarcodeDigits | Varies | The PostNet barcode for the COA address.
 | COAFound | | Whether or not a match was found in the COA data. Does not imply that a valid address could be found.
 | NCOAReturnCode | Varies | The USPS's NCOALink Return Code providing additional information about the nature of the COA match.
 | NCOAReturnCodeDesc | Varies | Short English description of the COA information. Longer descriptions found below.
 | ExtendedNCOAReturnCode | (See below) | USPS's Extended NCOA Return Code comprising a series of key/value strings.
Diagnostics | DiscountCode | 1-4 | A code representing the discount level.
 | DiscountDescription | (See below) | An English description of the discount level.
 | StatusCode | 2-8 | A code representing the level of quality of the input address post-validation. Higher is better.
 | StatusDescription | (See below) | An English description of the level of quality of the input address post-validation.
 | ServiceFlags | Varies | USPS Service Flags output explaining what additional address services were run, such as RDI, eLOT, etc.
Error | Type | Varies | English description of the error type. See "Error Codes" below.
 | TypeCode | 1, 2, 3, 4 | Unique error type code. See "Errors" below.
 | Desc | Varies | English description of the error. See "Errors" below.
 | DescCode | Varies | Unique code for the error. See "Errors" below.
 | JobID | Varies | The JobID sent to the service.

The response data comes back as a list of results corresponding to the addresses submitted. If specific address errors are detected at this level, they fall under our Domain Specific errors and apply to individual addresses. Reading the error’s description provides insight into why the service was not able to validate or return change-of-address information. Detailed notes about the individual error codes are available in the developer guide and can be seen in the table below.

Error Type 4: Domain Specific

DescCode | Description | Notes
1 | Job not found for this License Key. | The job does not exist. Please try again with a different job id. *
2 | Job has been closed. | The job can no longer be used. Please try again with a new job id. *
3 | First transaction of a job must contain 100 records or more. | Please try again with at least 100 unique and valid addresses. *
4 | Issue connecting to NCOA engine. | Please try again. If the issue persists then please contact technical support. *
5 | Street not found. | Indicates that the street name was not found for the general area (city/state or ZIP).
6 | Address not found. | Indicates that a reliable address candidate was not found. Portions of the address may be incorrect or it may be too ambiguous to return a reliable candidate.
7 | Street number or box number out of range. | The address is invalid. The street and area appear to be correct but the number is wrong.
8 | Multiple addresses match. | Indicates that multiple candidates were found that are equally likely given the input.
9 | Insufficient address data. | Indicates that a reliable address candidate was not found. Portions of the address may be missing or incorrect.
10 | DPV Lockout. Contact Service Objects immediately. | Returned for a specific type of address case known as a false positive.
11 | Request cannot contain more than 500 addresses. | Please try again with no more than 500 addresses in a single request. *
12 | License Key is not linked to a valid PAF Id. | Please contact Service Objects and complete a USPS NCOA Processing Acknowledgement Form (PAF) to register your license key with the service. *
13 | Performing weekly NCOA data update. Please try again in a few minutes with a new Job Id. | USPS releases new NCOALink data every week and requires that we use the newest data, so we must close all jobs using the older dataset. *
14 | Expired PAF agreement. Please contact Service Objects. | Your USPS NCOA Processing Acknowledgement Form (PAF) has expired. Please contact Service Objects to renew and continue using the service. *
15 | Unable to create new NCOA Job. Please try again. If the problem persists then please contact Service Objects. | There was a problem creating the new job. Please contact Service Objects and notify technical support of the error. *

* This is not a billable error and it will not count as a transaction against the license key.

The flexible framework of NCOA Live’s outputs allows you to integrate the results into your application to best meet your needs. We recommend exploring the various outputs such as the RawInputAddress, CASSInputAddress, and likely the most relevant information, the NCOAMatch. Because it delivers new address information in real-time, the NCOA Live service can be easily integrated into existing workflows and databases.
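As a sketch of such an integration, the helper below sorts a decoded response into moved, unchanged, and failed records. The key names follow the output table in this article, but treat the exact response shape as an assumption and verify it against the developer guide:

```python
# Hypothetical triage of decoded NCOA Live results (e.g. parsed JSON).
def triage(results):
    moved, unchanged, failed = [], [], []
    for record in results:
        error = record.get("Error")
        if error:
            # Domain Specific (Type 4) errors apply to one address;
            # the Desc field says why it could not be processed.
            failed.append((record["RawInputAddress"], error["Desc"]))
        elif record.get("NCOAMatch", {}).get("COAFound"):
            # A change-of-address match: the NCOAMatch block holds
            # the address the resident moved to.
            moved.append(record["NCOAMatch"])
        else:
            # No COA record; keep the standardized (CASS) address.
            unchanged.append(record["CASSInputAddress"])
    return moved, unchanged, failed
```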

Maintain Better Mailing Lists with Easy Contact Address Validation

The USPS National Change-of-Address database provides a valuable resource for organizations who depend on up-to-date contact data. NCOA Live leverages the USPS dataset of forwarding notifications with a flexible API interface to provide you with the latest address information for clients and prospects. More details on all the elements of our NCOA Live service are available in our developer guide. And we are always here to help you with any questions or integration challenges you may encounter. Don’t hesitate to reach out to us today!

 

Integrating our address validation services into your applications should be as painless as possible. A technical consultation can provide insight into the data fields you submit, and into the connections between validated data and the goals and requirements of your business.

DOTS Address Validation 3: Understanding the Tools at Your Disposal

Here at Service Objects, we have a team of engineers standing by to help you get the most out of our data validation services. Our goal is to understand your business needs and use the expert knowledge we have gained from over 15 years in the data validation business, to help meet these needs. We provide in-depth developer guides and offer complimentary technical consultation to help your developers leverage our services in the most efficient and productive manner possible. We also want to make sure that you understand the full power and utility of our services and the best ways to integrate them. Understanding how our services work and what they return makes integration simpler and ensures you are using them optimally.

Before we get started, it is important to know that many of our services have multiple operations. These operations have been created to meet different business needs within the services’ overall purpose. For example, DOTS Address Validation 3 has eight different operations, from full address validation and correction to parsing address elements into fragments. At first, it might seem overwhelming to discern which operation is right for your needs, but once the differences are understood, we can choose the best operation to meet them. For our example, we will use the recommended operation, GetBestMatches, for Address Validation 3 as this satisfies most of our customers’ address validation needs. So, let’s dive in.

More Inputs = Better Address Verification

Because humans are not perfect, the data coming into the service can vary wildly in terms of format. Address Validation 3 takes these varying inputs, standardizes them, and then verifies the address. It also cross-references the optional input, BusinessName, with the address data provided to return the most accurate result. Address Validation 3 uses either the Postal Code or the City and State inputs to complete its analysis. Ultimately, the more inputs you can provide, the more cross-referencing can be performed, and the more accurately the address can be corrected and validated.

Below are the input fields from our recommended operation, GetBestMatches. Some fields can be understood by their name alone, while others may require reading the description to get a better understanding of what they accept.


GetBestMatches Inputs

Name Type Description
BusinessName String Name of business associated with this address. Used to append Suite data.
Address String Address line of the address to validate.
For example, “123 Main Street”.
Address2 String This line is for address information that does not contribute to DPV coding an address. For example, “C/O John Smith” does not help validate the address, but is still useful in delivery.
City String The city of the address to validate.
For example, “New York.”  The city isn’t required, but if one is not provided, the Zip code is required.
State String The state of the address to validate.  For example, “NY.”  This does not need to be abbreviated; full state names will work as well.  The state isn’t required, but if one is not provided, the Zip code is required.
PostalCode String The zip code of the address to validate.  A zip code isn’t required, but if one is not provided, the City and State are required.
LicenseKey* String Your license key to use the service.
Sign up for a free trial key at https://www.serviceobjects.com/products/address-geocoding/usps-address-validation


Usually, your inputs are largely fixed based on the data you are collecting, but the outputs can vary based on the operation being used. Understanding the outputs available and how you want to use them ensures you are leveraging the service to its fullest extent. Below is a table showing the available output fields from our recommended operation, GetBestMatches. As you will see, some fields can be understood by their Name alone, while others may require further explanation to better understand what they return. These explanations are provided in the Description field.


GetBestMatches Outputs

Name Type Values Description
Addresses Address[] Varies The corrected address candidates.
IsCASS String “true” or “false” Indicates if the unaltered input address is CASS certified. See “What is CASS?” below for more information.
Error Error Varies Error object indicating why the service could not return a result. See “Errors” below for more information.

Address

Name Type Values Description
Address1 String Varies The corrected address line 1.
Address2 String Varies The corrected address line 2.
City String Varies The corrected city name.
State String Varies The corrected state name.
Zip String Varies The corrected zip code + 4.
IsResidential String “true” or “false” Indicates if the address is for a residence.
DPV* String 1-4 Number that correlates to a DPV (Delivery Point Validation) result. An indicator displaying whether or not the address is recognized as deliverable by the USPS.
DPVDesc* String Varies Explains DPV result.
DPVNotes* String Varies Number that correlates to DPV notes description. Service Objects may add or change Note descriptions, but will never modify existing codes.
DPVNotesDesc* String Varies Details about the DPV result. Service Objects may add or change Note descriptions, but will never modify existing codes.
Corrections* String Varies Number that correlates to a Corrections Description. Service Objects may add or change Correction descriptions, but will never modify existing codes.
CorrectionsDesc* String Varies Description of what was corrected in an address. Service Objects may add or change Correction descriptions, but will never modify existing codes.
BarcodeDigits String Varies The post office delivery barcode digits.
CarrierRoute String 4 chars 4 chars: 1 for the route type, 3 for the route code. Identifies a group of addresses when prepended by 5-digit Zip.
CongressCode String Varies The congressional district number of the given address.
CountyCode String Varies The county code of the given address.
CountyName String Varies The name of the county in which the given address lies.
FragmentHouse String Varies The parsed house number of the given address.
FragmentPreDir String Varies The parsed pre-directional of the address’s street.  “North” in “North Main St West.”
FragmentStreet String Varies The parsed name of the street in the given address.  “Main” in “North Main St West.”
FragmentSuffix String Varies The parsed suffix of the street in the given address.  “St” in “North Main St West.”
FragmentPostDir String Varies The parsed post-directional of the address’s street.  “West” in “North Main St West.”
FragmentUnit String Varies The parsed unit type (e.g. “Apt” or “Ste”)
Fragment String Varies The parsed box, apartment, or unit number. Same as FragmentPMBNumber.
FragmentPMBPrefix String Varies The parsed type of the apartment, box, unit, etc.  For example, “APT” or “BOX.”
FragmentPMBNumber String Varies The parsed apartment, box, unit, etc. number of the given address.
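To make these output fields concrete, here is a minimal sketch of consuming a parsed JSON response. The payload below is hypothetical, with field names drawn from the tables above; consult the developer guide for the exact response schema.

```python
# Sketch only: the sample payload is hypothetical, with field names
# taken from the GetBestMatches output tables above.
import json

sample_response = json.dumps({
    "Addresses": [{
        "Address1": "123 N Main St",
        "City": "Anytown",
        "State": "CA",
        "Zip": "93101-1234",
        "DPV": "1",
        "IsResidential": "false",
    }],
    "IsCASS": "true",
})

def best_candidate(raw):
    """Return the first corrected address candidate, or None on error."""
    data = json.loads(raw)
    if data.get("Error"):            # service could not return a result
        return None
    candidates = data.get("Addresses") or []
    return candidates[0] if candidates else None

address = best_candidate(sample_response)
if address:
    print(address["Address1"], address["City"], address["Zip"])
```

A real integration would branch on the Error object first, then apply your own business logic to the candidate list.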


Effective Address Validation Begins with Painless Integration

Integrating your applications with Service Objects should be as painless as possible. Our developer guides cover the many different services and operations we offer while providing sample code in most of the major programming languages. And if we don’t have what you need, just ask; it is what we are here for.

Some Common Questions We See About Address Validation

Due to the broad nature of our products, the consultations we have with our clients vary depending on need. For beginners, we recommend starting with our online guide, Getting Started with Service Objects. This reference manual includes information about each of our individual services and can help guide you through your Address Validation 3 integration.

While our knowledge extends beyond the FAQ section in our developer guides, the technical questions we receive cover many topics, from third-party plugins to concurrency and multi-threaded applications, failover configurations, and response parsing and interpretation.

Below are some questions that we have received while helping clients integrate with our Address Validation 3 service. The inquiries arise from common business requirements and may even help answer questions you have about your integration.


Q: How will I know if an address that is validated is deliverable?

A: Delivery Point Validation (DPV) – The DPV codes are extremely useful in determining the deliverability of your address. They are broken down into four different codes:

DPV Code Description
1 Yes, the input record is a valid mailing address
2 No, the input record is not in the DPV database of valid mailing addresses
3 The apartment or rural route box number is not valid, although the house number or rural route is valid
4 The input record is a valid mailing address but is missing the apartment or rural route box number


General workflows revolve around accepting addresses with DPV Code 1 and discarding those with DPV Code 2. A DPV Code 1 indicates that if you sent mail to the address via the United States Postal Service, it would be delivered. DPV Code 2 means the address was deemed not deliverable, and mail would not successfully arrive at that address. DPV Codes 3 and 4 are both indicators that some piece of information was missing on the input, and you will want to create business logic to determine how these cases are handled. For cases like this, we find that many clients will prompt their end user for a corrected house number or apartment/suite/rural route box number, possibly in real time.
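This workflow can be sketched as a simple dispatch on the DPV code. The action names below are illustrative placeholders for your own business logic:

```python
# Sketch of business logic keyed off the DPV codes in the table above.
# The action names are placeholders, not part of the service.

def route_by_dpv(dpv_code):
    """Map a DPV result code (returned as a string) to an action."""
    actions = {
        "1": "accept",       # valid, deliverable mailing address
        "2": "discard",      # not in the DPV database of valid addresses
        "3": "prompt_unit",  # box/apartment number invalid; ask the user
        "4": "prompt_unit",  # box/apartment number missing; ask the user
    }
    return actions.get(dpv_code, "review")

print(route_by_dpv("1"))  # accept
```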


Q: How do I ensure my service requests are successfully processed?

A: Implementing Failover Logic – All of our sample code highlights our recommended best practices and procedures. We show how to code for service outages at our primary data centers, and how to fail over to our backup servers. With that said, we guarantee 99.999% uptime, ensuring the web services are available to you at all times. With proper failover logic in place, failing to get back validated data is a non-issue.
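A minimal failover sketch might look like the following. The endpoint URLs are placeholders, not the actual Service Objects data centers; our sample code and developer guides list the real primary and backup hosts:

```python
# Failover sketch. The endpoint URLs are placeholders -- substitute
# the real primary and backup hosts from the developer guide.
import urllib.request

ENDPOINTS = [
    "https://primary.example.com/AV3/api.svc/GetBestMatchesJson",
    "https://backup.example.com/AV3/api.svc/GetBestMatchesJson",
]

def call_with_failover(query, fetch=None, timeout=5):
    """Try each data center in order; raise only if every one fails.

    `fetch` is injectable for testing; by default it performs a real
    HTTP GET against the endpoint."""
    if fetch is None:
        def fetch(url):
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read().decode("utf-8")
    last_error = None
    for base in ENDPOINTS:
        try:
            return fetch(base + "?" + query)
        except Exception as exc:
            last_error = exc         # remember the failure, try the backup
    raise RuntimeError("all data centers unreachable") from last_error
```

In production you would also inspect the response body for an error object before treating the call as successful.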


Q: I have validated all of my records, but now I want to remove duplicates. Is there an easy way to do this?

A: BarcodeDigits – This is a little nugget of gold hiding in plain sight. From a programmer’s point of view, the barcode is a perfect primary key for an address. The barcode digits are a unique identifier for a premise. You can use this field instead of doing difficult string comparisons against individual address fragments.
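As a sketch, de-duplication with BarcodeDigits reduces to a simple set membership test. The records below are hypothetical GetBestMatches outputs:

```python
# Sketch: de-duplicating validated records using BarcodeDigits as a key.
# The record contents here are hypothetical.

def dedupe_by_barcode(records):
    """Keep the first record seen for each barcode."""
    seen = set()
    unique = []
    for rec in records:
        key = rec.get("BarcodeDigits")
        if key and key in seen:
            continue             # duplicate premise; skip it
        if key:
            seen.add(key)
        unique.append(rec)       # keep records without a barcode for review
    return unique

records = [
    {"Address1": "123 Main St", "BarcodeDigits": "931011234990"},
    {"Address1": "123 Main Street", "BarcodeDigits": "931011234990"},
    {"Address1": "456 Oak Ave", "BarcodeDigits": "931015678220"},
]
print(len(dedupe_by_barcode(records)))  # 2
```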


Q: We do direct mailings to the addresses we collect, but the shipping company places a character limit on the address fields. Can we use the Address Validation 3 service to adhere to the shipping company’s limitation?

A: Fragments – When an address is validated it is also broken up into its component parts. The parts can be used to get around shipping label limitations by piecing back together the address how you need it. For example, an address label could be constructed like the following:

Line 1: FragmentHouse + FragmentPreDir + FragmentStreet + FragmentSuffix + FragmentPostDir

Line 2: FragmentUnit + Fragment + FragmentPMBPrefix + FragmentPMBNumber
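A rough sketch of this reassembly, with an assumed 30-character limit standing in for your shipping company's actual constraint:

```python
# Sketch: rebuilding label lines from address fragments. The field
# names come from the fragment outputs documented above; the
# 30-character limit is an example, not a real carrier rule.

def build_line(parts, limit=30):
    """Join non-empty fragments with spaces, or raise if too long."""
    line = " ".join(p for p in parts if p)
    if len(line) > limit:
        raise ValueError(f"line exceeds {limit} chars: {line!r}")
    return line

rec = {
    "FragmentHouse": "123", "FragmentPreDir": "N",
    "FragmentStreet": "Main", "FragmentSuffix": "St",
    "FragmentPostDir": "", "FragmentUnit": "Apt",
    "FragmentPMBNumber": "4",
}
line1 = build_line([rec["FragmentHouse"], rec["FragmentPreDir"],
                    rec["FragmentStreet"], rec["FragmentSuffix"],
                    rec["FragmentPostDir"]])
line2 = build_line([rec["FragmentUnit"], rec["FragmentPMBNumber"]])
print(line1)  # 123 N Main St
```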


Q: I am getting an error back from the service and need help to interpret it. Also, where is my validated address?

A: Errors – Errors come back as their own object in code or parent object in XML/JSON. Digging into the error object fields will help you understand what went wrong with your call to the service, or why your inputs were not valid. If you receive an error object back from the Address Validation 3 service, you can immediately assume that you will not be receiving a validated address. Errors are broken into four categories:

  • Authorization – This type of error revolves around an issue with your license key. Sometimes the wrong key is being used for the service, or you may have exhausted your purchased transactions. If this happens, please reach out to our customer care team (or call 800.694.6269) to resolve the issue.
  • User Input – These occur when the input data is incorrect. Often, this error happens because a field is missing or a parameter name is misspelled.
  • Service Objects Fatal – Very rarely, if ever, will you see this type of error. Chances are we have already been made aware of the issue from our 24/7/365 monitoring services. However, please let us know if you encounter one by sending an email to support@serviceobjects.com. We support our services 24/7/365, and guarantee 99.999% uptime with our financially backed Master Service Level Agreement.
  • Domain Specific – This indicates that the service ran to completion, but the data was not valid. Each of the individual errors within the domain-specific category will help you to understand what part of the address was deemed invalid.
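As a sketch, branching on the error category might look like the following. The "Type" field name and the category strings mirror the list above, but check the developer guide for the exact error schema:

```python
# Sketch of branching on the error object's category. The "Type"
# field name is an assumption based on the categories listed above.

def handle_error(error):
    """Return a suggested next step for an error object's category."""
    etype = error.get("Type", "")
    if "Authorization" in etype:
        return "check your license key or contact customer care"
    if "User Input" in etype:
        return "fix missing or misspelled input parameters and resubmit"
    if "Fatal" in etype:
        return "fail over to the backup data center and notify support"
    if "Domain Specific" in etype:
        return "inspect the error details for the invalid address element"
    return "unrecognized error category; log for review"
```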

These are just a few tips to help you with your integration of Address Validation 3. If you would like further clarification on any of the fields or have questions about how a service will work to fit your needs, please don’t hesitate to reach out to our support team.


Types of Integrations

Searching for the proper tool to fit your business needs can be a daunting task. At Service Objects, ease of integration is engineered in as part of each of our products, ranging from seamless API interfaces to list processing services that work directly on your data files. This article discusses each of our integration strategies in detail, to simplify your research process and to help pinpoint the type of integration that will best suit your needs.

Service Objects products are created as web services. This means that any programming language that can make a web service request can make use of our services. From programming languages like PHP, Java, C#, Ruby, Python, Cold Fusion and many more, to CRM systems such as Salesforce, Marketo, Hubspot and beyond, nearly all major languages and platforms can make use of Service Objects’ web services.

Below we discuss the most common types of integrations we see from our clients. And if you have a platform that isn’t listed below and would like more information on how it could tie in with our services, please reach out to us – we are happy to provide tips, sample code, and plug-ins, and to recommend best practices and procedures.

API integration

This is our most popular option for real-time validations, and allows our capabilities to be integrated directly into your software. Our services can be called via web requests, either by HTTP GET or SOAP/POST, and the service response can be delivered in XML or JSON format. These protocols and output formats generally allow enough flexibility to meet your needs. We also offer a Web Services Description Language (WSDL) file that can be consumed to auto-generate the necessary methods and classes to call our various web services. If you have a specific language in mind, please check out our Sample Code page – chances are we have sample code already written for your needs.
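As a quick illustration, assembling an HTTP GET request might look like the sketch below. The host and path are placeholders; the real endpoint and parameter list for each service are in its developer guide:

```python
# Sketch of building a GET request URL. The host and path are
# placeholders -- substitute the real service endpoint.
from urllib.parse import urlencode

def build_request_url(address, city, state, license_key):
    """Assemble a GET request URL for a hypothetical JSON endpoint."""
    params = urlencode({
        "Address": address,
        "City": city,
        "State": state,
        "PostalCode": "",          # optional when City and State are given
        "LicenseKey": license_key,
    })
    return "https://service.example.com/AV3/api.svc/GetBestMatchesJson?" + params

url = build_request_url("123 Main Street", "New York", "NY", "YOUR-KEY-HERE")
```

The same parameters can be posted as a SOAP body instead; the GET form is simply the quickest to prototype.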

List Processing

List processing involves sending us a list of your data to be validated. We take this list and process it through the appropriate web service and then return the results, appended to each record in your file. From there you can take the data, apply your business logic, and save it to your database.

This type of process is often the best approach for cleaning up existing data in bulk. For a one-time cleanup, a large export is generally easier than integrating via the API and processing the records yourself. However, depending on the resources you have available, both API and list processing are completely viable options, and we have a number of clients that use both in concert.

We offer two convenient solutions for list processing: single batch runs for one-time processing, or automated batches. Let’s look at the differences between them:

Single batch runs. A single batch run is one of the simplest ways to have your data processed. You send us a comma separated value (CSV) file and we’ll run it against our services, append the data, and return it to you. It is perfect for cleaning up existing data. Many clients run a single one-time batch process to clean up their existing data and then implement a real time solution into their product, giving them the best of both worlds: clean existing data together with a process to ensure that incoming data is the highest quality possible.

Automated List Processing. Your data can be processed securely and automatically by uploading the data file to our secure FTPS server. Once uploaded, our system will recognize the new list to process and get to work. The input file will be parsed, run through a web service, and the results will be appended to the original file. It is nearly identical to the one-time processing service that we offer, with the added benefit that you can upload files at your convenience to be processed automatically.
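As an illustration, the upload step can be scripted with Python's standard library. The host and credentials below are placeholders; use the connection details provided when your automated list processing account is set up:

```python
# Sketch: uploading a batch file over FTPS (explicit TLS). The host
# and credentials are placeholders, not real account details.
import ftplib

def upload_batch(host, user, password, local_path, remote_name):
    """Upload a CSV batch file to an FTPS server."""
    with ftplib.FTP_TLS(host) as ftps:
        ftps.login(user, password)
        ftps.prot_p()                       # encrypt the data channel too
        with open(local_path, "rb") as fh:
            ftps.storbinary("STOR " + remote_name, fh)
```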

CRM integration

If you currently use one of the major customer relationship management (CRM) or marketing automation software platforms like Salesforce, Marketo, Hubspot, or others, chances are that our services integrate with it, and we likely have sample code or plug-ins for them. Each platform has its own level of customizability, but they almost universally offer some variation on a plugin, API, or exposed interface to integrate with. Contact us to learn more about integrating our capabilities with your specific platform.

Whether you develop an API interface for your current software, use batch list processing, or integrate our capabilities with your CRM or marketing automation platform, Service Objects is with you every step of the way with support, sample code, tutorials, and the experience that comes with serving nearly 2500 customers. Get in touch with us today and see how easy it can be to integrate best-in-class data quality with your own application environment.

The Evolution of Service Objects’ Phone Service

Since 2001, Service Objects has been building and providing data validation services for mid-size to enterprise-level businesses. One of our first data quality products focused on phone numbers and the services we could supply around them, including validation, look-up and reverse look-up services.  Over these 17 years, our data quality products have evolved considerably.  Whether in response to changes in technology, legislation, internal initiative or (especially) customer needs, our products are much better for it.

We thought it would be fun to take a quick walk down memory lane and see where we started and ultimately how these services have evolved.

Early years…

We got our start with compiled phone data sets.  These lists were compiled by buying and aggregating lists of phone data from third parties. These data sets allowed us to translate phone numbers into exchange info, and return information on the provider and consumer, including provider or contact name, city, state, zip, and line type.  This aggregated list data worked, but we found that it quickly degraded and grew stale.

Once we saw the issues with this static, compiled list approach, we moved into scraping various sites and aggregating fresher data. The scraping helped to supplement the ever-aging compiled data sets. This newer approach provided fresher phone data than straight compiled data but was still prone to errors.

The Landline era…

Having had success early on, we were able to invest in relationships with Telecoms and companies that work with Telecoms. This provided us with accurate and up-to-date information surrounding landlines.  At this point, landline data was common, and most businesses could be counted on to be listed with landlines.  On top of that, a large portion of residences still owned landlines.  Having data for these two markets allowed us to cover a large majority of the Telecom space. With only landlines to consider, it was simpler to provide a solid landline-only service, which we called Geophone.

The move to Mobile and VOIP…

With the widespread adoption of cell phones, there has been (and continues today) a significant shift away from landlines to cell phones for residences. The same can be seen with businesses and their adoption of voice-over-IP (VOIP). In both cases, data for these numbers has been more challenging to attain. Early on, mobile device information was very hard to come by, and we were required to take a step back data-quality-wise to precompiled sources for some of the alternative phone types.  It was a challenging time for us to provide the most accurate and up-to-date contacts around these types of phone numbers. We did our best, and with the addition of these new phone types, our Geophone Plus service (GPPL) was born, built as an extension of the functionality of the original Geophone web service.

Ported numbers…

More recently, porting has also become more common.  Porting telephone numbers simply means maintaining your number across carriers and line types. Mobile carrier to mobile carrier is common, but in some cases numbers are even ported from landlines to wireless carriers. During this era, the Telephone Consumer Protection Act (TCPA) came into play. The threat of harsh TCPA penalties for calling mobile numbers created the need to know, with confidence, whom companies were calling and on what phone type. From this need, our Geophone Plus 2 (GPPL2) was born. The data, like carrier info, phone type, subscriber name and ported date, made the service highly desirable, not only for companies looking to adhere to TCPA regulations, but also for businesses looking to gain competitive advantages in their respective industries.

New challenges…

As technologies progressed, new forms of phone types emerged. Identifying Google numbers, Skype numbers and other portable VOIP providers became more important. These new technologies brought with them new potential for fraud. Users could easily sign up for VOIP numbers, use them maliciously or fraudulently, and then abandon them. In addition, the identification of fax numbers, robo-callers and prepaid phones has become more important. To address this growing need, we upgraded our Phone Exchange product to Phone Exchange 2 (PE2).  It was built upon the existing functionality of our Phone Exchange service and extends its capabilities to address these new Telecom technologies, as well as detecting and validating international numbers.

Going forward…

Every day, we strive to add new data sources to enhance our phone and contact data sets and help identify other potential fraud sources. Our commitment is to keep providing these excellent services and features while accurately appending new data points that link names and addresses to their phone numbers. Some of the extended features are currently being worked into the Geophone Plus 3 web service that we will be launching in early spring. This includes features such as demographics information about locations and people, emails, and business information.

In addition, we will continue to respond to our customers’ needs, advances in phone technology and new legislation as it comes down (like the EU’s upcoming GDPR legislation) to help further improve our products.

The Struggles with Deprecated Services

The word “deprecated” is thrown around frequently in the software development world. It is used to indicate a product or service that is either not going to continue being maintained or is going to be sunsetted. Oftentimes, when companies roll out a new product or API, they decide to give their users a heads-up that the older operations are going to be deprecated. This prompts the users to update to the latest version to take advantage of the latest and greatest features that the company is offering.

Marking a service as deprecated is a warning to the users of the product or service that it will no longer be supported, and that it is highly recommended to upgrade to a newer, supported service.  Here at Service Objects, we don’t particularly like the practice of deprecating services.  Although we don’t rule it out completely, our mission is to maintain support for our legacy services. This is because we understand that it takes time, resources, and money to integrate with APIs. The time it takes for developers to integrate, test, and deploy new code inevitably costs money. To keep legacy services from falling behind, we keep our core code separate from the individual service outputs. A fixed set of output fields gives our clients peace of mind that the service they have invested their time and resources into won’t change beneath their feet.

A clear picture of this concept can be seen in our DOTS Address Validation services. We have DOTS Address Validation 1, 2 and 3. The third iteration is currently our primary and most robust address validation service yet. It has the latest and greatest in terms of available output fields. Even though Address Validation 3 is the latest version of our address services, both DOTS Address Validation 1 and 2 are actively supported.

The reason we are able to maintain these is that they share a core address validation code set, which is continuously refined to return the most accurate and up-to-date data available.

By choosing our services, you can rest assured that the service you integrate will not be put out to pasture in the future, and that we will continue to push to provide you with the best data, regardless of which version of the service you are using.

We invite you to get started testing any of our 23 data quality services today.