
Spring Cleaning for Your Data

Cleaning up has always been a virtue. Thanks to Netflix and bestselling Japanese organizing expert Marie Kondo, who preaches a mindful approach to better living through tidying up, it has now also become a major viral trend. Today, I would like to help you explore another path to inner joy and peace in your business: cleaning up your contact data.

You see, your contact data assets are a bit like most people’s closets: they start off being functional, but without the right kind of effort, they decompose into clutter over time. (Over 70% of this data changes every year as people move, change jobs, their companies merge, and more.) Unfortunately, this clutter can cost you – in time, wasted marketing efforts, or even severe compliance penalties for unwanted marketing contacts. So here is a three-step process that will help ensure that your contact data is always genuine, accurate, and up-to-date.

First: Getting it Right at the Time of Data Acquisition

What is one of the more common sources of contact data error? Acquiring it in the first place. Most organizations have multiple touch points where contact data enters their system: web pages, inbound customer inquiries, lead processing, and more. Customers fat-finger their addresses or contact information, data entry team members are human and make mistakes, and sometimes fraud or fakery is even involved.

One productive solution is to use API-based tools that plug in to your sales, marketing or CRM platforms to seamlessly help ensure contact data quality on the front end. For example, address validation services check inbound address data against USPS, international and other databases to verify addresses and correct them if needed. For other forms of contact data, phone validation can identify a number’s owner and validity, as well as carrier, line type and geolocation data, while email validation verifies email validity and corrects things like common typos.
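As a sketch of the point-of-entry pattern, the snippet below wires a validation step into a form handler. The handler, field names, and the stand-in validator are all illustrative, not a vendor’s documented API; a real integration would call your validation service over HTTPS with your license key.

```python
def handle_signup(form, validate):
    """Run a point-of-entry check on a new contact record.

    `validate` is any callable that returns a dict with a corrected
    value and an is_valid flag -- e.g. a thin wrapper around an email-
    or address-validation API. (The shape shown here is illustrative.)
    """
    result = validate(form["email"])
    if not result["is_valid"]:
        raise ValueError("Rejected at point of entry: " + form["email"])
    # Store the corrected, validated value rather than the raw input.
    form["email"] = result["corrected"]
    return form

# Stand-in for a real validation API call: fixes one common typo.
def fake_email_validator(email):
    corrected = email.replace("@gamil.com", "@gmail.com")
    return {"corrected": corrected, "is_valid": "@" in corrected}

record = handle_signup({"email": "jane@gamil.com"}, fake_email_validator)
print(record["email"])  # the stored value is the corrected address
```

The key design point is that the corrected value, not the raw user input, is what lands in your database.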

Other API tools bundle advanced services such as validating the quality of incoming leads, appending contact phone numbers, or linking your contacts to demographics data for business analysis or compliance purposes. Whatever level you need, implementing API-based services like these in your business automation platforms helps ensure getting the right contact data every time, at the point of entry.

Second: Cleaning Your Database

Congratulations – you’ve now entered accurate, validated contact data. Which leads to the next issue: practically the minute you get up to grab some coffee, your contact data assets are starting to decay. So periodically, it makes sense to bring this data back in line with reality, to maintain its usefulness for functions like market analysis, business intelligence, campaign planning and more.

In situations like these, batch or list processing services often represent a convenient way to clean an entire contact database at once. Our own batch services can process an entire list or database with little or no programming required. Tools like these are often a smart and simple way to make good data hygiene part of your regular routine.
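In code terms, batch processing is simply the same validation step applied across an entire list at once. Here is a minimal sketch; the stand-in validator stands in for a real batch service, and the field names are illustrative.

```python
def clean_list(records, validate):
    """Apply a validation function to every record in a contact list,
    separating clean rows from rows that need review."""
    clean, review = [], []
    for rec in records:
        result = validate(rec)
        (clean if result["is_valid"] else review).append(result["record"])
    return clean, review

# Stand-in validator: trims whitespace and requires a non-empty phone.
def fake_validator(rec):
    rec = {k: v.strip() for k, v in rec.items()}
    return {"record": rec, "is_valid": bool(rec["phone"])}

contacts = [
    {"name": "Ann", "phone": " 805-555-0100 "},
    {"name": "Bob", "phone": ""},
]
clean, review = clean_list(contacts, fake_validator)
print(len(clean), len(review))  # → 1 1
```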

Third: Lather, Rinse, Repeat

How often should you clean up your data? Repeat after me: every time you use it. Here’s why: accurate contact data may be important for things like market planning and business analytics, but it is absolutely critical when you actually get in touch with people on your lists. Direct mail campaigns have human and material costs tied to bad address data, outbound telemarketing to changed numbers risks severe penalties under the Telephone Consumer Protection Act (TCPA), and unwanted email contact can get you blacklisted.

The same tools you use to validate and clean your data are at your service here, each and every time you run a campaign or contact your customers. But here, the solution is as much organizational as technical: make sure someone is “on first” for ensuring your ongoing data quality and data governance.

Questions? We Can Help!

When it comes to cleaning your data, we actually do one thing much better than Marie Kondo: her bestselling book sprang from an infamous months-long waiting list for her organizing services, but our knowledgeable team of data experts will return your call in just 90 minutes or less! So whether it’s questions on international data, API interfaces, or simply discussing what strategy works best for you, contact us anytime and let us help you discover the life-changing magic of tidying up your data.


Dealing with Difficult Countries in Lead Validation – International

What is DOTS Lead Validation – International?

For over a year now, Service Objects has been validating leads around the world with DOTS Lead Validation – International. This service looks at six key components – name, business, email, IP address, mailing address and phone – testing, analyzing and cross-comparing each to build a certainty score of 0-100 indicating the reliability of the given data.

Lead Validation – International can be used for any type of data quality need, be it validating data in an online web form or shopping cart, cleaning up a CRM or other database, passing leads through a marketing automation platform or just running against a list. It can be integrated with any major programming language, and is compatible with all of the major CRM and automation systems.

Lead Validation or Lead Validation – International?

Lead Validation – International is the global successor to our US and Canada based service, DOTS Lead Validation. Both services generate a lead quality score using our strongest data sets and algorithms to validate and cross-compare key contact data components – name, business, email, IP address, mailing address and phone. Lead Validation – International was built from the ground up with a new interface, tests and datasets. Once this new service was up and running, we circled back and upgraded the US and Canada service to take advantage of this new interface and engine.

For the purposes of validating US and Canadian leads, Lead Validation and Lead Validation – International will return the exact same results. Choosing between the domestic or international service really comes down to your needs and the mix of international leads in your data set. Since the results for domestic leads will be the same, we generally recommend using Lead Validation – International for most lead validation where any international leads may be captured. To make it easier, the new interface is shared between both services so switching from one to the other requires only a minor update to the request URL and a new license key for the other service.

What’s new in Lead Validation – International?

The power of Lead Validation – International continues to grow as we refine our algorithms and add new international datasets. Processing international data is still one of the most challenging things we do at Service Objects, due to the logistics of getting and working with international data sources. Our latest Lead Validation – International update includes more attention to identifying questionable phone numbers, weeding out reserved phone numbers country-by-country, and looking for combinations of phone numbers that are likely to be invalid.

Additional tests have been included in the logic that cross-compares the input and component country information. These new tests give us better resolution in validating the consistency of lead data by letting us check it from a few new directions. Above all, the main reason for this latest update was to upgrade how we handle problematic countries.

How does Lead Validation – International deal with problematic countries?

As mentioned, the primary motivation for the recent Lead Validation – International update was to help one of our clients deal with a large uptick in potentially fraudulent leads from questionable countries that traditionally appear at the top of generic, drop-down country lists, such as Antarctica, the British Indian Ocean Territory and the Åland Islands. The first two have almost no population, while the last has only a small one.

Prior to this update, we treated these as unsupported countries. Since leads selecting them were unlikely to be genuine and deserved to be flagged for further scrutiny, the service would return a quick error and not charge the user for the request.

A deeper dive into example data surrounding these leads suggested that quite a lot of fraud takes place using these countries. It seems likely that many leads selecting these questionable countries are coming from automated sources, like bots. Often these leads have additional data points that do not match up well or are clearly fraudulent. Our service is well suited to identify these cases.

What’s next for Lead Validation – International?

Our engineers are constantly refining all of our services, including Lead Validation and Lead Validation – International. The current solutions are great tools, and we continue to improve how they handle problematic countries and whatever challenge comes next. We are constantly evaluating and adding more accurate and reliable data while improving and refining our algorithms to detect questionable leads, businesses, and suspicious input in general.

These are just a few of the things we will continue to explore to help Lead Validation and Lead Validation – International continue to meet your needs. If you have feedback or questions regarding our Lead Validation services, get in touch with our team.


Why Should a Business Validate Leads?

Your business is measuring how many leads you generate with your marketing efforts. You’re even scoring your leads’ likelihood to buy with a marketing automation tool or CRM. But can your marketing and sales tools tell you if your contacts are legitimate prospects? Lead validation fills the gap between marketing automation and sales efforts, creating more visibility in marketing and more efficiencies in sales.

Validating leads can help any B2B or B2C business improve their marketing and sales tactics, but it can be especially transformational for international companies. While all businesses face the issues of incomplete, incorrect, or spam form submissions, there are unique challenges to doing business in the global market. Differences in country address formats and language characters can muddy your data. What’s more, automated validation can provide a solution to international compliance issues.

What is lead validation?

According to Michael Brenner, the CEO of Marketing Insider Group, lead validation is critical to your marketing – without it, you run the risk of grossly overestimating the ROI of your marketing campaigns. We agree wholeheartedly.

Lead validation is the process of verifying and scoring the quality of the leads you generate. It’s risky to assume that each phone call to your sales team or form completion on your website indicates a strong lead. A recent study by marketing services firm Straight North, which analyzed over 350,000 marketing inquiries, found that barely half of them were legitimate sales leads. Manually sorting through past inquiries could be considered a form of lead validation, but in practice it’s inefficient. Moreover, you may not recognize typos or spam entries by sight alone.

By implementing a lead validation service, you can use automation to cross-check new lead information and push your most promising leads to the top of your sales team’s to-do list. Our DOTS Lead Validation – International service supports more than 250 countries with the most accurate, current data available and integrates with leading CRM and marketing tools like Salesforce, Marketo, and HubSpot.

How does lead validation work?

Simply put, the contact input you collect from your leads is verified as real, then compared to existing data sources to correct and append the information where needed.

How does our Lead Validation – International service work? It verifies and cross-validates contact data, including a prospect’s name, address, phone, email, and IP address against hundreds of authoritative data sources. Following verification and correction, it provides an overall quality score of 0-100 for the lead itself, and a specific quality score to each data point. Your business logic determines the parameters of how the output works with your processes. Our notes even indicate if a contact is covered by the General Data Protection Regulation (GDPR), so you can be sure you’re in compliance.
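In practice, your business logic might route each lead on those scores. The response shape and thresholds below are simplified stand-ins, not the service’s actual output schema; they are just meant to show the pattern.

```python
def route_lead(response, accept_at=80, review_at=50):
    """Route a lead based on its overall quality score (0-100).

    Thresholds are illustrative -- your business logic sets them.
    """
    score = response["OverallCertainty"]
    if score >= accept_at:
        return "send to sales"
    if score >= review_at:
        return "manual review"
    return "reject"

# Simplified stand-in for a validation response (field names illustrative).
response = {
    "OverallCertainty": 87,
    "Components": {"Name": 95, "Email": 90, "Phone": 75},
}
print(route_lead(response))  # → send to sales
```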

Why is lead validation important?

Almost every department in your organization benefits from lead validation. Here are just a few of the ways that lead validation can directly strengthen your business practices:

  • Marketing – Gain a better understanding of effective (and ineffective) marketing practices, pivot campaigns quickly, and create powerful and personal marketing automation. Scores can be connected to campaigns for more accurate lead gen reporting.
  • Sales – Spend more time and energy focusing on leads that score the highest, gain insights to leverage during sales calls, and deploy effective sales drip automation.
  • Business Development – Collect clean, accurate data to make projections and plan growth.
  • Compliance – Adhere to regulatory guidelines, such as GDPR.

Lead Validation – International can be deployed in four ways

  • Real-time API – Check and score data at the point of entry.
  • Cloud Connectors – Connect with major marketing, sales, and ecommerce platforms.
  • List Processing – Batch services are available to clean up existing databases.
  • Quick Lookups – Spot-check leads that come in through nontraditional channels.

The bottom line: validating your leads ensures your contact records are as genuine, accurate, and up-to-date as possible, and helps your team identify contacts that fall under data privacy regulations. If your business is spending time and money on global marketing and sales efforts, lead validation should be the next tool you add to your toolkit. Get your free API trial key and see how Lead Validation – International provides powerful insights to help your business grow.


TCPA and You: A Look Ahead for 2019

If you do outbound marketing via telecommunications, the Telephone Consumer Protection Act (TCPA) has probably been part of your business agenda – particularly in recent years, as stiffer interpretations of these consumer privacy laws have led to multi-million-dollar judgments against major corporations and others. So what lies ahead for businesses in the next 12 months in terms of TCPA?

The short answer is twofold: in an era of increasing consumer privacy, TCPA isn’t going away any time soon – but as efforts to ease its impact on businesses are making their way through the courts, there is hope for less risk and more leeway in meeting the requirements of TCPA. We will continue to monitor these developments closely, as a key provider of tools for TCPA compliance, but here is a summary of what we are seeing so far.

Three key issues: equipment, consent, and third parties

In a recent video interview with text messaging vendor Tatango, TCPA attorney Ernesto Mendieta highlighted three key issues that are currently the subject of court cases:

  • the definition of automated telephone dialing systems (ATDS)
  • revocation of consent
  • the definition of co-parties.

The ATDS issue is particularly important for many businesses. TCPA prohibits unsolicited calls made via automated dialing equipment; however, a much broader definition of ATDS introduced in 2015 included equipment that could store and dial numbers without human intervention, even if these capabilities were not used. This expanded definition was struck down by an appeals court in 2018, with new FCC guidelines expected in 2019. According to law firm Eversheds Sutherland LLP, businesses are hopeful that these new guidelines will provide a much clearer standard for these devices.

Another key issue for TCPA litigation revolves around whether consumers can revoke consent for contact via ATDS if they have previously agreed to such contact under the terms of a contract. Recent legal cases have tended to rule in favor of businesses, deciding that such contracts override a consumer’s right to revoke this permission; however, case law is not unanimous, and further cases are expected to shed more light on this issue in 2019.

Finally, recent court decisions such as this one involving Taco Bell point to clearer boundaries about whether businesses are liable for TCPA violations on the part of third parties promoting their products or services. Here as well, case law is expected to evolve further in 2019.

In general, many of these legal efforts spring from a backlash from businesses affected by recent stiffer interpretations of TCPA, and its fallout in terms of penalties. For example, the National Association of Federally Insured Credit Unions (NAFCU) is publicly urging the FCC to reform TCPA to “separate bad actors who are harassing consumers with unwanted and potentially harmful robocalls from good actors like credit unions contacting their members with valuable information on their existing accounts.”

A new safe harbor for changed numbers

One other major change on tap for 2019 is a new way that businesses can protect themselves against inadvertent marketing calls or texts to numbers that have changed hands. In December 2018 the FCC issued an order calling for the creation of a national database of reassigned phone numbers, for the purpose of reducing unwanted contacts to consumers with these numbers. To encourage its use by businesses, this ruling also includes a TCPA safe-harbor provision for calls to reassigned numbers when the most recent version of this database is checked first.

It is important to note that this new database will not remove the need for good contact data hygiene, particularly the verification of contact phone numbers. Contacting a bad or mistyped number can still open businesses to liability. Given the high percentage of numbers that do change each year, it is still important for the sake of data integrity to verify contact numbers both at intake and before a campaign, using tools such as Service Objects’ DOTS GeoPhone Plus 2 service. However, this new database can help mitigate what has often been a common source of liability.
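The workflow is straightforward to sketch: before each campaign, screen the call list against the most recent snapshot of the reassigned-numbers database, represented here by a simple set. This is an illustration of the screening step only, not the FCC database’s actual interface.

```python
def screen_call_list(numbers, reassigned_db):
    """Split a call list into numbers still safe to dial and numbers
    that have been reassigned since consent was obtained."""
    safe = [n for n in numbers if n not in reassigned_db]
    flagged = [n for n in numbers if n in reassigned_db]
    return safe, flagged

# Stand-in for the most recent snapshot of the reassigned-numbers database.
reassigned = {"+18055550123"}

safe, flagged = screen_call_list(["+18055550123", "+18055550199"], reassigned)
print(safe, flagged)
```

Checking against the freshest available snapshot before dialing is what the safe-harbor provision hinges on.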

Summing it all up

The essential purpose of TCPA remains unchanged: businesses still can’t spam consumers via automated telecommunications, particularly wireless devices, without their explicit permission. But there is hope for well-intentioned businesses in 2019, with prospects ranging from clearer legal requirements to better tools and safe harbor provisions for inadvertent marketing contact.

Here at Service Objects, we will continue to keep abreast of how TCPA and its enforcement continues to evolve in 2019. In the meantime, we are always happy to consult with your business to help you find cost-effective solutions for TCPA compliance – contact us anytime.

Woman Checking Email on Laptop

Email Validation Terms You Should Know

No matter the industry, you likely have jargon or terminology that’s specific to the work you do. The email world has a nomenclature all of its own, too. Understanding the lingo can keep you on the right side of the email servers.

Here are some key terms that can help you make the most of your email marketing efforts:

EMAIL SERVICE PROVIDER (ESP)

These are firms that provide email services, ranging from major providers such as Google’s Gmail or Microsoft’s Outlook to more specific offerings. The major firms often combine free offerings for consumers with expanded paid services for business.

DOMAIN NAME SYSTEM (DNS)

An Internet protocol that links a domain name with resources such as IP addresses, mail exchangers, and name servers. In short, it stores where valid domains live on the Internet.

MAIL EXCHANGERS (MX)

Also known as message transfer agents (MTAs), these are the computers and software that execute the protocols for sending and receiving a domain’s email messages and their attachments.

WHITELIST

A list of email addresses that are automatically approved for delivery, used to make sure that emails from familiar sources get through.

GREYLIST

A technique used to temporarily reject new or unfamiliar email, often by rejecting it at first with a “soft bounce,” and then accepting subsequent delivery attempts. It works because legitimate email servers will normally attempt redelivery, while spammers generally won’t.
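A minimal sketch of that logic: the first delivery attempt from an unknown sender gets a temporary rejection, and a retry that arrives after the waiting period is accepted. (This is a toy model of the idea, not production mail-server code.)

```python
import time

class Greylist:
    """Toy greylist: temporarily reject first contact, then accept
    retries that arrive after `delay` seconds."""

    def __init__(self, delay=300.0):
        self.delay = delay       # real servers typically wait minutes
        self.first_seen = {}

    def accept(self, sender, now=None):
        now = time.time() if now is None else now
        if sender not in self.first_seen:
            self.first_seen[sender] = now
            return False         # soft bounce: "try again later"
        return now - self.first_seen[sender] >= self.delay

g = Greylist(delay=300)                      # 5-minute waiting period
print(g.accept("a@example.com", now=0))      # False: first attempt
print(g.accept("a@example.com", now=60))     # False: retried too soon
print(g.accept("a@example.com", now=400))    # True: legitimate retry
```

Spam bots that fire once and move on never make it past the first soft bounce.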

BLACKLIST

A list of email addresses tagged to be rejected by a mail server, normally because they are suspected of being sources of spam. Legitimate senders can get blacklisted, however, if they aren’t careful: see the definition of honeypots and spamtraps.

HONEYPOTS AND SPAMTRAPS

Unpublished email addresses used to trap spammers, particularly those who “scrape” addresses using webcrawlers. Purchased email lists from the wrong sources may contain such addresses, which in turn can get you banned from sending to their domains.

ROLE ADDRESSES

These are email addresses that belong to a job function, rather than an individual: for example, support@mycompany.com. Role addresses are a potential minefield for your contact database, because emails sent to addresses shared by multiple people can easily be flagged as spam.

DISPOSABLE EMAILS

These are email addresses that are generated for unique uses such as signing up for lists, or may expire after a period of time. The good news? They protect users from exposing their primary email addresses to spammers. The bad news? They are often used by Internet trolls, or people who want to sign up for your marketing goodie without being on your mailing list.

ADVANCED EMAIL ADDRESS VALIDATION

An advanced email validation and verification service, such as Service Objects’ DOTS Email Validation, validates email syntax to confirm there is a box name, an @-symbol, and a domain. Additionally, it flags improbable addresses, such as vulgar or famous names. Sophisticated algorithms check for the existence of the SMTP server and a working mailbox, that the mail exchange record is valid and accepting mail, and that domain-specific mailbox rules are met.
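As a rough illustration of the syntax layer alone (a real service goes much further, down to live SMTP and mailbox checks), a first-pass check might look like the sketch below. The pattern and typo table are deliberately simplified assumptions, not the service’s actual rules.

```python
import re

# Simplified pattern: a box name, one @-symbol, and a dotted domain.
# Real-world email syntax (RFC 5321/5322) is far more permissive.
SYNTAX = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

# Illustrative typo table; a real service draws on large curated datasets.
COMMON_TYPOS = {"gamil.com": "gmail.com", "yahooo.com": "yahoo.com"}

def check_syntax(email):
    """Return (is_plausible, corrected_email) for a first-pass check."""
    if "@" not in email:
        return False, email
    box, _, domain = email.partition("@")
    domain = COMMON_TYPOS.get(domain.lower(), domain)
    corrected = box + "@" + domain
    return bool(SYNTAX.match(corrected)), corrected

print(check_syntax("jane.doe@gamil.com"))   # → (True, 'jane.doe@gmail.com')
print(check_syntax("not-an-email"))         # → (False, 'not-an-email')
```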

Why Email Validation Is Important

Email validation is a simple, painless, easy-to-implement capability that improves the quality and functionality of your email lists. By taking advantage of our validation infrastructure, as well as the live experience that goes into our extensive databases of validation criteria, you gain the following benefits:

  • Substantially improving the ROI of your email marketing
  • Reducing email bounce rates by up to 90% and maximizing deliverability
  • Protecting your business from being blacklisted by valuable or important contact domains, by warning against potential spam traps and honeypots
  • Combatting fraud by providing important MX-specific flags including catch-all, wireless, free, disposable, alias, domain quality and more
  • Helping you document compliance with government regulations such as GDPR, CAN-SPAM and others, by ensuring that you have correct contact information
  • Maintaining a good email “sender reputation” that enhances your contact effectiveness
  • Improving customer service through better contact rates

To learn more about how email validation can help you avoid the dreaded blacklist, download our free whitepaper, The ROI of Real-Time Email Validation. The strategies presented will not only improve your response rates and effectiveness, they will help protect your organization from a host of issues including fraud, blacklisting and regulatory concerns.


Choosing a Web API: REST, Remote Procedure Call or Hybrid

REST is a popular architectural style that has been the go-to design for web-based APIs for years. The style has defining constraints, but it is not a protocol and there is no official standard. REST is like the pirate’s code in the Pirates of the Caribbean movie franchise, in that REST “is more what you would call guidelines than actual rules.” As such, not everyone follows the guidelines exactly or interprets them the same way.

This can lead to some confusion and debate as to what is and is not considered RESTful. At the same time, one can look at this flexibility as an opportunity to pick and choose what will work best for their needs. In this article, we briefly discuss some web API design options and explore some of their strengths and shortcomings.

What REST Is – And Isn’t

For some, REST simply means using a service that is not SOAP. SOAP is another widely used protocol, but some stay away from it due to its complexity and the extra overhead that it requires. You would be hard pressed to find a REST-related article where SOAP is not mentioned. You will likely find comments about how great REST is because REST uses JSON and SOAP uses XML.

People often mistakenly think that REST is defined by the JSON data format, and when they say they want a RESTful service, what they really mean is that they want a service that supports JSON. REST is not as simple as that, and it is not confined to any one data format: it is an architectural style and not a specific protocol. Conversely, JSON is not exclusive to REST, as JSON is also available in Remote Procedure Call (RPC) APIs.

So, with that out of the way, the rest of this article will assume that the reader is familiar with the general style of REST. If you wish to learn about REST in more detail then feel free to read the dissertation it was based on – or if you’re not into heavy reading, check out the REST wiki.

Comparing Three Web API Options

Now, let’s look at three different approaches you can use for web APIs:

  • REST – Representational State Transfer (REST) web APIs are “resource” oriented. RESTful APIs synergize well with CRUD applications and other forms of data resource management and manipulation.
  • RPC – Remote Procedure Call (RPC) web APIs are “action” oriented. They enable the user to call a remote method to complete an action as though it were local. This is similar to importing and using the methods from a local library API.
  • Hybrid – Hybrid designed web APIs incorporate both the RESTful style and RPC-based design methods. Some RESTful APIs will include RPC design to add action methods where strictly RESTful resource-oriented methods are not appropriate.

Since REST is resource oriented, it is also useful for resource discovery. For example, if a Service Objects client were interested in all the cities or ZIP codes for a state, then a RESTful API for discovering those resources and drilling into them for details would be ideal. This approach also works well for suggestion and IntelliSense applications where several queries to find an address would be acceptable.

By comparison, users looking for address validation could find that having to make several queries would be cumbersome – especially if their data is incomplete or partially invalid. Most of our users do not always have complete data, and lean on us heavily to parse through and correct any garbage or mistakes to fill in the gaps and return an intuitive and standardized response.

RESTful Example: Address Validation

Let’s briefly explore what creating a RESTful address validation service would look like. We’ll start by coming up with a way for the user to submit an address for validation.

For this exercise, the theoretical service will be hosted on the following domain and path.

Domain: example.serviceobjects.com

Path Format: /AddressValidation/{State}/{City}/{ZIP}/{AddressLine1}/{AddressLine2}

The above format lends itself more to address discovery and would be better suited with a /FindAddress path name; however, this application’s primary job is validation and not discovery so let’s continue.

Let’s test the service using the Service Objects office address as an example.

Path Example: /AddressValidation/CA/Santa Barbara/93101/27 E. Cota St #500/

Unfortunately, the above example will fail to validate correctly due to the pound/hashtag (#) character. The # symbol is a special URL character used as a fragment identifier, commonly referred to as an anchor in HTML. Its inclusion means that the URL path is prematurely terminated at the # symbol.
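Python’s standard URL parser shows the truncation directly; everything after the # is treated as a fragment rather than part of the path:

```python
from urllib.parse import urlsplit

url = ("https://example.serviceobjects.com"
       "/AddressValidation/CA/Santa Barbara/93101/27 E. Cota St #500/")
parts = urlsplit(url)
print(parts.path)      # the path stops at the '#': the suite number is gone
print(parts.fragment)  # → 500/
```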

In this case, the API can respond in a few ways. It can return an HTTP status code of 400 (Bad Request), but that isn’t too helpful to the user. It can also accept the request, but since the service is unable to see anything past the # symbol, the full address will not be validated. This is problematic not only because the full address is not being validated, but because it raises the question of what an acceptable response would be. Validating the address as-is would be misleading to the user, and the service itself is unaware that parts of the address are missing. In the end, for this example, it would be best for the service to return the 400 status code for safety.

Now let’s try a different path format:

Path Format: /AddressValidation/{State}/{City}/{ZIP}/{AddressLine1}/{UnitNumber}/{AddressLine2}

Path Example: /AddressValidation/CA/Santa Barbara/93101/27 E. Cota St/500/

The above path format and example work, but it requires the user to parse out the apartment number from the address beforehand, and it is still susceptible to malformed requests due to special URL characters.

As we have just demonstrated, the path parameter approach has some shortcomings. More details and examples are available in a past tutorial. Fortunately, REST is not limited to one way of making requests; however, depending on one’s interpretation this is where this service and many others start to deviate from RESTful guidelines.

Foregoing the path parameter approach entirely, we could use query string parameters or POST requests.

Query String Example: /AddressValidation?AddressLine1=27 E. Cota St #500&AddressLine2=&City=Santa Barbara&State=CA&PostalCode=93101

In the above example, the query string parameters can be URL encoded, and there is no fear that the full address will not be passed in. There is also the option to use POST, but using POST instead of GET to read a resource is generally not done as it goes against REST guidelines. However, it is not uncommon to see many services use POST in ways outside of the REST guidelines.
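Standard URL encoding makes this easy to see: once the parameters are encoded, the # survives as %23 and nothing is lost in transit.

```python
from urllib.parse import urlencode

params = {
    "AddressLine1": "27 E. Cota St #500",
    "AddressLine2": "",
    "City": "Santa Barbara",
    "State": "CA",
    "PostalCode": "93101",
}
qs = urlencode(params)   # '#' -> %23, spaces -> '+'
print(qs)
```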

For presentation’s sake, let’s continue with our example using POST and JSON.

JSON POST Request: 
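A representative request body, with field names that are illustrative rather than the service’s documented schema, might look like:

```json
{
  "AddressLine1": "27 E. Cota St #500",
  "AddressLine2": "",
  "City": "Santa Barbara",
  "State": "CA",
  "PostalCode": "93101"
}
```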

A strictly RESTful response would be as simple as an HTTP 200 status code with a response like this:
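For instance, a bare-bones success body (again illustrative) could be as sparse as:

```json
{
  "status": "Valid"
}
```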

If we provided an invalid address, for example by changing the suite number from 500 to 5,000 which does not exist, the RESTful response would then potentially return an HTTP status code of 404 with an output of:
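An equally minimal failure body (illustrative) might read:

```json
{
  "status": "Address Not Found"
}
```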

The above output examples are too simplistic and not informative enough. The service could return a status description of “Address Not Found”, but the user is still left with very little information to work with. Our users rely on our services to not just validate their data but to also help inform them on how to proceed with using their data.

Following the RESTful style guidelines, the output could also contain an Address ID that can be used to reference the address and list details for it.

This RESTful style is not quite user friendly in that it requires the user to perform two queries: one to verify the address, and another to retrieve details for it. A big part of the problem stems from REST being “resource” oriented whereas Validation and Verification services are “action” oriented; therefore, lending themselves more towards the RPC-based design.

That’s not to say that the RESTful service cannot be made more user friendly. A hybrid approach can be taken, and the service could be expanded to be more informative by adding parameters to retrieve additional data so that only one request query is needed.

Output:
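The expanded output screenshot is not reproduced here; an illustrative shape for this more informative single-request response (all field names and corrections below are assumptions) might be:

```json
{
  "Valid": true,
  "Address": {
    "AddressLine1": "27 E Cota St Ste 500",
    "City": "Santa Barbara",
    "State": "CA",
    "PostalCode": "93101"
  },
  "Notes": "Address was standardized and corrected."
}
```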

However, using an RPC-based design and forcing it to try and conform to the RESTful style doesn’t necessarily lead to a better user experience.

Why Flexibility Is Important

REST, RPC and hybrid designs each have their own merits and uses. It’s important to choose a design that works best for the given situation. That’s why Service Objects chooses a hybrid approach for our services that leans more toward the RPC-based design, since it suits the action-oriented usage. With validation services there are often many gray areas and edge cases to deal with that work best with custom action-based verb commands.

In a RESTful style, custom verb commands are to be avoided, and some would argue that any use of them would, strictly speaking, not be REST. However, many services make use of both in a hybrid approach, and even Google's API Design Guide states that "both RPC APIs and HTTP REST APIs are needed for various reasons, and ideally, an API platform should provide best support for all types of APIs." So, whether you want to use HTTP GET, POST or SOAP with XML or JSON, the Service Objects DOTS Web Services have you covered.


Why Data Quality is Important for Data Migration

Change is a constant in life. And when it comes to your data, even more so. Given enough time, migrating your data to new systems and platforms will be a fact of life for most businesses. Whether it involves a corporate merger, a new application vendor, or other reasons, data migration is one of those predictable “stress points” that can put your contact data assets at risk without the right strategy.

According to a recent post by Dylan Jones on Data Quality Pro, data quality issues are one of the key reasons for the high failure rate of data migration projects. He cites a recent survey showing that 84% of these projects run over time and/or budget – and in his view, an important part of avoiding this involves advance planning and modeling.

Data quality best practices for contact data migration

From our perspective, there are at least four best practices you should consider for preserving contact data quality during a data migration process:

Measure twice, cut once. Jones describes the use of what he calls landscape analysis, or a “migration simulation” in plain English, to anticipate problems before the migration begins in earnest. This involves testing a subset of your database against planned conversion rules and protocols, to help ensure that the results are likely to go as planned.

Validate addresses before conversion. The term “garbage in, garbage out” applies here, with clean data being an important factor on the front end of the migration process.

Validate addresses after conversion. What is the one thing that is worse than bad data? Reformatting data to make it even worse. Bad things can happen when old data goes into new fields, and if John Smith at 123 Mayberry Street becomes “John” at “Smith 123 Mayberry” in the new database, his value as a contact can go completely out the window. Never blindly trust converted data without a validation and review step to flag bad contact data in real time.

It ain’t over until it’s over. Yogi Berra’s famous baseball saying applies equally to data migration, because you aren’t finished with data quality when the initial conversion is done. Contact information is a perishable resource that goes stale over time, as people come, go, and change jobs and addresses. This means that your contact data migration isn’t really finished until you have implemented and tested an infrastructure for ongoing contact data validation and database cleaning.

Data migration: Not just a technical problem

According to a recent Infosys white paper, treating data migration as an “IT problem” is often a fatal mistake in terms of data quality – in their words, “Business has to not just care about data migration, but command it.” Put another way, no one can sweat the details on data quality like the stakeholders who will be using this data in the long run.

This raises one more important issue: a major data migration might also be the time to start thinking about a more formal data governance strategy, if one isn’t in place already. We’ve discussed this issue on our blog before, and it is particularly relevant here: major changes such as a data migration can often serve as a catalyst to build professional data expertise at a business-wide level. Either way, putting data quality front and center is one of the most important factors in creating a smooth transition to a new environment.

Have any additional questions on the important role data quality plays in the data migration process? Contact us and we will be happy to answer your questions.

Service Objects’ Top 5 Business Growth and Marketing Blogs

Service Objects takes our customers’ success very seriously, which is why we regularly create content to help organizations make heads or tails of their contact data and offer advice for implementing data quality solutions. Some of our blogs strike such a chord that they continue to attract attention far after publication.
Here are 5 of our most popular Business Growth and Marketing articles to date; click through to read more.

Power Up Your Ecommerce

Some things are just better together. Like milk and cookies. Or peanut butter and jelly. Or, if you do online sales and marketing, ecommerce platforms and data validation services. Integrating live, real-time validation services right into your ecommerce platform is easy to do, and gives you a whole host of benefits including promoting sales, preventing fraud and ensuring top-notch customer service and product delivery. This article explores a rich smorgasbord of benefits you can engineer into your own shopping cart platform – adding any of them will make your life easier. Read More

Email Marketing Tip: Dealing With Role Addresses

Do you have any friends named “info” or “customerservice”? If you do, our sympathies, because their parents were probably way over-invested in their careers. But in all likelihood, you probably don’t. Which leads to a very important principle about your email marketing: you always need to make sure you are marketing to real people. Email addresses like “info@mycompany.com” or “customerservice@bigorganization.com” are examples of what we call role addresses. They are not addressed to a person, but rather to a job function and generally include a number of people on the distribution list. Read More

People, Process, and Technology: The Three Pillars of Data Quality

For many people, managing data quality seems like a daunting task. They may realize that it is an important issue with financial consequences for their organization, but they don’t know how to proceed in managing it. With the right strategy, however, any organization can reap the benefits of consistent data quality, by focusing on three core principles: People, Process, and Technology. Taken together, these three areas serve as the cornerstones of a structured approach to data quality that you can implement and manage. Read More

Online Fraud is Growing. What Can Your Business Do?

Estimates vary, but recent figures from DigitalCommerce360 project the value of eCommerce fraud nearly doubling from US $10 billion to $19 billion between 2014 and 2018, as the eCommerce market continues to grow from a historic peak of US $2.3 trillion in 2017. One particular area of fraud, account takeovers, jumped 45% in Q2 of 2017 alone according to the Global Fraud Index, and these fraudulent pirated accounts represent one of the top three types of online retail fraud. Read More

A New Role: The Marketing Technologist

Once upon a time, life was simple. There was marketing, and there was IT. The former did creative work to drive the product creation and sales process, and the latter kept the computers, software and networks running. In large organizations, the former had a Chief Marketing Officer and the latter had a Chief Information Officer. And if the two departments talked, it was usually about things like software licenses or password resets. Fast forward to 2017. Marketing is now a heavily data-driven field, where success involves things like marketing automation platforms, CRMs, big data analytics, social media analysis, content personalization, and data governance. Read More

Our most popular blogs have one thing in common: they offer insight to help your team leverage data quality to enhance your business practices. View all of our Business Growth and Marketing content or reach out to let us know what you’d like to see more of.


Identifying Disposable Email Addresses: A Better Approach

Disposable email addresses – also known as burner emails, throwaway emails, temporary emails or fake emails – are commonly touted as a useful tool for keeping one’s personal or business email address private and clean of spam. Not to be confused with alias email addresses (which generally forward to a primary email address, and are therefore more likely to be read), there are different types of disposable email addresses, and they can work in a variety of ways.

In general, a user will submit a disposable email address instead of their real one, which in theory should help keep one’s own email protected from spam without their primary email and/or private data being exposed. (Note that we say “should”: there are some unscrupulous disposable email providers out there, so as with all things concerning the internet, users must be careful.)

Disposable email addresses may sound great for end users, but they can be problematic for legitimate businesses and marketers. One could easily argue that disposables are successfully doing their job when they prevent a marketer from emailing an end user, but this also means that businesses are forced to adapt their marketing strategies. One such strategy: trying to identify these disposable email addresses up front, to get a more accurate view of your email marketing assets.

A simple (but flawed) strategy: email lists

Disposable email addresses are commonly identified by static lists. There are many online communities that pool together their own lists of known disposable domains and email addresses. However, static lists are a poor long-term solution, as they can quickly become stagnant. Some communities do their best to keep their lists up to date, but there are still many potential problems with this strategy:

  • Lists often lack standardization, which can lead to implementation issues. There are many disposable services available worldwide, and some community driven lists and solutions are dedicated to just a single disposable service.
  • These lists frequently contain legitimate records for domains and addresses that are not disposable.
  • In order for a disposable to make it onto a list, it first needs to be reported. By the time that happens, and the data makes its way into a solution, the list may already be partially outdated. Moreover, disposables frequently change, and not all disposables are reported.
  • Using a list strategy requires constant vigilance. It’s trouble enough staying up to date on just one disposable service, but trying to stay on top of multiple others as well as new ones as they pop up is often a losing battle.

Lists of disposable email addresses are a reactive solution at best. Worse, they only scratch the surface of the problem. Disposables are constantly changing, with new ones appearing and old ones disappearing all the time. It is impractical to rely on a simple list strategy to successfully identify disposables.
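To make the shortcoming concrete, here is a minimal sketch of the static-list approach (the listed domains are illustrative): it can only catch domains someone has already reported, so new disposables slip through.

```python
# A naive static-list check. The set only knows domains that have already
# been reported and added, so a brand-new disposable domain passes as clean.
DISPOSABLE_DOMAINS = {"mailinator.com", "example-throwaway.net"}  # illustrative list

def is_disposable(email: str) -> bool:
    """Return True if the email's domain appears on the static list."""
    domain = email.rsplit("@", 1)[-1].lower()
    return domain in DISPOSABLE_DOMAINS

print(is_disposable("user@mailinator.com"))       # known, reported domain
print(is_disposable("user@brand-new-burner.io"))  # unknown, slips through
```

The second lookup illustrates the core weakness: correctness depends entirely on someone having reported the domain before you check it.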

A better approach: organic data aggregation

At Service Objects we like to look beyond simple lists. Instead of looking at one list to perform a simple straightforward disposable lookup, we take advantage of our wealth of data and our years of experience to not only dig deeper, but to also cast a wider net. Our email validation service doesn’t just look at lists, it looks at the whole picture as well as the nitty-gritty.

We observe various behavior patterns to better identify specific activities and ties to these activities, not just for disposables but for a variety of email types – malicious or otherwise. This allows us to assign values to these activities and even compare them against other activities. Using complex algorithms along with machine learning we can intelligently determine if a value is directly or indirectly related to a particular issue, such as being a disposable address.

As sophisticated as this solution is, note that we won't always be able to successfully identify a disposable address. Sometimes all the variables don't match up just right, and sometimes there just isn't enough data. However, the service will still often be able to identify such an email address as malicious or potentially malicious, in which case you would likely want to reject it anyway.

The sophisticated solution

Disposable email addresses are a real headache for businesses and marketers. As with most things regarding email addresses, they are a much more complicated problem than one would normally think, and one that requires more than a simple list as a solution. They call for a sophisticated solution.

Our DOTS Email Address Validation service keeps tabs on millions of domains. It monitors various behavior patterns and leverages multiple sets of data. As domains and data continue to grow, so does the service – becoming smarter and better. The service can adapt to the constantly changing disposables, making it better suited to identify them as they pop up. Not because it’s trying to keep up with them, but because it’s anticipating them.


Tutorial: Address Suggestor using Google’s Places API & Service Objects’ Address Validation API

Accurate address autocomplete

At Service Objects, our mission is to ensure contact data is ‘genuine, accurate and up-to-date.’ This can be challenging when it comes to our clients wanting to use an address suggestion or address autocomplete tool while ensuring accuracy and deliverability.

What is address autocomplete?

Address autocomplete is a real-time web service that suggests addresses while a user is still typing them out. The majority of these tools are built using Google's Place Autocomplete, which Google defines as:

“a web service that returns address predictions in response to an HTTP request. The request specifies a textual search string and optional geographic bounds. The service can be used to provide autocomplete functionality for text-based geographic searches, by returning places such as businesses, addresses and points of interest as a user types.”

It sounds great on the surface but when you dig a little deeper, a few problems are uncovered:

  • Google Places API often does not suggest locations at the apartment or suite level.
  • The locations the Google Places API suggests are often not mail deliverable. For instance, a business may have a physical location at one address and receive mail at a completely different address.
  • When Google is not sure of a location or address, it will make approximations as to where an address should be.

If your business mails or ships, these issues are going to create poor customer experiences and potentially generate expensive customer service issues. So, how can you enjoy the benefits of an address autocomplete tool while maintaining a high rate of accuracy and deliverability?

Address suggestion + validation = The super suggestor!

This is where Service Objects comes in. By combining the power of our Address Validation web services (US, Canada, and International) with Google's Place Autocomplete API, we can identify whether an address is complete, accurate and deliverable. Furthermore, where addresses are incomplete or not deliverable, they can be corrected in real time by our address validation services. Just for fun, we call this the Super Suggestor (insert favorite superhero theme music here): the best of both worlds, combining time-saving, customer-friendly address suggestion with the confidence that the selected address is validated, accurate and deliverable.

So how do you combine these two APIs? Watch the step-by-step tutorial below to learn how to create your own Super Suggestor.

We have also provided the complete transcript here.

Service Objects’ Top 5 Technical Blogs

Customer Service Excellence is one of Service Objects’ core values, which we support in a number of ways, including creating a variety of technical content. Our engineers regularly contribute to our blog to help organizations implement data quality solutions and stay on top of trends. Many of our blogs continue to attract attention far after publication.

Here are 5 of our most popular technical articles to date; click through to read more.

Geocoding resolution – ensuring accuracy and precision

When geocoding addresses, coordinate precision is not as important as coordinate accuracy. It is a common misconception to confuse high precision decimal degree coordinates with high accuracy. Precision is important, but having a long decimal coordinate for the wrong area could be damaging. It is more important to ensure that the coordinates point to the correct location for the given area. Accurately geocoding an address is very complex. If the address is at all ambiguous or not properly formatted then a geocoding system may incorrectly return a coordinate for a location on the wrong side of town or for a similar looking address in an entirely different state or region. Read More

How to identify incorporated and unincorporated places in the United States

The US Census Bureau uses the term “place” to refer to an area associated with a concentrated population, such as a municipality, city, town, village or community. These statistical areas have a defined boundary and they may or may not have a legal administration that performs some level of government function. The US Census Bureau uses class codes to classify different types of places and areas. The Bureau currently lists 70 different codes; however, all places are either a legally incorporated place or a Census Designated Place. Read More

Looking beyond simple blacklists to identify malicious IP addresses

Using a blacklist to block malicious users and bots that would cause you aggravation and harm is one of the most common and oldest methods around (according to Wikipedia the first DNS based blacklist was introduced in 1997). There are various types of blacklists available. Blacklists exist for IP addresses, domains, email addresses and user names. The majority of the time these lists will concentrate on identifying known spammers. Other lists will serve a more specific purpose, such as IP lists that help identify known proxies, TORs and VPNs or email lists of known honey pots or lists of disposable domains. Read More

Catch-all domains explained

Imagine launching an online business and associating your email address with your business domain. For example purposes, let’s say your domain is XYZ.com and your name is John. Your email address would be john@XYZ.com. Now what if someone entered jon@XYZ.com? If you had a “catch-all” domain, you’d receive email messages sent to ____@XYZ.com — even if senders misspelled your name. In fact, that was originally part of the allure of catch-all email addresses. With a catch-all domain, you could tell people to send email to anything at your designated domain such as: sales@, info@, bobbymcgee@, or mydogspot@. No matter what they entered in front of the @ sign, you’d still get the message without having to configure your server or do anything special. Read More

Can Google Maps be used to validate addresses?

In November of 2016, Google started rolling out updates to more clearly distinguish their Geocoding and Places APIs, both of which are a part of the Google Maps API suite. The Places API was introduced in March 2015 as a way for users to search for places in general and not just addresses. Until recently the Geocoding API functioned similarly to Places in that it also accepted incomplete and ambiguous queries to explore locations, but now it is focusing more on returning better geocoding matches for complete and unambiguous postal addresses. Do these changes mean that Google Maps and its Geocoding API can finally be used as an address validation service? Read More

Our most popular blogs have one thing in common: they offer insight to help your team leverage data quality to enhance your business practices. View all of our blog content or reach out to let us know what you’d like to see more of.

IP Reputation and the Nationwide Bomb Threat Hoax

The bomb threat hoax from Thursday, December 13, 2018 was easily detectable as fraud.

There were several smoking guns that could have quickly identified the bomb threats as bogus. The leading indicator of fraud was the IP block from which the emails were sent. Emails associated with this bomb threat hoax were sent from the 194.58.x.x address range. This address range is well known in Internet security circles as malicious. IP reputation databases show this range was identified as fraudulent as early as March of 2015. Internet traffic originating from this range was commonly known to place fraudulent orders on e-commerce sites. This IP range was also seen often in fake reviews, instances of click fraud, and hacking attempts.

Further investigation of the 194.58.x.x address block shows with near perfect certainty the range was a manually banned, well-known public proxy. Said another way, this IP range was known to be among the worst of the worst, and clearly its originating messages should have been ignored.

Internet security professionals need to use IP reputation services to determine if an IP address is a proxy or VPN. Lookup tables with this information have existed for years. IP reputation services use machine learning and probability theory to infer a trust score on IP addresses. IP addresses with poor trust scores are behaving badly in an automated manner. Online merchants and video streaming services already utilize advanced mathematical and modern data science to identify malicious Internet addresses, and e-mail providers should too.
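A first-pass version of the range check described above, testing whether a sender's address falls inside the known-bad block, is straightforward with standard tooling. This sketch treats 194.58.0.0/16 as the flagged range, matching the 194.58.x.x block cited in the article; a real reputation service would consult scored databases rather than a single hardcoded network.

```python
import ipaddress

# The 194.58.x.x block cited above, expressed as a CIDR network.
FLAGGED_RANGE = ipaddress.ip_network("194.58.0.0/16")

def is_flagged(ip: str) -> bool:
    """Return True if the address falls inside the flagged block."""
    return ipaddress.ip_address(ip) in FLAGGED_RANGE

print(is_flagged("194.58.12.34"))  # inside the flagged block
print(is_flagged("8.8.8.8"))       # outside it
```

Membership testing against a network object like this is the simplest building block; reputation services layer scoring and behavioral signals on top of many such ranges.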

In the Internet of Things (IoT), with 11 billion smart devices connected, we can’t allow hospitals, schools, and emergency services to be distracted because of a dozen rogue devices like this. In life reputation matters, but on Thursday we let a few well-known bogus devices in Russia cause panic and fear.

We can do better — the data already exists.


Address Deduplication Using USPS Barcodes

When are two addresses actually the same? And when can you remove one of them from your contact database?

The answer isn’t as simple as it sounds. Suppose you have two addresses as follows: 429 East Figueroa Street, Apartment 1, Santa Barbara, California, 93101 versus 429 E Figueroa St Apt 1, Santa Barbara, CA 93101. Or suppose that in only one of these two addresses, the street address and the apartment number are on separate lines. Simple text or line-by-line comparisons aren’t going to work in this case.

However, the United States Postal Service (USPS) can come to the rescue here, thanks to its standards for delivery barcodes.

What is a barcode?

Barcodes are unique identifiers assigned to each deliverable address by the USPS. A two-digit code between 00 and 99 is assigned to each address; when that number is combined with the address’s ZIP+4, a sequence is created that uniquely identifies the delivery point. The complete barcode consists of the ZIP+4, the 2-digit code identifying the premise, and a checksum digit that allows barcode sorters to verify the correctness of the ZIP, ZIP+4 and delivery point code.

Barcode Example: 931011445011

  • ZIP+4: 93101-1445
  • Delivery Point Code: 01
  • Checksum Digit: 1
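The checksum digit is chosen so that all twelve digits sum to a multiple of ten; a short sketch of this check-digit scheme, using the example barcode above:

```python
def check_digit(digits: str) -> int:
    """Return the digit that brings the digit sum up to a multiple of 10."""
    total = sum(int(d) for d in digits)
    return (10 - total % 10) % 10

# ZIP+4 (93101-1445) followed by the 2-digit delivery point code (01)
body = "93101144501"
barcode = body + str(check_digit(body))
print(barcode)  # 931011445011, matching the example above
```

Sorting equipment can recompute this digit to catch a misread ZIP, ZIP+4 or delivery point code.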

How barcodes help you clean up duplicates

In short, barcodes can be leveraged to help identify duplicate records in your address database. The uniqueness of the barcode helps solve the age-old problem of identifying duplicate data. Let’s go back to the example we mentioned above:

Address A:
429 East Figueroa Street
Apartment 1
Santa Barbara, California, 93101

Address B:
429 E Figueroa St Apt 1
Santa Barbara, CA 93101

On the surface, these addresses seem very similar. They would both be deemed deliverable by the USPS despite their spelling differences. On one hand, you have Address A spelling out “East”, “Street”, “Apartment”, and “California”. On the other hand, Address B abbreviates these same fields. If you were to address an envelope with either of the spellings, it would reach the same destination.

As a human, looking at the two addresses above, it is easy to figure out that these two addresses are really the same delivery point. As a developer, however, figuring out that the two are the same is a nightmare without some sort of unique identifier. You would have to break these addresses into their component parts – address, address2, city, state, and zip – and then compare each field of Address A against Address B.

If you came across any field that didn’t match up perfectly, you would assume the addresses were different and handle them accordingly. At this point it is easy to see that this approach is inadequate and would lead to the misidentification of the Address A/B example above. And even if you tried to write a smarter program, you would quickly discover that this is a complex problem involving fuzzy matching, distance algorithms, and various other string comparison algorithms. If only there were a unique identifier that could be assigned to an address…

This is where Service Objects’ DOTS Address Validation products shine. On top of the validation of each input, every deliverable address is matched up with its USPS barcode. With these barcodes in hand, it is easy to compare two addresses without having to worry about spelling or standardization differences.
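With a barcode attached to each record, deduplication reduces to grouping on a single key. A minimal sketch follows; the record fields are illustrative, the first barcode reuses the example value from above, and the second is a made-up placeholder.

```python
from collections import defaultdict

# Each record carries the barcode returned by address validation.
records = [
    {"name": "John Smith", "address": "429 East Figueroa Street, Apartment 1",
     "barcode": "931011445011"},
    {"name": "J. Smith", "address": "429 E Figueroa St Apt 1",
     "barcode": "931011445011"},
    {"name": "Jane Doe", "address": "27 E Cota St Ste 500",
     "barcode": "931010000000"},  # placeholder barcode for illustration
]

# Group records by barcode; any group with more than one entry is a duplicate set.
groups = defaultdict(list)
for record in records:
    groups[record["barcode"]].append(record)

duplicates = {code: recs for code, recs in groups.items() if len(recs) > 1}
print(duplicates)
```

Because the two Figueroa records share a barcode, they land in the same group despite their spelling and line-break differences, with no fuzzy matching required.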

Mailing address input: (screenshots of the fully spelled-out and abbreviated address inputs)

Service Objects’ return with barcode: (screenshots of both returns from the address validation tool, with the identical barcode highlighted)

Detecting duplicate mailing addresses using the address’ USPS barcode is a simple, elegant solution to a complicated problem. If you’d like to try any of our address services, sign up for a free trial key and get your first 500 transactions free.


Address Autocomplete Tutorial: Video Transcript

Below is a full transcript of the video Tutorial: Address Suggestor using Google’s Places API & Service Objects’ Address Validation API.

Today we are going to be building out an address suggestion input form tool. I’ve already built this out, but I will take you from the beginning, and show you step-by-step what you must do to implement this. In this demonstration, I will be using Visual Studio. This is the type of form that will predict the address you are trying to type. Besides address suggestion, it is also referred to as type-ahead, or predictive typing, and a couple of other things as well. We will use the Google Places API in conjunction with our DOTS International Address Validation service to provide the predictive addresses and the address validation.

First, why is a form like this helpful? Why would someone want this? The main reason is so that you can have clean, accurate data added to your systems up front, instead of trying to clean the data up later, which can be very costly to do. Another reason is to help speed up data entry and improve data accuracy on the form. The data entry could be coming from your prospects and clients on a form you have published on the web. Maybe someone is purchasing a subscription, or ordering a product or service. The data could be coming from call-center staff, who are on the phone entering the data for the same types of reasons. There are really many reasons that an address suggestion form can help.

Now that we have some idea as to why we would want to implement an address suggestion tool, let’s switch over to how it works. There are three main parts to this integration: the Google Places API, the DOTS Address Validation – International API, and the web form itself. Some people wonder why we can’t just stop at the Google Places API and the form, and exclude the Address Validation portion. One of the paramount reasons that the Google Places API is not enough is that it does not give you mailing accuracy. The second reason is that it makes approximations as to where an address could be, rather than confirming whether it exists or not.

On the first point, accuracy, Google does not verify, or suggest addresses that are mailable. They typically return an address’s physical location on a map, but not whether it is a valid mailing address. The second point is the guessing that the Google Places API does. It approximates where an address could be based on other addresses on the street. 508 Kings Road, Brighton, UK, looks like a legitimate address, but when you look at it on Google Maps, it does not look like the address is actually associated with anything. Those are the main reasons why the Google Places API isn’t enough. You need to bring in DOTS Address Validation – International to save the day, and make the solution truly useful.

In this demonstration, we are going to get a Google API account, and get an API key for the Places API. After that, we are going to get our DOTS Address Validation – International key from Service Objects. Here is an important note: you can alter the validation sections we are going to go over to meet your specific needs. Here, I’m demonstrating how this can be done with DOTS Address Validation – International.

However, if your need is just in the US, then you can, instead, get a key for DOTS Address Validation – US, DOTS Address Detective, DOTS Address Insights, or DOTS Address Validation Plus. If all you need is Canadian address validation, then DOTS Address Validation – Canada should suit perfectly as a substitution. Any of our services that take an address as an input pair perfectly with the Google Places API. Taxes are another example: DOTS FastTax is a great service that can also help you out. I could go on, but I’ll leave it to you to explore.
Once we have those keys, we will move on to setting up the form, and getting some of the UI boilerplate stuff taken care of. After that, we will jump into the needed methods from the Google Places API. Lastly, we will go over how to set up the call to DOTS Address Validation – International. Though most of the work is done in JavaScript, we will show you how to create a proxy, so that our license key is not exposed to the outside world for anyone to use and take. I guess this will be kind of a two-in-one demo.
Now we’re going to switch over to the Google API account page; this is where you can get any API that Google offers. Here we’re interested in the Places API, and the Maps JavaScript API. You can see we have those already selected here. There’s a bunch of unselected APIs down here, unused. We have the ones that we want already selected and added. What we’re going to want to do … I’ll go back for a sec. We’re going to want to first create a project, and we’ve already done that. We’ve called it Service Objects Address Suggest, and we added the APIs to it.

On their dashboard here, you can see the transactions that have occurred against the API. You can have the resolution up to 30 days, down to an hour. You can download this kind of information to really keep track of how your APIs are being used. We’re going to want to click in, and get some deeper resolution on the Google Places API, because we’re going to need the key. We’re going to need to pull the key out. When we switch over to that page, we have the key here, I suggest copying that off, saving it for later. You can name it whatever you want.

Then there are some restrictions: none, HTTP referrers, IP address, Android, iOS. You can set some restrictions up here that are going to help with securing your API key: when people are looking at your JavaScript, they can see your key, so locking it down to a particular page or an IP address is going to help keep other people from taking it and using it. For now, while we’re in development mode, we’re going to use the “None” selection so we can just get through our development, but before we go to production, we’re going to want to lock this down a lot better. That’s pretty much it. Save your API key off for later, and we’ll show you where it plugs in as we go through the steps.
Jumping back really quick, I did want to mention that you can use the Google API key to a degree for free, but you’re going to want to go through the usage and billing plans that they have for the key itself. Depending on your volume, you’re going to want to watch out for this, because it will cost money per transaction. There are a few links here that I found useful when trying to analyze which plan to go with. You’re going to have to read through this yourself. It seems to be changing a lot recently, so you’re going to want to keep up to date with these pages and make sure you’re always on the right plan.

This page here, the Places API usage and billing, was good. The “pricing that scales to fit your needs” page helped a lot for figuring things out, and this Pricing and Plans page also gave a lot of good information that can help you figure out which plan you need and what to look out for. But there’s a lot here, and it can be kind of confusing. I would really take your time and look through these pages to make sure you’re setting yourself up with the right plan.

Now we’re going to switch over to getting the DOTS Address Validation – International API key from Service Objects. This is our homepage, with a lot of useful links to the products, solutions, developer support, and blogs. The blogs are really worth taking a look at: all of us in the company write about our services and products, and about what’s happening in the industry, in ways that can help you out.

But we’re going to jump right in and go get the API key. I’m going to select this section for Address & Geocoding. We also have Lead Validation, Phone, Demographics, Ecommerce, and Email. We’re interested in addresses, so we’re going to go into this section and go from there. Coming in here, we have Address Validation, Address Geocode, Address Insights, NCOA Live, and Address Detective. A lot of these could be used for what we’re trying to accomplish here, since they’re all address-type services, but for our purposes right now, we’re going to jump in and use Address Validation – International.

We’re going to click on this link, and it’s going to take us to the product page for that service. Down here, you can fill out your information and get your trial key, which will get emailed to you. Once you fill out this page, you’ll also get redirected to the information that you need. If you don’t see the key in your email, check your Junk Mail folder just in case, but if you don’t find it in either place, you can still get the information after you get redirected here, so you’ll be okay.

Otherwise, you’ve got this input form, where you can try one-off lookups. You can send your own information through the API in these fields here, or you can select some pre-populated data from this drop-down and run that through to see what you get back. Then at the bottom, we have some example requests and responses in JSON and XML, based on the information in these fields when you submit, just so you can take a look at what the service will give you back.

We do have an API key already, so I’m not going to fill out this form; it’s just good to know where you can get the API key. We already have all the code worked out for this project, so instead of recreating everything one by one, I’ll just go through everything that I’ve created, in the order in which you’re going to want to create these things, and I’ll highlight the interesting points along the way. You can then take the code and really understand what’s going on here. It’s not all that complicated; you should be able to use this, plug in any of our other services that could use address suggestion, and implement that here really easily as well.

Just follow along as I go through the steps; I’ll point out the things that are really unique, or that you’re going to want to watch out for. The first thing we did was create some files that we’re going to need. Let’s just start with the style sheet folder. I’m really not going to go over the style sheets; this is just to make the page look the way we want it to look. You can take what we have here, or do whatever you want on your side with the styles and classes. The only really interesting part is the first section, where we basically reset all the default browser settings that may be sitting in the background, just to make sure the styling is consistent across different browsers. The rest of it is any specific things that you want to change on your own with the look and feel of the page.

All the code that we’re going to be going over is included in a file that you can download, adjust, and make your own; you can change anything we have here to suit your needs, which is likely what you’re going to want to do. All this code is freely available, so please take it and make it your own. The next thing we’re going to go over is addresssuggestinternationalform.html, the HTML markup page. We’re going to start by looking at the first include we have: the jQuery script. We do use jQuery in the background, and we’ll see it again when we get to the JavaScript section of this tutorial, so keep that in mind. If you’re not familiar with jQuery, it’s really simple, and there’s a lot of good documentation out there to learn it; just take a look at their documentation and you can be up and running in no time.

The second script we have here is APIHandlerinternational.js. This is the file we’re going to be writing later on, once we get to the JavaScript section, so don’t worry about that one for now. That’s followed by the div for the result modal, which is the modal that’s going to pop up after we click the Verify button so we can see the results from the validation. After that, we have our header area, with the logo and the title of our application; in this case, it’s Address Suggestion Demo. Then we have the country input field, the drop-down select. Here I’ve hard-coded all the different countries that we want to deal with. You’re going to want to populate this list dynamically when you’re doing this for production, or live, for your company.

I hard-coded this real quick, but it’s definitely going to be better as something you can dynamically populate. Then we’re followed by all the different input fields we have. The autocomplete field is really our address line one field; this is the field where people are going to be typing their address in, and the Google Places API is going to be keying off of it, showing you suggested addresses based on what you’re typing. Then we have our address line two, which is supposed to contain our apartment or suite information, then the city, the state, and the zip; that should all be pretty straightforward. Then we have a div, which is really our Verify button, our validate button here.

At the bottom we have the script tag for the Google Places API. Here we have our API key. This is hidden when you’re looking at it, but this is where you’re going to want to drop the key that you made note of earlier when you set up your Google Places API key. That’s all you really need to know about this page. There isn’t anything special or unique about it, other than the styling, and that we link up the autocomplete field here to do the autocomplete against the Google Places API.

I do want to go back to this section here about the Google Places API key. I really want to highlight that this is exposed when somebody runs the form: anyone who goes to the page and looks at the background code can see your API key. As I mentioned earlier, we have no restrictions on our API key right now, so someone who had it could use it for their own purposes and start running up transactions on the key, and that could end up being very costly. When you do move to production, do not forget to go back and change the settings on the key in the Google API account. You’re going to want to make sure it’s locked down as well as you can, so that you don’t get calls against your API key unwittingly.

Now we’re going to jump into the APIHandlersinternational.js file. This is the JavaScript file that contains a lot of the interesting things that happen in the background. We created a JavaScript folder for it, so we can keep things nice and organized. (The folder name probably needs a capital S there; oh well, we’ll keep it the way it is for now.) And there’s our file.

The first things in the file are these three variables that we set up. The autocomplete variable is just going to contain the autocomplete object that’s sent back from the Google Places API. The component form variable is really just an array of the IDs of the fields in our form, so if we want to reset any of those fields, we can quickly loop through them. Then the selected country is set to US, but that’ll be overwritten whenever a new country is selected from the drop-down.
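A minimal sketch of those three variables and the reset loop might look like this; the field IDs in the array are assumptions and should match whatever IDs your markup actually uses:

```javascript
// Sketch of the three top-level variables described above.
let autocomplete = null;      // will hold the object returned by the Places API
const componentForm = ['autocomplete', 'address2', 'locality',
                       'administrative_area', 'postal_code'];  // assumed field IDs
let selectedCountry = 'US';   // overwritten when the country drop-down changes

// Loops through componentForm and clears each field. `doc` is the browser
// document (passed in so the helper is easy to exercise outside a browser).
function resetFields(doc) {
  componentForm.forEach(function (id) {
    const el = doc.getElementById(id);
    if (el) { el.value = ''; }
  });
}
```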

The first thing we’re going to do is jump down to this function at the bottom, which executes once the page has completed loading. It adds an event so that whenever someone changes the country in the country drop-down, this is going to execute and reset all the fields in the form, which makes sense: you don’t want to have data for the wrong country displayed. Then it executes this initAutocomplete method.

I’m going to jump into that one right now and just get it out of the way. What the initAutocomplete method does is really just initialize the Google Places API; it returns the autocomplete object into our autocomplete variable, based on the autocomplete element that we pass in by its ID. We set the Google Places API to respond with the geocode type, which is the address-type data that we’re looking for. You could put other things in here, and it gives you varying degrees of data back, but geocode is exactly what we’re looking for, so that’s what we have here.

Then, based on the country selected, you can restrict the API to a country. You can have a list of countries here, of up to five I think; you’ll want to double-check the documentation for this field. But what we need right now is the one country that is selected from the drop-down. After that, we add a listener to the autocomplete object. It’s basically going to react to selecting one of the addresses: once an address is selected from one of the suggestions, this fill-in-address method gets executed. Basically, we get the place object back from the autocomplete object, and then we reset the form.
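The type and country settings described above come together in the options object handed to the Autocomplete constructor. This is a hedged sketch of that shape as I understand the Places API; the commented usage below it is not run here:

```javascript
// 'geocode' limits suggestions to address results; componentRestrictions
// limits them to the selected country (the API accepts a short list of
// country codes here; check the current docs for the maximum).
function buildAutocompleteOptions(countryCode) {
  return {
    types: ['geocode'],
    componentRestrictions: { country: countryCode }
  };
}

// In the browser this would be used roughly like:
//   autocomplete = new google.maps.places.Autocomplete(
//     document.getElementById('autocomplete'),
//     buildAutocompleteOptions(selectedCountry));
```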

After we reset the form, we parse the place object, basically pulling apart the place data that comes back as JSON, and we populate the input fields based on the address that was selected. Varying pieces of data come back, so there are several cases here that you might want to handle, expand upon, or not use, depending on your purpose. This is just an example of what can be done. After that, we populate the fields with the parsed address from the response.
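The parsing step can be sketched like this. Each entry in `place.address_components` carries `types`, `long_name`, and `short_name`; which component feeds which form field is a design choice, and this mapping is just one reasonable example:

```javascript
// Hedged sketch of parsing the place object from the Places API.
function parsePlace(place) {
  const parsed = { street_number: '', route: '', locality: '',
                   administrative_area: '', postal_code: '' };
  (place.address_components || []).forEach(function (component) {
    switch (component.types[0]) {
      case 'street_number':
        parsed.street_number = component.long_name; break;
      case 'route':
        parsed.route = component.long_name; break;
      case 'locality':
        parsed.locality = component.long_name; break;
      case 'administrative_area_level_1':
        parsed.administrative_area = component.short_name; break;
      case 'postal_code':
        parsed.postal_code = component.long_name; break;
    }
  });
  return parsed;
}
```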

What’s left at the bottom of the page is the modal start method, which just deals with opening and closing the modal, the result page that comes up after verification. The next part to really talk about is the click event: when someone clicks the Verify button, the click event fires, and we pull the values from the fields on the screen into local variables. Once we have those variables populated, we toss them into the call address validation method. This is where the magic on the JavaScript side of things happens. Inside that code, we call our backend proxy, which will really do the call to the validation. We’ll go over that next.

This is the call address validation method. It takes in the parameters that we sent in earlier from below. Then it makes an AJAX call and processes the request against a Handler that we create; that’s basically a backend script that allows us to hide our license key from being exposed to the outside world. It makes the call to the API from there, does the address validation, processes the response, and sends it back to the AJAX method here, where it lands in either the success or error sections.

If there’s a response, we have data coming back, and everything looks good, then we go into the success method. If not, we land in the error. In the error, we call the Handler again, but this time we call the ValidateAddressBackup method instead. The only difference is that we’re calling the backup endpoint; everything between the two endpoints is really the same, it’s just a backup server with the same data. This is basically how we handle our failover so that we can always guarantee a response from the service.
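The live-then-backup pattern can be sketched as below. Here `post` stands in for the AJAX call (for example, a jQuery `$.ajax` wrapped in a Promise) and is injected so the failover logic itself can be exercised without a network; the URLs are placeholders, not real endpoints:

```javascript
// Hedged sketch of the live/backup failover pattern described above.
async function callWithFailover(liveUrl, backupUrl, params, post) {
  try {
    const response = await post(liveUrl, params);
    if (response) { return response; }       // success on the live endpoint
    throw new Error('empty response');
  } catch (err) {
    // Same request against the backup endpoint: identical service,
    // different server, which is what guarantees a response.
    return post(backupUrl, params);
  }
}
```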

From there, we can still have a success or a fail. If we have a fail, then it’s more likely than not a network error. Based on your process, you might be able to return more detail than the generic network error that I show here, but it really depends on what you’re doing. If we’re successful, then again, we’ll call the success method. The AJAX method calls the Handler.aspx methods. Handler.aspx is basically a single web form with nothing on the markup page except the basic stuff that gets pre-populated there; it really doesn’t matter, since we’re not going to be using that part of it. We’re more interested in the Handler.aspx.cs page, the code-behind for that page.

This is the page that does the request to the API. It pulls in our key information from the Web.config file, calls out to the web service, comes back with a response, and then sends that back to the JavaScript that called us originally. To create the service references, you can go to References and click Add Service Reference. In here you can fill in the URL of the WSDL and then name it. To know what URL to use here, you can just go to our website and look at our docs. The documentation page is docs.serviceobjects.com. If you go there, you’ll have a list of all the services that we have. Click on Address Validation, and once you get there, scroll down a little bit and find the WSDL.

This is the link to the trial WSDL. Just copy and paste that here, hit Go, and it finds the service. Here you can rename it however you’d like. I’ve already done this for our live and backup endpoints, so let me just go over those with you really quick. What we’ve done already is create the AVI and AVI Backup endpoints here. Right now, they’re actually both configured to look at the trial.serviceobjects.com endpoint: this is the main endpoint, and the backup endpoint is also set to trial.serviceobjects.com.

When you start out, you’re going to have your trial key, and you’re going to set it up this way. But when you move to production, you’re going to come in here and update the main endpoint to ws.serviceobjects.com, and then you’ll switch the backup endpoint from trial over to wsbackup.serviceobjects.com. That way, you can hit the live and backup endpoints with the service reference.

Here in the Web.config file, you’re going to see a couple of things. We have appSettings keys that we set for the timeout value, 5000 milliseconds (five seconds), and our license key for the service, which we got in the trial signup for Address Validation – International. The rest of this is generated code based on the WSDL that you created with the service reference earlier. There’s really not much going on in here, but the key part is that your license key lives outside of the application. We can change it on the fly, and it’s also not seen by anyone who’s running the form, who can easily look at your JavaScript and see the underlying methods. Your key is never exposed this way.
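A minimal sketch of those appSettings entries might look like this; the key names here are assumptions and should match whatever names your code-behind actually reads:

```xml
<!-- Hedged Web.config fragment: timeout and license key kept
     outside the application code, as described above. -->
<appSettings>
  <add key="WebServiceTimeout" value="5000" />
  <add key="LicenseKey" value="YOUR-TRIAL-KEY-HERE" />
</appSettings>
```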

Back in the Handler.aspx.cs file, we have the two methods, ValidateAddressLive and ValidateAddressBackup. They’re pretty much doing the exact same thing, so we’re just going to go over one of them. The main point is that one of them has the live endpoint, and the other has the backup endpoint; that’s really the only difference. Typically I would actually fold this into just one method, maybe with another parameter, but I wanted to spread this out so that you can see that there is a difference: one’s calling a main endpoint, and one’s calling a backup endpoint.

As I mentioned earlier, the live and backup endpoints are both pointed to the trial endpoint at this point. That’s what we’re going to keep doing during our trial, but when we switch to live, we’ll switch our endpoints to ws, and the backup to wsbackup. Here we go. We’re going to start with this method. To be able to call it from the JavaScript code, we decorate the method with the WebMethod attribute here. After that, we create the response object and the service client object. From there we pull in our license key for the Address Validation – International API, along with the 5000 millisecond (five second) timeout.

After that, we create our client and add the settings to it. Finally, we’re ready to call the endpoint with our parameters: address lines one and two, with a couple of these fields set to empty strings, because we don’t need them in our case. Depending on what country you’re dealing with, you may be using address lines one through five, so these are placeholders for additional address lines. We’re keeping it simple and just dealing with address lines one and two; we’re going to cover the majority of situations this way anyway.

But those are there just in case you want to handle things a little differently. Then we have the Locality, AdministrativeArea, PostalCode, Country, and OutputLanguage, followed by the LicenseKey. Since this all happens on the backend server instead of the client, you’re hiding the license key from anyone seeing it, and that really protects your key from anybody being able to use it for purposes other than what you’re intending here.

After the call to the service is made, the response is populated. If we have a null response, we just return an empty string; otherwise, we serialize the response into a JSON string and return the JSON. The exact same thing happens here when, and if, we need to make a call to the backup endpoint.

Here we are back in the CallAddressValidation method, where we called the Handler from the AJAX. Now we know what happens when the Handler is called: it basically calls the Address Validation – International API, processes it, and returns a result back to us. From there, we handle the success or the error. In this case, we’re going to start by going through the success, and then we’ll go back over the error, the failover, later.

Going into the success, to get there, we make sure that the response object is not empty, so it has something to process. From there we get the JSON value out; it’s really just a JSON string at that point. Then we pass that over to the success method. Moving down to the success method, there’s really nothing too complicated here. We’re checking that the response is not undefined, and that the response.AddressInfo object is not undefined as well. If we have those two things, we come in here and start building some HTML output as a string.

We use that HTML string later on to inject back into a div that we have on the main page. Let me see if I can find that; I might have it highlighted here already for us. This div is what we’re going to populate with the HTML string that we build in the JavaScript. Moving back up, we populate the HTML output with the string we build in this method, based on comparing the inputs that were on the form to the results that we got from the API. You might be wondering at this point: why are we pulling back in the values that we already passed to the address validation method?

Well, we’re pulling them back in so that we have these values locally and can compare them to the output results from the service, and show the user what’s different between what was entered and what the API found. Moving down, first we have the inputs, then these are the outputs pulled from the response object. Now we start building our HTML. We’re really just making our comparisons and figuring out how we want to display the output. Let’s just look at one of these. Here we check to see if the outputLocality is not undefined. If we have something, and it’s not equal to the input, then we put a “Bold” class around the result, so that we can point out, or make it easy for the client to see, what is different, so they can quickly make changes on their end once they see the result.
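That comparison logic can be sketched as a small helper. The “Bold” class name mirrors the walkthrough; adjust it to your own stylesheet:

```javascript
// When the validated value differs from what the user typed, wrap it in
// the "Bold" class so the change stands out in the result modal.
function highlightIfChanged(inputValue, outputValue) {
  if (outputValue === undefined || outputValue === '') { return ''; }
  if (outputValue !== inputValue) {
    return '<span class="Bold">' + outputValue + '</span>';
  }
  return outputValue;  // unchanged values are shown as-is
}
```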

We do that comparison for several of these fields. Based on what you’re doing, you’re probably going to want to do this section a lot differently. This is mainly a demonstration, so you can see how we’re getting the elements out of the JSON and how we’re comparing them to the inputs. You’ll likely have a different scenario, but this is how you could use it. As we move down here, the HTML is built and then injected into the div. Back at the top of the success method, we demonstrated what happens if the response and the AddressInfo object are not undefined. Well, what happens if they are undefined?

Let’s go down and see what happens there. First we check to see if there is a response object that actually is defined, but has an error object that’s defined as well, and in particular an error object with TypeCode 3. If the response is still undefined, we drop down here and just show the error. Otherwise, if we have the response code 3, then we want to do a failover call, and we only want to do it once. We govern that with this count variable that we increment once we get in here; that forces this to only ever happen once. Then we call the failover method.

If we’ve already done it before, then we just drop down and do the error response. One key thing that I want to point out here is that, even though we’ve made it into the success method, that doesn’t mean we’re guaranteed to be on our way to success. There could be other problems along the way, like I just demonstrated with the else here and what could have happened with the response. “Success” needs to be taken with a grain of salt; you might want to name it a little differently so it’s more clear, but it’s not 100% success.
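The count guard described above can also be pulled out as a tiny reusable helper, sketched here as one way to make sure the failover call only ever fires once:

```javascript
// Wrap a function so it can only run once, no matter how many error
// paths reach it; subsequent calls return undefined.
function onlyOnce(fn) {
  let count = 0;
  return function () {
    if (count > 0) { return undefined; }  // already failed over; give up
    count++;
    return fn.apply(this, arguments);
  };
}
```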

Now that we’ve gone over the success method popping out of the AJAX call here, we should probably go over the error method, for when AJAX comes back with an error. When there’s an error, we simply try our post to the other method, the ValidateAddressBackup method on the Handler. From there, like I said earlier, it’s going to do the exact same thing as the live call, but against the backup endpoint, running through the same steps. In here, again, we can have a success or an error. If we get a success on the backup endpoint, then we follow what we just went through with the success method; otherwise we show what is, at this point, likely some sort of network error. Again, you might have other information that you can add to this; you’ll do whatever you want there. We’re just showing a network error for now.

Earlier, as we went through the success method, we saw that there’s a case where the address info or the response is undefined, in the else here at the bottom, and I mentioned the failover method there. To go into that briefly: it really mimics what the error handler in the AJAX call did, making a second call to the backup endpoint, just in case there was a problem. It’s doing the exact same thing as earlier; you’ll see the exact same code here as you did there. You could pull this out into a shared method, and make sure you only call it once instead of repeating the code, but here it’s duplicated as a demonstration, just so it’s easy to see that you can do it this way.

Besides that, I think we’re pretty much wrapping up going over this code. There really isn’t much else in this file that’s complicated. Here are a couple of methods that use jQuery to remove and add classes based on what comes back from the service and what you want to display on the screen. It’s basically checking the classes and then removing or adding them based on the results. There’s nothing too big there.

The other important one here is this geolocate method. This really helps with the autocomplete by biasing the results to a location. In this case, we have it set to a lat/long: it looks for your current location, gets the lat/long, and draws a radius around that location, biasing the results to that circle. When the form is created, if we look down here at the bottom, you’ll see that … where is it? Oh, here it is. When the field is focused on, we set the geolocate method. From there, that’s how we create the bias.

There you have it. That’s pretty much it. What we have left to do now is just give this a run and see what we built. Everything loaded up properly, as expected. We have our drop-down menu with the countries. Let’s try our Service Objects address: 27 East … there it is, 27 East Cota Street. That really helps with entering those kinds of addresses quickly. Let’s just do any address, how about … and this is obviously biased by where I’m located. I’m located in Santa Barbara.

Even though I have the country set to United States, which is correct, it’s not picking random places in the United States to start guessing; it’s basically picking where I’m sitting. That’s going to be an important detail for a lot of people in call centers that are working locally. If you’re in a call center and you change locations, you might want to give people an option to change their bias to a particular area, so the results that come up are quicker for the area they’re located in.

Let’s pick this address here. It populates the fields, and then we can click Validate. What comes back is the address that came back from the service, alongside the address that was entered. We can see that East was changed, Street was changed, and we have the plus-four added to the results. We still have some other details, and the service comes back with a lot of different ones; I just gave you a rough, short look at what you can do. If you look at the API on the documentation page that I showed you earlier, there’s a lot of information that will allow you to do many different things with the service, and a lot of different fields that come back that are useful.

In this case, someone could click on one of these and either proceed with one of these addresses, close, or try a different address. That’s basically how the address suggestion application works. Thank you.

 

Best Practices for Phone Exchange 2 International

Having extra information about phone numbers can be invaluable to a business for several reasons. It can help fight fraud, support TCPA compliance, and give an extra boost of insight into how and when to best contact customers using their provided phone numbers.

Our DOTS Phone Exchange 2 is designed to validate phone numbers with the kind of detailed results that give businesses an edge over their competition. The beauty of our Phone Exchange 2 product is that it can validate both domestic and international phone numbers, so no matter where your phone data is coming from, we help provide the intelligence your business needs to get ahead and communicate more effectively with clients.

Which Operation to Use

We have two separate operations in our Phone Exchange 2 service. We recommend using them both to get the most insight into your data. Here is a brief description of each of them:

GetExchangeInfo – Takes a 10-digit phone number and returns the line type, carrier information, ported information and more. This operation can validate US and Canadian phone numbers.

GetInternationalExchangeInfo – Takes a phone number and the country associated with the phone number as input, and returns carrier information, line type, a flag that indicates its validity and more. This operation can validate any phone number from around the world.

If you have US or Canadian phone numbers we recommend using the GetExchangeInfo operation, as that can provide ported information for the phone number as well as more detailed output data compared with the GetInternationalExchangeInfo operation.

As with most of our APIs, these operations can be run as one-off lookups, one-time batches, automated batches, or real-time API integration.
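For real-time integration, composing a request for either operation might look something like this sketch. The base URL and parameter names here are placeholders, not the service’s actual endpoint or signature; check the DOTS Phone Exchange 2 documentation for the real ones:

```javascript
// Hedged sketch: build a GET request URL for a named operation from a
// map of input parameters, URL-encoding each value.
function buildPhoneExchangeUrl(baseUrl, operation, params) {
  const query = Object.keys(params)
    .map(function (key) {
      return encodeURIComponent(key) + '=' + encodeURIComponent(params[key]);
    })
    .join('&');
  return baseUrl + '/' + operation + '?' + query;
}

// For example (placeholder URL and key):
//   buildPhoneExchangeUrl('https://example.invalid/pe2',
//     'GetInternationalExchangeInfo',
//     { PhoneNumber: '+4930123456', Country: 'Germany', LicenseKey: 'KEY' });
```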

Key Output Fields

There are many output fields in our Phone Exchange 2 service. First, let’s look at some outputs that overlap between the two operations and that our clients have found particularly valuable:

Name – Provides the name of the phone carrier that this phone number is associated with.

Location, Latitude/Longitude – This is the location where the phone number was registered; it is important to note that this is not the current location of the phone or its owner.

Line Type – The line type for a phone number, which can be valuable for several reasons. Many organizations have different protocols that will be followed if a number is wireless versus landline. Landline numbers can also give a higher probability that the phone number is in the location provided, while wireless numbers can receive text messages.

Time Zone – This is another field that can assist in determining an appropriate time to contact clients by their phone number. This value is based on the carrier location.

Finally, here are two popular outputs that are specific to the GetInternationalExchangeInfo operation:

IsValid – A simple true/false flag that indicates whether the phone number is valid.

IsValidForRegion – A true/false flag that indicates whether the phone number is valid for the given country in the input. For example, if a US number is given to the service with a Country value of “Germany”, the service will return false for this field.

Using Countries and Calling Codes

GetInternationalExchangeInfo does quite a lot to determine which country the given phone number is associated with – which is good news! Dealing with calling codes, parsing phone number lengths, and determining the best-fit country for a phone number can be tough work. We’ve done all of the legwork and put a lot of thought into how our international operation determines the validity of the phone number.

The two important pieces to parsing an international phone number are the country code and the country provided in the input. This service uses both of these to parse and determine the best country.

The service will look for a country code at the beginning of a phone number and will give precedence to digits following a “+” sign, which is the standard way of writing an international phone number. If no country code is given, it will use the given country information to determine the validity of the phone number.
If both a country code is given with a “+” sign and a country is given in the input, the service will generally use the country code as the country identifier.
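The precedence rule above can be sketched in a few lines. This is a simplified illustration of the logic described, not the service’s actual implementation – the tiny calling-code table here is a stand-in for full worldwide coverage.

```python
from typing import Optional

# Illustrative calling-code table; the real service covers every
# country and handles many more edge cases.
CALLING_CODES = {"1": "US", "44": "GB", "49": "DE"}

def resolve_country(phone_number: str, input_country: Optional[str]) -> Optional[str]:
    """Pick a country for a phone number: a "+"-prefixed country code
    takes precedence over the country supplied with the input."""
    digits = phone_number.strip()
    if digits.startswith("+"):
        body = digits[1:]
        for length in (3, 2, 1):  # calling codes are 1-3 digits long
            if body[:length] in CALLING_CODES:
                return CALLING_CODES[body[:length]]
    # No "+"-prefixed code found: fall back to the input country.
    return input_country
```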

Conclusion

As with any of our services, we are always happy to make recommendations about your specific use case. Don’t hesitate to reach out to our integration specialists here at Service Objects, and we will be glad to make recommendations on how to get the most out of DOTS Phone Exchange 2 or any other service.

It Don’t Mean a Thing If It Ain’t Got That Ping

How do you know if an email address is valid? There is more than one way to find out. In this article, we will show you how something we do – known as “ping testing” – makes these results much more accurate. More importantly, we will show you how to get the best out of these capabilities.

Email Verification 101

There are fundamentally three ways to make sure an email address is legitimate:

  • Examine the email address itself for things like proper syntax, obvious misspellings (like “gmial” instead of “gmail”), and other problems (like a missing “@” symbol).
  • Compare this email address against lists of existing emails – both to see if it is a legitimate address, and also to flag known problem addresses such as spam traps, honeypots, known spammers, blacklisted addresses, and more.
  • Physically test (or “ping”) the email server, domain and address to make sure the address is valid.

All three of these checks are important in their own way. Basic address testing quickly weeds out addresses that are clearly invalid, with fast response times. List testing is also quick but often isn’t enough, because of addresses that haven’t made the list yet. (According to a report from the Radicati Group, new email addresses get created at the rate of a quarter billion per year!)

Then there is “ping” testing, which involves checking the actual email server and address for a response, which is the gold standard for determining the validity of an address. It can also be important for applications such as fraud prevention, to guard against perpetrators who create email addresses in near-real time. There are three main types of ping checks:

  • Testing an email server (SMTP) to see if it is real and available.
  • Testing to see if an email address is allowing emails at the domain (DNS) level.
  • Testing to see if the address can reach an inbox.

Of course, Service Objects’ DOTS Email Validation service performs all of these checks. Now, let’s see how you can use them efficiently for your own email validation.

Here’s where you come in

Service Objects’ Email Validation capabilities give you a great deal of control over both performance levels and output tests. Here are some tips to get the most out of your email validation, taken from our developer guide:

To ping or not to ping: You can validate emails quickly – at the expense of possibly missing ping testing – by using our ValidateEmailFast operation. If a “ping” takes too long, it will not be considered in the check (and SMTP data about this address will not be returned). However, be aware that this is a less accurate check.

Putting a lid on pinging. The amount of time a “ping” takes may vary widely, from nearly instant response to lengthy delays. If you are using email validation in a real-time application, or are concerned about response speed, the Timeout input variable is your friend. This value specifies how long the service is allowed to wait for all real-time network level checks to finish, such as SMTP and DNS testing. Time is entered in milliseconds, with a minimum value of 200ms.

Email servers can be slow to respond to ping checks, and one of the most important aspects is how long you are willing to wait for a response. If you only wait a second or two – and you fail emails that do not respond in that time – you will get a lot of false negatives. If you can wait and/or update the results based on latent responses, you will get a more accurate verification.  If real-time responses are a priority, we recommend setting up a two-step verification process, to help mitigate slow email server response times and ensure a quality user experience.

Two-step validation. The initial step will validate the email address using real-time syntax and ping testing. Syntax issues and fast-responding email servers will provide accurate feedback, so issues can be flagged in real-time.  This allows for real-time notification of any issues, enabling user corrections before being captured by your application or CRM. The amount of time you are willing to wait should be considered in your user’s experience.

The second step is to accommodate slow-responding email servers that ‘timed-out’ in the initial step.  When capturing the email address to your database, include a Yes/No flag of whether the email validation timed-out before completing validation.  For those email addresses that timed-out, you can validate them again but with a much longer Timeout setting, allowing slower email servers time to respond and ensuring the email address has been fully validated.
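The two-step flow above can be sketched as follows. The `validate` callable, its parameter names, and the timed-out indicator are illustrative stand-ins for your actual service client – check the developer guide for the real operation and field names.

```python
# Hedged sketch of a two-step validation flow. "validate" stands in
# for whatever client call your integration makes to the service.

def first_pass(email: str, validate) -> dict:
    """Real-time step: short timeout, flag addresses that timed out."""
    result = validate(email, timeout_ms=500)
    return {
        "email": email,
        "is_valid": result.get("IsValid"),
        "timed_out": result.get("TimedOut", False),  # Yes/No flag to store
    }

def second_pass(records, validate):
    """Follow-up step: re-check only the records that timed out, with a
    much longer timeout so slow email servers have time to respond."""
    for record in records:
        if record["timed_out"]:
            result = validate(record["email"], timeout_ms=10_000)
            record["is_valid"] = result.get("IsValid")
            record["timed_out"] = False
    return records
```

In practice the second pass would run asynchronously (for example, as a nightly batch) so the user experience is never blocked.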

Pinging isn’t perfect. Sometimes a non-existent address will still “ping” properly. Why? Because some email domains are “catch-all” domains, meaning that their servers will accept mail to any address within that domain. You can test for this using the IsCatchAllDomain output variable that comes back with your results.
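One way to fold the IsCatchAllDomain flag into your acceptance logic is to treat a successful ping on a catch-all domain as inconclusive rather than valid. The decision labels below are our own illustration, not service outputs:

```python
def classify_ping_result(ping_ok: bool, is_catch_all: bool) -> str:
    """Interpret a ping result in light of the catch-all flag."""
    if not ping_ok:
        return "invalid"
    if is_catch_all:
        # The server accepts mail to any address on the domain, so a
        # successful ping proves nothing about this specific mailbox.
        return "unknown"
    return "valid"
```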

Finally, remember that ping testing is not the only factor in effective email validation. Our developer guide has a wealth of tools you can use as part of your specific use case, ranging from optional email address correction to warning codes for bogus, vulgar or disposable email addresses. Check it out, or better yet, “ping” our friendly support team for expert advice. We’re always glad to help!

Bad Email Addresses: A Rogue’s Gallery

Once upon a time, many businesses simply lived with bad email addresses as an inevitable cost of doing business. Today this has changed dramatically, in the face of increasing costs and regulatory consequences. According to figures from Adestra’s 2017 Email Marketing Industry Census, nearly 80% of firms proactively cleanse their email marketing lists.

What can go wrong with email contact addresses? Plenty. And what you don’t know can really hurt you. Here are just a few examples of bad email addresses and their consequences:

The faker: donaldduck@nowhere.com

You build a list of leads by offering people something of value in return for their email address. Unfortunately, some people want the goodie, but have no intention of ever hearing from you again. So they make up a bogus address that goes nowhere, wasting your time and resources.

The fat-fingered: myaddress@gmial.com

Someone gives you their email address with the best of intentions, but types it in wrong—for example, misspelling “gmail.com” as “gmial.com”. So your future correspondence to them never arrives, with consequences ranging from lost market opportunities to customer dissatisfaction.

The trap: honeypot@aha-gotcha.com

Here you have rented a list of email addresses, or worse, taken them from publicly available sources. But some of these addresses are “honeypots”: fake addresses designed to trap spammers. Send an email to it, and you will get blacklisted by that entire domain—which could be a real problem if this domain is a major corporation or source of leads and customers.

The fraudster: zzz1234@misterfraud.net

Someone places an expensive order with you, using a stolen credit card—and a bogus email address that never existed and cannot be traced. Better to flag these fraudulent orders ahead of time, instead of after the horse has left the barn—or in this case, the shipping dock.

Of course, this isn’t an exhaustive list. (By the way, none of these sample email addresses are real.) But these are all good examples of cases where email validation can save your time, money and reputation.

What is email validation?

Think of it as a filter that weeds out the good email addresses from the bad ones. In reality, a good email validation service will examine multiple dimensions of what can go wrong with an email address—and in some cases, can even fix erroneous addresses to make them usable. But at its root, email validation takes your email contact data and makes it clean, safe and usable.

Here are some of the specific things that Email Validation can do for your business:

Make sure the format is correct

Our standard service uses server-side scripting on Web forms to check if email address data includes a name, the “@” symbol, and a valid top-level domain (TLD).
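As a rough illustration of the kind of structural checks described here, a minimal syntax test might look like the following. This pattern is deliberately simplified – a real validation service goes far beyond it (and the full email grammar is much more permissive than this):

```python
import re

# Simplified structural check: a username, exactly one "@", a domain,
# and a top-level domain of at least two letters.
SYNTAX_RE = re.compile(
    r"^[A-Za-z0-9._%+-]+"   # local part (username)
    r"@"                    # the "@" separator
    r"[A-Za-z0-9.-]+"       # domain name
    r"\.[A-Za-z]{2,}$"      # top-level domain (TLD)
)

def has_valid_syntax(email: str) -> bool:
    """Return True if the address passes the basic structural check."""
    return bool(SYNTAX_RE.match(email))
```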

See if the address works

Instead of relying on stagnant, aggregated lists for verification, our real-time email validation checks the authenticity of email contact data instantaneously through kinetic and responsive two-way communication with email service providers (ESPs).

Perform advanced validation checks

Advanced checks can examine things such as:
• Checking if valid data exists on both sides of the “@” symbol, in both the username and the domain name
• Verifying that the domain in the email address exists and has a valid MX record associated with it
• Testing the mailbox to determine if it actually receives mail
• Detecting and flagging bogus, vulgar or malicious addresses that may be cluttering up your list

Correct fixable errors

The email hygiene component of this service identifies and corrects invalid emails with fixable errors such as typos, extraneous text, common domain misspellings and syntax problems.

Real-time email validation improves lead quality, saves time processing and pursuing leads, and protects your company from blacklists and regulatory traps. Download our free whitepaper, The ROI of Real-Time Email Validation, to learn more about bad email contact data and how email validation can help you correct inaccurate contact data and reject bogus email addresses.

DOTS Name Validation 2: What Do The Scores Mean?

What’s in a name? Hopefully, valuable contact data for your business. But some names clearly contain red flags for bad data – and that’s where we come in.

Name Validation is a very effective tool for weeding out garbage, bogus and unreliable names. This service can be used in real-time while creating leads, or used to process a large list of names at once. It is a great tool for cutting down on the amount of unreliable data that can be entered into a system.

This article will walk you through the different scores that the DOTS Name Validation 2 service provides, to help you get the most out of this tool. In addition to a massive list of names that we compare input names against, we also do several other checks. These scores can help identify why a particular name was considered to be invalid, as well as helping to shed some light as to what types of validation Name Validation performs.

Overall scores

One of the first things users will want to look at is the OverallNameScore value. This score represents the service's overall rating for the given name. This score value ranges from 0 to 5, with 0 indicating a definitely bad name and 5 indicating a definitely good name. This is usually the first result someone might look at when determining the validity of a name.

We generate this overall score based on several other checks, validations and scores that the service can generate. However, this might not be the last stop a user would make when attempting to determine if a name is valid or not. Based on your use case, you may want to look at one of the other score values our service provides, described below.

Other scores provided

The other score values that the service gives also range from 0 to 5. These values indicate the likelihood that the particular scoring category applies to that name. For example, if a name received a VulgarityScore of 5, then that name would definitely have some type of vulgar word present. Below are the different scoring categories that the service provides.

VulgarityScore

As mentioned above, this score indicates the likelihood that a vulgar word is present in the input name. This score highly affects the overall score, as this is a key item used to sniff out bad or unprofessional name information.

CelebrityScore

This rating represents the likelihood that the input name provided is a known celebrity. This field also works with fictional celebrities, so names like “Mickey Mouse” and “Homer Simpson” will receive high Celebrity scores, as will real-life celebrities like “Tom Cruise” or “Madonna”.

BogusScore

The BogusScore field will let the user know if a given name is simply a word or phrase that doesn't make sense as a name. For example, single words or phrases that aren't names (such as “Sandwich” or “The Quick Brown Fox”) will receive a high bogus score.

GarbageScore

Random key strokes or inputs that are not valid words will receive a high Garbage score. This would correspond to input like “asdfg” or any other series of random letters, keystrokes and input that doesn’t make a whole lot of sense as a name.

DictionaryScores

Finally, we provide scores that indicate the likelihood that the input text is a dictionary word. These tend to have less weight on the overall score, as there are quite a few legitimate dictionary terms that can be considered last names. For example, the name “Park” is a relatively common last name, so it will receive a lower dictionary score of 1, while a word like “Fluorescent” would receive a high dictionary score because it is rarely used as a name.
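Putting the scores together, a client application might triage names like this. The thresholds below are illustrative assumptions, not service recommendations – tune them to your own tolerance for false positives:

```python
def triage_name(scores: dict) -> str:
    """Route a name based on Name Validation 2 score outputs (0-5).
    The cutoffs used here are hypothetical examples."""
    if scores.get("OverallNameScore", 0) >= 4:
        return "accept"
    # Before rejecting outright, look at the component scores that
    # most strongly indicate bad data.
    red_flags = ("VulgarityScore", "BogusScore", "GarbageScore")
    if any(scores.get(key, 0) >= 4 for key in red_flags):
        return "reject"
    # Ambiguous scores go to a human for review.
    return "review"
```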

As with any of our services, there can always be specific use cases that may require some more information about how our services work. Service Objects has a team of customer focused people standing by to help you get the validated data you need. If you have any questions about our services, don’t hesitate to reach out to us – we would love to help you get the validated data you need!

Help Santa Check It Twice: A Holiday Addressing Gift for You!

The holidays are fast approaching. Soon you’ll be celebrating the season and sending holiday gift baskets and cards to people you have enjoyed working with this year. So here at Service Objects, we’ve teamed up with none other than Santa Claus himself, with a great gift for you! A free web-based portal where Santa will help you verify addresses online, powered by our Address Validation capabilities.

It’s ready to use right now.

If you have never used online address validation before – or even if you have, and want a quick, fun way to check a few addresses – Santa is here to help. Take a look:

Use this form to give him a delivery address – anywhere in the world where reindeer fly, business or personal – and then he and his helpers will be right back with one of the following results:

Finally, a little bit of fine print. You will be allowed to look up a maximum of 10 addresses using this tool. This screen will allow you to look up one address at a time, including business names where needed, but bear in mind that we offer convenient API and list-processing versions of these tools as well. If you need to look up more addresses, no worries – a convenient link will take you to more information about our full-featured capabilities, as well as additional information about our phone and email validation capabilities.

We’re hoping that once you get a taste of some holiday address verification – and find out how simple it is to implement for your business – you’ll want to have these capabilities for yourself, all year round. (In fact, Santa confided to us that he and Mrs. Claus will keep using Service Objects tools to improve his own delivery accuracy every Christmas from here on, because sometimes even reindeer are no match for automated shipping.) Want to learn more? Talk to our friendly technical experts, and we’ll make it a happy holiday season for you too!

Tuning your Implementation: What Impacts Speed and Throughput

How fast is fast? How fast should things be? And more importantly, what can you control to make the response time of using our services as fast as possible?

In this blog we will break down the key components of response time, to shed some light on what you can optimize to make your implementation respond faster. Whether it is a call to a Service Objects web service or an API integration in general, there are specific phases that take place: input data preparation and transmission, data processing, and results transmission and processing. For example, a simplified view of a single data validation request is as follows:

  • Your data is securely sent to the Service Objects web server to be processed
  • The data is received by our server and processed/validated
  • The validated data is then securely returned to you

Most people think of response time as being the round-trip time to get your data validated – and this is indeed the primary concern – but total throughput should be addressed as well. And if you are able to get a single validation done as optimally as possible, expanding that workflow to handle simultaneous requests shouldn't take too much additional modification.

To understand what is going on under the hood – and in particular, understand what factors may be slowing down your speed – let’s compare this process to a trip to the grocery store. It has similar phases to a data validation process: you leave your house, travel down the road, stop at the market, purchase your groceries, and then drive home. Here is how this process breaks down:

Step 1: Data preparation. This is analogous to the steps you take before leaving your home. Did you turn off the lights? Are all of the doors shut? Is the security system armed? Do you have everything on your grocery list?

Each of these steps is similar to the checks that your application goes through in order to leave your company’s internal network. The application has to gather the information to be processed, dot all of the i’s, cross all the t’s, make sure it knows where it is going, and make it through any layers of security your company has in place. Each layer takes time to complete and detracts from your overall speed.

Step 2. Input data transmission. This is like traveling down the road to the supermarket. How much traffic people encounter depends on factors such as how many lanes the road has, how many people are in each car, and how many cars are on the road. From an engineer’s perspective, this is like the concept of single threading and multithreading: if the road that you are traveling down has multiple lanes, the number of cars (e.g. API requests) can be larger before running into traffic.

By creating a multithreaded application, you can effectively widen the road and allow for more throughput. Your choices include single car + single lane (single value request), single car with multiple people in it + single lane (smartly using one validation client to do multiple requests), multiple cars with single passengers on multiple lanes (semi-smart multithreaded validations), and multiple cars with multiple passengers on multiple lanes (really smart multithreaded apps).
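The "multiple cars on multiple lanes" idea can be sketched with a thread pool. Here `validate` is a placeholder for whatever call your integration makes to the service; the worker count is an illustrative starting point, not a tuned value:

```python
from concurrent.futures import ThreadPoolExecutor

def validate(record: str) -> str:
    # Placeholder for a real API call to a validation service.
    return record.upper()

def validate_all(records, max_workers: int = 8):
    """Issue validation requests concurrently. Each worker thread is a
    "lane"; requests queue up when all lanes are busy, just like cars
    waiting in traffic."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order in the results.
        return list(pool.map(validate, records))
```

Because these requests are I/O-bound (waiting on the network), threads are usually sufficient; you only need heavier machinery like multiple processes or machines at much larger scales.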

Step 3. Data processing. Once you reach the store and make your purchases, it is time to pick a checkout aisle. The number of aisles that are open act similarly to the lanes on the road. More aisles allow a larger number of people to be processed before a queue builds up.

This part of the process is handled by Service Objects. We have spent the last 15+ years not only making sure to stock our shelves with the best quality products, but also ensuring that there are enough checkout aisles to meet the needs of our customers. In our case, we have multiple data centers (supermarkets) with load balanced web servers (smartly managed checkout aisles) to keep all of our transactions moving through smoothly.

Step 4. Results data transmission. The return trip from the supermarket is very much the same as the trip there, but in reverse order. Similarly, entering your house with the newly purchased items (validated data) is the same in the opposite direction. This stage also contributes to the round-trip time required to process a request.

Step 5. Unpacking the results. When you return from the store, you still have to unpack your groceries. Likewise, once you get the data back from the service, you still need code and business logic to unpack the results and store them in a form that meets your needs. (Or, to extend our analogy further, create a delicious meal from the quality ingredients that Service Objects supplies you!)

So total processing time is a careful balance of getting through your front door, navigating down the single or multi lane highway, checking out with your groceries, and then making the return trip home. From an API-call perspective, improvements to your speed start with your network, smart choices when writing your integration code, and juggling the requirements put in place by your business logic. And if you are looking to increase your throughput, a multithreaded application or distributed system will often increase your capacity.

Finally, one more analogy (we promise!). Good grocery stores have staff who can help you make the most of your shopping trip. Likewise, Service Objects has an industry-leading team of technical support experts who know their stuff, and are always happy to help you make the most of your implementation. How can we help you? Contact us any time to discuss your own specific needs.

Cyber Monday is Coming. Is Your Business Ready?

In 2017, Cyber Monday sales reached an all-time high – and trends show that we may see another record-breaking year. Service Objects broke its own record last Cyber Monday with the most transactions in a single day. Why were our data validation tools so in-demand? Because excited customers rushing to score online deals make lots of data entry errors. Capturing authentic contact data helps businesses avoid mistakes in the ordering and shipping processes and prepares them for future opportunities, like marketing campaigns and additional sales.

Data validation services enhance data quality in real-time by identifying and correcting inaccuracies. For example, order validation not only verifies that an order is legitimate, it also corrects and appends contact data like name, address, email, and phone number using up-to-date, proprietary databases. Cross-referencing IP, address, phone, email, and credit card information helps every aspect of your business – from making ordering and shipping efficient to helping flag identity fraud and verifying email addresses for future communications.

Data Quality at Point of Sale

Validating an order at point of sale helps smooth out transactions for customers by suggesting more accurate addresses and updating typos. Adobe Insights reported that Cyber Monday sales grew 16.8% from 2016 to 2017, reaching $6.59 billion, $2 billion of which was completed on a mobile device. Because we make five times more mistakes on mobile than on desktop, fat-fingered typos and autocorrect issues are becoming more prevalent.

Order Validation can also help prevent fraud in real-time by verifying that customers are legitimate through cross-checks of contact data, IP address, and credit card information. These verifications can flag suspicious activity related to identity theft and high-risk prepaid cards, which helps avoid related chargebacks. Fraud hurts businesses through lost product, money, and hours managing the fallout – the best way to avoid those costs is through preventative measures, like validating orders before shipping.

Data Quality and Order Fulfillment

With last year’s record sales came unprecedented shipping demand, and shippers like UPS struggled to meet delivery expectations all over the country. Customers anxiously awaiting their packages took to Facebook and Twitter to air their grievances, but while UPS was the bottleneck, many angry tweets were directed at vendors.

Given the rising trend in Cyber Monday sales over the years, it’s likely this year will bring even more orders, shipments, and delivery-related problems. Using a CASS certified address validation service, like the one incorporated in Service Objects’ DOTS Order Validation API, can help ensure that your shipping addresses are correct and deliverable. The service can be implemented to help customers self-correct inaccurate information before submitting their order, or can be used post-transaction to ensure accuracy by finding issues and suggesting corrections before shipping.

Customer Service Benefits from High Quality Data

The holidays are a stressful time, and shoppers have hard deadlines when ordering gifts in November and December. According to the National Retail Federation, 38% of consumers expect free two-day delivery when making online purchases. Address verification helps meet these expectations, cutting down on service inquiries for delayed packages. Order Validation also validates email addresses and phone numbers, ensuring notifications reach shoppers and giving your customer service representatives everything they need to communicate effectively.

Precise contact data saves your customer service team time troubleshooting and appeasing upset callers, strengthens your relationship to promote repeat business, and helps you manage your reputation. And, in the off-chance that something does go wrong, your team will have the most up-to-date order information to handle the call and assure your customers that you care.

High Risk Days Require High Quality Data

Data quality plays an important role in managing the risks of high-volume transaction days like Cyber Monday. The best way to ensure contact data doesn’t get in the way of your biggest sales day is by validating and verifying transactions with a service like Order Validation. You can even try it out today with a free trial key.

Contact Data Spam: A Lesson from Google Maps

We have spoken often on these pages about the importance of validating your contact data, to make sure you have a valid address and a quality lead. Whether it is a mistyped ZIP code, a lead pretending to be Donald Duck to fake out your marketing team, or a phony email address used to commit fraud, problems can and do occur. It takes planning to stay one step ahead of the bad guys or the bad data.

Which is why we were fascinated to hear about a new cottage industry that has sprung up in recent years: fake listings on Google Maps. By cataloging the streets and business listings of much of the planet, Google Maps has often become a go-to resource for finding a business. Unfortunately, this has also made this platform a tempting target for shady operators and unfair competitors.

A few years ago, some enthusiasts succeeded in pranking Google Maps with obviously fake business listings, just to show that they could do it. One hacker even managed to plant fake contact information for the FBI and the Secret Service, forwarding callers to the actual agencies while surreptitiously recording the calls. In cases like these, the goal was to try and get Google’s attention about flaws in their system and verification procedures.

Unfortunately, fake listings have also been exploited by people with darker motives than showing off their hacking talents. Here are some examples:

Contractor fraud: Some types of businesses, such as locksmiths or plumbers, are ripe for shady contractors who come to your home and then charge exorbitant prices. By placing a listing in your neighborhood using a phony address, they are able to swoop down from anywhere on unsuspecting homeowners. According to Google, this represents about 40% of their fake listings.

Fake reviews: In this case, real businesses have shadowy people post phony reviews to disparage their competitors or build up their own business – or phony businesses run by fraudsters use fake reviews to give themselves an air of legitimacy. Despite volunteer fraud-hunters and the threat of FTC fines, a listing on Google Maps may not accurately reflect a business’s true ratings.

Squatter’s rights: Here a scammer claims a listing for an actual business such as a restaurant, often pocketing online referral fees for customers who actually found this business via organic search. Google notes that 1 out of 10 of its fake Google Maps listings falls under this category.

To be fair, Google has made attempts to keep on top of this problem. In a 2017 report on one of their blogs, they note that they have tightened up their procedures for verifying new listings, and now claim to detect and disable 85% of fraudulent ones before they are posted – resulting in a 70% reduction in such listings from their peak in 2015. However, while pointing out that less than 0.5% of searches today are fraudulent, they acknowledge that they still aren’t perfect.

The lesson here? As former US President Ronald Reagan used to say at the height of the Cold War, “Trust but verify.” To which we would add, keep your data quality practices up-to-date with your own contact data assets. Good luck, and be careful out there!

Lead Validation International: Best Practices

DOTS Lead Validation – International has been available for almost a year, and we have received great feedback from our customers on how they are using it. Using this feedback, we have compiled some general best practices to help you get the most from the service and learn how it helps your business.
There are two main uses for this service, prioritizing leads and regulatory compliance.

Lead prioritization

When your business generates hundreds to thousands of leads daily, it is best to prioritize them based on their quality. One of the simplest ways to determine a lead's value is using two outputs from Lead Validation – International: OverallCertainty and OverallQuality. OverallCertainty is a value that comes back in the range of 0-100 and represents how likely the prospect could be contacted with the information they provided. The OverallQuality output shows whether a lead should be rejected, reviewed or accepted.

Each main component of a lead (name, address, email, phone, IP address, and business) is also scored this way. For example, the address component also has certainty and quality scores directly associated with it, AddressCertainty and AddressQuality. The purpose of these individual component values is to allow you to see how the components’ scores break down and make even more informed business decisions.
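A simple prioritization pass using these two outputs might look like the following. The quality labels and the certainty cutoff below are illustrative assumptions – calibrate them against the actual values your account returns:

```python
def route_lead(overall_quality: str, overall_certainty: int) -> str:
    """Prioritize a lead from its OverallQuality label and its
    OverallCertainty score (0-100). The 70 cutoff is hypothetical."""
    if overall_quality == "Reject":
        return "discard"
    if overall_quality == "Review" or overall_certainty < 70:
        # Either the service flagged it, or contactability is shaky:
        # send it to a human before spending sales time on it.
        return "manual-review"
    return "fast-track"  # accepted lead with high certainty
```

Component-level scores such as AddressCertainty and AddressQuality can be layered into the same logic when you need finer-grained routing.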

GDPR compliance

The second major use we have seen for the Lead Validation – International service is determining if any component of your lead is from a country that falls under the General Data Protection Regulation (GDPR). We have made this simple to identify by providing an output, IsInGDPR, which indicates whether your lead is covered by the GDPR. Our customers are using this to ensure they stay in compliance with the regulation and avoid its hefty fines.
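A minimal sketch of acting on this flag might look like the following. IsInGDPR is the output named above; the lead dictionaries and the string form of the flag’s value are assumptions for illustration:

```python
# Sketch: route leads whose IsInGDPR flag is set into a consent workflow.
# IsInGDPR is the service's output name; the value format is assumed here.

def needs_gdpr_handling(lead):
    # Treat any truthy representation ("true", "1", True) as in GDPR scope.
    return str(lead.get("IsInGDPR", "false")).lower() in ("true", "1")

leads = [
    {"email": "a@example.com", "IsInGDPR": "true"},
    {"email": "b@example.com", "IsInGDPR": "false"},
]
gdpr_queue = [lead for lead in leads if needs_gdpr_handling(lead)]
```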

Now that we’ve outlined its main uses, let’s focus on the three most important parts of the service – Inputs, Test Types, and Outputs – and how each can be used.

Inputs

The more inputs you provide, the better the results will be. The service heavily cross-references the individual inputs for each component, which means the more data points you share with the service, the better we can analyze the data.

Some organizations simply do not collect all the data points, or the data they buy doesn’t include them. For these reasons, it is very common to supply the service with a reduced set of inputs. But Lead Validation – International goes a step further and makes adjustments along the way to help maximize results when not all data points are available. If you are missing an IP address, company name, or another data point, test types have you covered.

Test types

To avoid penalizing lead scores for lack of data, TestType is a required field that acts as a directive to the service, adjusting the algorithm itself to work with the data available. Using a test type is not only required, it is just as important a consideration as the other input data. For example, attempting to validate business leads without using TestType=business will skew the results, leaving you scratching your head at the end of the day. Best practice is to match the test type to your available inputs:

Standard test types
  • normal1p/normal2p – Incorporates all the main components except the business component. The only difference between the two types is that normal2p allows for a second input phone number, where normal1p is limited to one.
  • noip – Same as normal1p, but does not incorporate IP address input in the processing.
  • nap – Simple but common test type that looks at the name, address and phone components; a second phone number is optional.
Business test types

These test types are designed for business-to-business leads. A business name is not required, but providing one returns better scores.

  • business – Like normal2p, but adds the business component.
  • business-noip – Like the business test type except it does not utilize the IP address component. While designed for data with a missing IP address input, this is NOT one of the more recommended operations. Having the IP address as an input for business to business leads provides strong links to connect to other data points and provides some useful flags for fraud.
  • business-n-e-p – Checks name, email, and phone components.
Custom test types

Custom test types can be created for specific needs. In some instances, you may have a component that you don’t have much confidence in and want the system to be less strict in analyzing. Conversely, some organizations may have fields that are so critical that they want to scrutinize specific components over others. Most organizations fit into one of our predefined test types, but customizations are available to ensure unique business needs are met to maximize the results from our service.

Multiple test types

Some companies use multiple test types. It is less common to see multiple test types in the same process, because if a field is missing you likely want that lead to be penalized in OverallCertainty. However, you may have multiple processes fed by leads from several departments and various sources, so ideally you will match the test type to the process and available inputs.
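The practice of matching a test type to available inputs can be sketched as a small helper. The test type names come from the lists above; the selection rules are a simplified illustration (omitting nap and business-n-e-p), and the developer guide remains the authority on which type fits which data:

```python
# Sketch: pick a TestType from the fields a given lead process collects.
# Test type names are from the service; this simplified mapping is
# illustrative only and omits nap and business-n-e-p.

def select_test_type(has_business, has_ip, phone_count):
    if has_business:
        # business leads: prefer the IP-aware type when an IP is available
        return "business" if has_ip else "business-noip"
    if not has_ip:
        return "noip"
    return "normal2p" if phone_count > 1 else "normal1p"

# Example: a consumer lead form that collects two phone numbers and an IP
chosen = select_test_type(has_business=False, has_ip=True, phone_count=2)
```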

Outputs

Lastly, you will want to pay attention to the OverallCertainty and OverallQuality fields when prioritizing your results. Simply put, the higher the certainty, the better the lead. There are several other factors to consider when thinking about prioritization: for instance, cost of leads, sales team bandwidth, or automated CRM lead scoring could all affect priority outside of validation. Your organization will weigh these considerations before making any final decisions.

The Notes field is helpful for tying everything together, and will help you understand how the return was generated. The service outputs general notes about the validation, such as IsNamePhoneMatch or IsPhoneAddressMatch, as well as notes about each individual component, like IsBadStreet for address or IsPublicProxy for IP address. Each component that can be associated with a country can also affect the IsInGDPR output, which indicates whether the lead or one of its components falls under the GDPR.
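As a sketch, screening the Notes output for red flags might look like this. The note names (IsBadStreet, IsPublicProxy, and so on) are from the description above; the comma-separated format and the choice of which notes count as red flags are illustrative assumptions:

```python
# Sketch: scan the Notes output for red flags before accepting a lead.
# Note names come from the service; the comma-separated string format
# and the RED_FLAGS set are assumptions for this illustration.

RED_FLAGS = {"IsBadStreet", "IsPublicProxy"}

def flagged_notes(notes_field):
    """Return the subset of notes that should trigger a manual review."""
    notes = {n.strip() for n in notes_field.split(",") if n.strip()}
    return notes & RED_FLAGS

result = flagged_notes("IsNamePhoneMatch,IsPublicProxy")
```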

In closing, it is worth reiterating that the quality of the results from Lead Validation – International is predicated on the number of inputs and the use of correct test types. This service helps you prioritize your leads, identify the weak and strong points in your data, and stay in compliance when it comes to GDPR. If you’re working with international leads, reach out to our team to learn more about how our validation service can help your business.

Garbage In, Garbage Out – How Bad Data Hurts Your Business

The old saying “garbage in, garbage out” has been around since the early days of modern computing. Shorthand for operator error or bad data, the adage implies that the output of a program is only as good as the input supplied by the user. With more data being collected, stored, and used than ever before, data quality at the point of entry should be a top priority for all organizations.

Now that data is informing more aspects of our businesses, it’s not difficult to imagine a future where data accuracy is vital. Think of delivery drones, which have been tested all over the US and UK in recent years. If a contact’s bad address information goes unchecked, it could feed a drone the wrong coordinates resulting in misdeliveries and lost products.

Data quality affects every aspect of your business, from sales and development to marketing and customer care. Yet a 2017 survey by Harvard Business Review found that “47% of newly-created data records have at least one critical (e.g., work-impacting) error.” So, what constitutes garbage input? It depends on the data and how it enters your system.

What Is Bad Data?

Of course, there are many kinds of data an organization may choose to collect, but here we will focus on one of the most critical – contact data. This includes a contact’s name, address, phone number and email, all of which are crucial to marketing, sales, fulfillment, and service.

Some common issues that make a contact record bad:

  • Inaccuracies like bad abbreviations or missing zip code
  • Typos caused by speed or carelessness
  • Fraudulent information
  • Moving data from one platform to another without appropriate mapping
  • Data decay as contacts move, get new phone numbers, and change positions

So, how does a bad contact record make it into your database?

Contacts entering their data online, whether downloading a whitepaper or ordering a product, are usually the first to commit a data quality error. Filling out an inquiry form on a mobile device, rushing through a purchase, or providing inexact information (such as missing “West” before a street name) are all examples of bad data leading to inaccurate contact records.

Your sales and customer service teams can compound poor data quality by manually updating information that looks incorrect and making mistakes in the process. Good business practices can help mitigate operator error, but if a record was poor to begin with, that likely won’t matter – you already know what happens to garbage when it ages.

What Is the Cost of Bad Data?

Much like garbage, bad data only gets worse over time.

Poor data quality can cause an organization’s sales team to waste time and effort chasing bad leads. According to a 2018 study by SiriusDecisions, between 10% and 25% of contact records in B2B databases contain critical data errors. That means up to 1 in 4 leads could have bad phone or email information attached, so follow-up communications may never reach the intended contact.

The customer service department loses time and money first in dealing with unhappy customers – even if they provided poor contact information, they’ll still blame your business for a package that never arrives. Additional time is spent troubleshooting problems and clarifying bad information, leading to major inefficiencies and frustration.

Overall costs to businesses include reputation-related losses that occur when upset customers take to the internet to air their grievances. Time is lost to the hidden data factories that arise within an organization when individuals working with bad data take it upon themselves to make “corrections” without understanding why or how the data is incorrect. Lastly, poor data robs a business of the ability to take full advantage of business tools like marketing, sales automation systems, and CRM.

How Can My Business Fix Bad Data?

Tightening up business policies around collecting and managing data is a start, but implementing data validation services will help ensure your data is as genuine, accurate, and up-to-date as possible and keep contacts current through frequent updates. Contact data validation can be integrated in a number of ways to best meet the needs of individual organizations, including:

  • Real-time RESTful API – cleanse, validate and enhance contact data by integrating our data quality APIs into your custom application.
  • Cloud Connectors – connect with major marketing, sales, and ecommerce platforms like Marketo, Salesforce, and Magento to help you gather and maintain accurate records.
  • List Processing – securely cleanse lists for use in marketing and sales to help mitigate data decay.
  • Quick Lookups – spot-check a verbal order or cleanse a small batch.
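For the real-time API route, a request is essentially a URL carrying your license key and contact fields as parameters. The endpoint URL and parameter names below are placeholders, not the service’s actual ones; the developer guide for each service defines the real endpoint and schema:

```python
# Sketch: build a real-time validation request URL. The endpoint and
# parameter names are placeholders - substitute the ones documented
# for the service you license.
from urllib.parse import urlencode

def build_request(base_url, license_key, **fields):
    """Assemble a GET request URL with the license key and contact fields."""
    params = {"LicenseKey": license_key, **fields}
    return f"{base_url}?{urlencode(params)}"

url = build_request(
    "https://example.com/validate",  # placeholder endpoint
    "YOUR-KEY",
    Address="27 E Cota St", City="Santa Barbara", State="CA",
)
```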

Service Objects’ validation services correct contact data – including name, address, phone number, and email – and cross-check it against hundreds of databases to avoid garbage input. The result: cleaner transactions and more efficient processes across all aspects of your business.

Contact our team to determine which of our services can help you collect and maintain the highest quality data and kick the garbage to the curb.

Customer Service Week: More Than a Week to Us

This week, October 1-5, was Customer Service Week: a nationally-recognized event first proclaimed by the US Congress in 1992. It was designed to recognize the work of customer service professionals in the United States – over 2.5 million nowadays, according to the Bureau of Labor Statistics – and educate people about the importance of customer service in business.

This is always one week where you hear a lot of people talking about customer service. Let’s be honest, if you were to ask any business leader whether customer service was important to them, they would all reply “of course.” But to us, good service is a little like a good athletic performance: it isn’t just an attitude you can summon on command, but rather the end product of having the right culture and practices every day.

As part of our Customer Service Week, our own team made a list of traits that define our approach to customer service, using the letters of S-E-R-V-I-C-E O-B-J-E-C-T-S as a guide:

This is actually a pretty good summary of who we are. Let’s look at how these terms break down in terms of our approach to our customers:

Data Ninjas

This was our favorite. We’re good at what we do, and we take lots of pride in our expertise. We aren’t the only company in this space, but we’ve provided enterprise-level data quality solutions for over 15 years – and everywhere from our development team to our 24/7/365 technical support, people have a healthy “ninja” mentality about being experts and continually learning.

Customer, Friendly, Exceptional

Let’s face it, customer service tends to have this smiling-person-with-headset stereotype. But if you’ve worked with us, you’ve probably noticed: we really are pretty friendly, with customers and each other. Whether it is our knowledgeable, low-pressure approach to sales, our technical professionals, or even (in all immodesty) our marketing team, you can tell that we like each other – and like working with you, too. This starts with being a cool place to work, and also springs from supporting people to do the right thing with our customers.

Accurate, Precise, Exact

Every great company has a fanaticism about something. With us – being in the data quality business – it is accurate results. People depend on us to provide accurate leads, contact data, tax rates, and a host of other real-time, mission critical information. So much like the bakery that goes the extra mile to make the perfect croissant, getting it right every time is our particular fanaticism.

Innovative, Creative, Advanced, Insightful

This industry doesn’t stand still, and we have a lot of fun staying ahead of the curve with new tools and capabilities. This year alone, for example, we have rolled out everything from API and service enhancements to our bundled Address Insight capabilities, as well as educational white papers and articles ranging from GDPR compliance to email marketing.

Effective, Authoritative, Global

This is the part of our service reputation we eventually grew into. When our CEO Geoff Grow first started this company in 2001, to correct contact addresses and reduce the waste stream of direct mail, few people envisioned Service Objects as the global company we are now. Today we serve over 2500 customers – including major firms like Amazon, Microsoft, Verizon and American Express – and proudly wear the mantle of an industry leader.

These are some of the reasons that every week is Customer Service Week for us, and why we have built so many long-term partnerships with customers. Last but not least, let us know what we can do to serve you too!

The Role Data Quality Plays in Master Data Management

Enterprises are dealing with more data than ever before, which means that proper information architecture and storage is crucial. What’s more, the quality of the data you store affects your business more than ever nowadays. This goes double for your contact data records, because you need the most accurate and up-to-date lead and customer data to ensure that marketing, sales, and fulfillment all run smoothly.

Master Data Management, or MDM, involves creating one single master reference source for critical business information such as contact data. The goal of MDM is to reduce redundancies and errors by centrally storing records, as well as increasing access. No matter how well organized your database might be, the quality of its data will ultimately determine the effectiveness of your MDM efforts.

Why is master data management important?

Organizations look to MDM to improve the quality of their key data, and ensure that it is uniform and current. By bringing all data together in a hub, a company can create consistency in record formatting and ensure that updates are available company-wide.

MDM sounds simple in theory, but think of how (and how much) information is created and collected within an entire organization. Let’s take a single customer contact as an example. Over the course of a month, John Smith provides information to your company in three different instances:

  1. First, he is an existing customer of your services division, and his data exists in their customer records.
  2. Second, he visits your website and downloads a whitepaper, getting added to your marketing department’s database of leads.
  3. Third, he calls the sales team for one of your specialty products, and gets added to their database of contacts.

In a typical organization with departmental “silos,” John Smith is now part of as many as three separate databases in your company, none of which have a longitudinal view of his relationship with you as both a customer and a lead. Worse, slight differences in contact records could even turn John into three separate people in a master database. If each of these touch points get assigned to different automation drips and your sales reps are calling and emailing, you have a real disconnect in your efforts and a high likelihood of spamming your contact.

Having good data collection and storage practices is the first step to a good MDM program, but that alone may not solve John Smith’s problem. Ensuring the data he provides is correct, current and most importantly consistent at the point of entry is what guarantees that the best quality contact information is stored as master data. Implementing a lead validation service could help solve this issue at the point of entry by cross-validating each contact record with multiple databases in real time, facilitating accurate merge/purge operations later.

Lead validation not only corrects and appends contact data, it can also feed into your CRM and automation tools to help you further qualify your leads, so your sales team is only dealing with the highest quality information and pursuing genuine prospects. Once your prospects become customers, their contact data records will be passed on to other departments to complete any transactions – and through your MDM they will also have the most up to date and accurate information to conduct their business.

The costs of poor data quality

In a 2018 study conducted by Vanson Bourne, 74% of respondents agreed that while their organizations have more data than ever, they are struggling to generate useful insights. Some say this is because they are not effectively sharing data between departments, but only 29% of respondents say they have complete trust in the data their organization keeps.

Every aspect of your organization can be impacted by poor data quality, especially if your contact data is lacking. If 71% of your sales team feel that their lead information is bad, how effectively can they do their jobs?

Here are just a few of the ways that bad data can hurt your business:

  • Marketing – offering new-subscriber discounts to existing customers, spending extra money creating mailers for addresses that don’t exist
  • Sales – Multiple sales reps calling the same lead, the same sales rep calling the same lead multiple times, wasting time on bogus leads
  • Order fulfillment – bad or incomplete address data causing failed deliveries and chargebacks
  • Accounts receivable – sending invoices to incorrect addresses or old business contacts, billing addresses not matching credit card records
  • Business development – making bad decisions for growth in a particular market based on inaccurate data

Each of these issues could be solved by creating a master data management system and making sure it is only fed validated contact data from the point of entry. Further, frequently updating the MDM system by validating name, address, phone, email, IP, and other key pieces of information ensures that each data record is as current as possible.

Finally, one major cost of poor data quality – and a key reason for the growth of MDM – is the compliance risk as new data privacy and security regulations have emerged in recent years. For example, the European Union’s recent General Data Protection Regulation (GDPR) and the US Telephone Consumer Protection Act (TCPA) both have severe penalties for unwanted marketing contacts, even when such contact is inadvertently due to data quality issues like bad contact data or changed phone numbers. Accurate, centralized data is now an important part of compliance strategies for any organization.

Trends in master data management and data quality

The competitive advantages of consistent centralized data – together with the advent of cloud-based tools for data quality – have made MDM a growing trend for organizations of all sizes. Moreover, David Jones of the Forbes Technology Council predicts that increasing regulation and the rise of cybersecurity risks are converging to change the way data is managed at the business level. We now live in a world where things like validating contact data, rating lead quality, verifying accurate and current phone or email data, and other data quality tools have now become a new standard for customer-facing businesses.

Today, adding a contact data validation service to your processes before storing master data will improve overall data quality and increase efficiencies among all of your teams. Service Objects offers 24 contact data validation APIs that can stop bad data from undermining your master data management system. Want to learn more? Contact our team to see which API is best for your business type.

Why Google Maps Isn’t Perfect

Google Maps is an amazing service. Much of the civilized world has now been mapped through its data sources, ranging from satellite data to its ubiquitous camera-mounted vehicles. The result is a tool that allows you to find a location, link to local businesses, or virtually drive anywhere from downtown Paris to rural Mexico.

However, if you use Google Maps to validate addresses in a business, it is a little like trying to find a lifelong mate for your grandmother on Tinder: it is possible, but with a tool that wasn’t necessarily designed for that purpose. So let’s look at the differences between this service versus professional address validation and geolocation tools.

Google Maps versus address validation

Let’s start with the most important difference: Google Maps is very complete, but sometimes wrong. How wrong? Mistakes can range from bad directions, wrong street names, and bad addresses to wrong country borders, omitting large cities and everything in between. Once, in a mistake Google acknowledged, a Texas construction firm even demolished the wrong house when Google Maps sent them there.

Another difference is where the data comes from in the first place. Google Maps uses a variety of sources, including administrative boundaries, parcels, topographic features, points of interest, trails, road features, and address points or ranges. It also accepts data from “authoritative” organizations as well as individuals, subject to a vetting process. As a result, mistakes can be introduced when the data is aggregated or consolidated.

Finally, and perhaps most importantly, Google does not know exactly where every address is. When it does not have rooftop-level data to pinpoint an address, it estimates the location using techniques such as address interpolation. Sometimes an address may also be wrong because an individual claimed the location and entered the information incorrectly, or because changes such as new municipal or postcode boundaries were not updated.

What the pros do

By comparison, professional address validation and geolocation tools don’t guess at results, because their focus is more on accuracy. Tools such as Service Objects’ DOTS Address Validation and Address Geocode capabilities are focused on delivering an accurate and precise response, versus settling for “close enough.”

To get specific, if our address validation tool cannot correct and validate that an address is real, we will fail it and will not guess. By comparison, Google may just use the closest approximation, which can lead to issues. Similar rules apply to geocoding latitude and longitude coordinates from address data: where necessary, Service Objects will move down a gradient of accuracy/precision, but will still often be closer to the correct coordinates than Google.

Another key difference lies in our data sources. For example, DOTS Address Validation uses continually updated USPS, Canada Post and international data in combination with proprietary databases, to create near-perfect match accuracy. Likewise, for Address Geocoding addresses and coordinates are validated against our master database, the US Census Bureau, TIGER®/Line file, USPS® ZIP+4 tables, and other proprietary databases, ultimately yielding a 99.8% match rate accuracy when translating an address to its latitude and longitude coordinates.

Use the right tool

We like Google Maps. Without it we wouldn’t be able to easily visit major world cities online, find a good sushi bar near our hotel, or get directions to visit Aunt Mildred. But when you need professional-grade accuracy in address and location data for your business, be sure to use the right tools. Need more specifics? Contact us for a no-pressure consultation, and our team will be happy to explore your specific needs.

Why Data Quality Isn’t a Do-It-Yourself Project

Here is a question that potential customers often ask: is it easier or more cost-effective to use data validation services such as ours, versus building and maintaining in-house capabilities?

We are a little biased, of course – but with good reason. Let’s look at why our services are a lot easier to use versus in-house coding, and save you money in the process.

Collecting and using information is a crucial part of most business workflows. This could be as simple as taking a name for an order of a burger and fries, or as complex as gathering every available data point from a client. Either way, companies often find themselves with databases full of client info. This data is a valuable business asset, and at the same time often the subject of controversy, misuse, or even data privacy laws.

The collection and use of client information carries an important responsibility when it comes to security, utility, and accuracy. If you find yourself tasked with keeping your data genuine, accurate, and up-to-date, let’s compare the resources you would need for creating your own data validation solution versus using a quality third-party service.

Using your resources

First, let’s look at some of the resources you would need for your own solution:

Human. The development, testing, deployment, and maintenance of an in-house solution requires multiple teams to do properly, including software engineers, quality assurance engineers, database engineers, and system administrators. As the number of data points you collect increases, so does the complexity of these human resources.

Financial. In addition to the inherent cost of employing a team of data validation experts, there is the cost of acquiring access to verified and continually updated lists of data (address, phone, name, etc.) that you can cross-validate your data against.

Proficiency. Consider how long it would take to develop necessary expertise in the data validation field. (At Service Objects, for example, our engineers have spent 15+ years increasing their proficiency in this space, and there is still more to learn!) There is a constant need to gain more knowledge and translate that into new and improved data validation services.

Security. Keeping your sensitive data secure is important, and often requires the services of a team. An even worse cost can be failing to take the necessary steps to ensure privacy, which can even lead to legal troubles. Some environments need as much as bank grade security to protect their information.

Using our resources

The points above are meant to show that any data validation solution, even an in-house one, carries costs with it. Now, let’s look at what you get in return for the relatively small cost of a third-party solution such as ours for data validation:

Done-for-you capabilities. When you use services such as our flagship Address Validation, Lead Validation, geocoding, or others, you leverage all the capabilities we have in place: CASS Certified™ validation against continually updated USPS data, lead quality scores computed from over 100 data points, cross-correlation against geographic or demographic data, global reach, and much, much more.

Easy interfacing. We have multiple levels of interfacing, ranging from easy to really easy. For small or well-defined jobs, our batch list processing capabilities can clean or process your data with no programming required. Or integrate our capabilities directly into your platform using enterprise-grade RESTful API calls, or cloud connectors interfacing to popular marketing, sales and e-commerce platforms. You can also spot-check specific data online – and sample it right now if you wish, with no registration required!

Support and documentation. Our experts are at your service anytime, including 24/7/365 access in an emergency. And we combine this with extensive developer guides and documentation. We are proud of our support, and equally proud of all the customers who never even need to contact us.

Quality. Our services come with a 99.999% uptime guarantee – if you’re counting, that means less than five and a half minutes of downtime per year. We employ multiple failover servers to make sure we are here when you need us.

We aren’t trying to say that creating your own in-house data validation solution can’t be done. But for most people, it is a huge job that comes at the expense of multiple company resources. This is where we come in at Service Objects, for over 2500 companies – including some of the nation’s biggest brands, like Amazon, American Express, and Microsoft.

The combination of smart data collection/storage choices on your end and the expert knowledge we’ve gained over 15 years in the data validation space can help to ensure your data is accurate, genuine and up-to-date. Be sure to look at the real costs of ensuring your data quality, talk with us, and then leave the fuss of maintaining software and security updates and researching and developing new data validation techniques to us.

More Than an Address: What is a Delivery Point?

Most people think that they mail or ship things to addresses – and they would be wrong. And the reasons for this might be very important to your bottom line.

First, let’s look at one actual address here in our native Santa Barbara, California: 1540 N. Ontare Road.


This address is quite real. (In fact, its property is currently for sale on Realtor.com.) But we wouldn’t recommend shipping a package there – at least not yet – because at the moment it is a vacant 20-acre lot.

Now, let’s look at another address: 350 Fifth Avenue, New York, NY:


This is also a valid address: it is the famous Empire State Building, one of the tallest buildings in the United States. We wouldn’t recommend using this address by itself for shipping a package either, because without more detail such as a suite number, there is no way of knowing which of its more than 1000 businesses serves as the destination. (In fact, the address itself isn’t even that important here: this building is large enough to have its own ZIP code, 10118.)

Understanding delivery points

These are both examples of the differences between an address and a delivery point. Addresses simply describe the location of a piece of geography, while delivery points are the lifeblood of physical shipments: they are approved unique locations served by delivery services such as the U.S. Postal Service. Many people think they are shipping to addresses, but they are actually shipping to delivery points.

This underscores the importance of delivery point validation, whether you are doing a direct mail marketing campaign or shipping products to customers. There are several possible points of failure where a delivery point may be invalid or undeliverable:

  • The physical address may be incorrect
  • The physical address may be correct, but undeliverable (such as our vacant lot example above)
  • The physical address alone may be insufficient, such as a multi-tenant building
  • Additional delivery point information may be incorrect or invalid: for example, a fourth-floor suite in a three-story building, or a nonexistent suite number
  • The delivery point information may be completely correct, but correspond to the wrong recipient

So from here, your new mantra should be: is it deliverable?

Address validation: the key to accurate delivery points

This is where our flagship address validation tools come in. Available for US, Canadian and international markets, these services provide real-time verification of deliverability – including flagging of vacancy, returned mail, and general delivery addresses – to ensure accurate contact data at the time of data entry or use.

These tools instantly verify, correct and append delivery addresses, using APIs that integrate with your CRM or marketing automation platforms, cloud connectors, or convenient batch services for cleaning your databases without the need for programming. Whichever approach you use, you will leverage our vast infrastructure of up-to-the-minute data from the USPS, Canada Post and other sources, along with sophisticated and accurate address verification capabilities.

Our DOTS Address Validation – US 3 service, for example, provides near-perfect match accuracy with updates that mirror the USPS, and sub-second response times that allow you to validate live customer input in real time. And our industry-leading GetBestMatches operation combines Delivery Point Validation (DPV) to verify that an address is deliverable, Residential Delivery Indicator (RDI) to identify whether it is residential or business, and SuiteLink (SLK) to append secondary suite information for businesses – all with a single API call to our USPS CASS Certified™ engine.

Want to learn more about engineering delivery point validation into your operations? Contact us for friendly, knowledgeable answers from our experienced team of data quality professionals.

Data Collection – Getting It Right

As a data validation company, we think a lot about how information is collected and stored – and we know that the right approach to data collection can ensure that you start off on the right foot. This blog shares some expert tips for channeling your data into your company’s data stores, in ways that give you the best chance to have this data corrected, standardized, and validated.

Let’s imagine a few of the most common data entry points. At the top of the list we find web forms, compiled datasets, and manual entry from locations such as Point-of-Sale (POS), phone calls, and written transcriptions. Each type of data entry is unique from an interface perspective, but there are still two key rules you can use to limit the chance for errors or garbage. Here they are:

  1. Divide and conquer

    First, break your data down into its simplest components. Collecting data in its simplest form ensures there is no confusion about what each field is supposed to represent. This is a principle we call separation of data, where each field contains only the data relevant to its type.

    In the best case, you have a collection of very specific fields with unique data types – as opposed to the worst case, where a few generic, catch-all fields collect everything. Could a user enter, say, their grandmother’s name in the ZIP code field? If so, you still have some refinement to do.

  2. Set data constraints

    Next, determine what characters are relevant to each specific field. For example, if you are collecting “Age” data, you know that the data field should be a number. It doesn’t make sense to allow anything other than the digits 0-9 (and please don’t get cute and allow words such as “thirty-five”). Extending your requirements further, you could even limit the “Age” field to positive numbers to prevent someone from claiming they are -5 years old.
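Put together, the two rules might look like the following minimal Python sketch. The field names and patterns here are illustrative, not a complete rule set:

```python
import re

# Per-field constraints: each field holds exactly one kind of data (rule 1),
# and each field accepts only characters relevant to its type (rule 2).
CONSTRAINTS = {
    "age": re.compile(r"^\d{1,3}$"),     # digits only: rejects -5 and "thirty-five"
    "zip": re.compile(r"^\d{5}$"),       # exactly five digits
    "state": re.compile(r"^[A-Z]{2}$"),  # two-letter code, ideally from a fixed list
}

def validate_field(name: str, value: str) -> bool:
    """Return True only if the value satisfies its field's constraint."""
    pattern = CONSTRAINTS.get(name)
    return bool(pattern and pattern.fullmatch(value.strip()))

print(validate_field("age", "35"))        # True
print(validate_field("age", "-5"))        # False
print(validate_field("zip", "93101"))     # True
print(validate_field("zip", "93101abc"))  # False
```

Because each field has its own constraint, the grandmother’s-name-in-the-ZIP-field problem simply cannot occur.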

Now, let’s put these two rules to work for two of the most common types of contact data: delivery addresses and email addresses.

United States Delivery Addresses

This is a case where a basic understanding of the data you are collecting will take you a long way. Here we are going to examine the anatomy of a typical US address, as shown in Wikipedia, and then split the individual components into different fields.

https://en.wikipedia.org/wiki/Address_(geography)#United_States

From this US address example, we can see the recommended USPS format. Within the address, we have the house number and street name (Address1), the town name (City), the state abbreviation (State), and the ZIP+4 (ZIP/PostalCode).

Here are some good and bad examples of address data:

Ideal Input

  • Address1: 27 E Cota St STE 500 – contains a mix of alphanumeric data pertaining to the mailbox # and street
  • Address2: c/o John Smith – non-critical deliverability information
  • City: Santa Barbara – contains only alpha characters
  • State: CA – standard 2-character representation of a state, chosen from a list of acceptable values
  • ZIP: 93101 – 5-digit number without extraneous characters

Address 2 Used Incorrectly

  • Address1: 27 E Cota St STE 500 – no issues
  • Address2: Santa Barbara, CA 93101 – city, state and ZIP are not separated

Allowing Abbreviations and Local Language

  • Address1: 27 E Cota St – no issues
  • City: SB – abbreviating the city name is not ideal; spell out the full name, or use the ZIP to determine and populate this field
  • State: Cali – informal spelling instead of California or CA; the state should be selected from a fixed data set
  • ZIP: 93101abc – should be 5 digits with no alpha characters

Email addresses

Now let’s turn to the format of an email address, again from Wikipedia:

https://en.wikipedia.org/wiki/Email_address

Emails are a bit trickier, but constraints can still be placed at the point of entry. The set of characters allowed in an email is far more expansive than, say, postal codes, but there are still limitations. By putting these restrictions in place, you can cut out some of the garbage that would otherwise be submitted.

In this case, you could examine the individual elements of an email and place restrictions on each. Since an email must consist of a local-part, an “@” symbol, and a domain, it is safe to restrict your collected data to inputs that conform to this anatomy. For example, an entry such as “123ABCSt.foobar@domain.com Thirty-Five” does not end with a legal domain identifier and contains illegal spaces.
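As a sketch, a deliberately simple structural check in Python might look like the following. Note that the real email grammar (RFC 5322) permits far more than this pattern, so treat it as an entry-time garbage filter, not a full validator:

```python
import re

# One local-part, one "@", a domain with at least one dot, and no
# whitespace anywhere. Far stricter than RFC 5322 actually allows,
# but it filters out obvious garbage at the point of entry.
EMAIL_SHAPE = re.compile(r"^[^@\s]+@[^@\s]+\.[A-Za-z]{2,}$")

def looks_like_email(value: str) -> bool:
    return EMAIL_SHAPE.fullmatch(value) is not None

print(looks_like_email("foobar@domain.com"))                       # True
print(looks_like_email("123ABCSt.foobar@domain.com Thirty-Five"))  # False: spaces
print(looks_like_email("no-at-sign.example.com"))                  # False: no "@"
```

A check like this only confirms the shape of an address; verifying that the mailbox actually exists is what a dedicated email validation service is for.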

General Contact Record Data Collection

The same techniques can be applied to all of the data that you collect. Some fields may be easier to apply constraints to than others, but any attempt to filter acceptable input and organize the data into manageable sections will benefit you down the line. We specialize in data validation and have spent over 15 years refining our methods for interpreting, standardizing, and validating data. If data is gathered in component parts, we at least gain context and, even if the data isn’t correct, can apply our specialized knowledge to try to validate it.

Finally, give some thought to how your data will ultimately be stored in a data store or database. Since this is the bottom level, it is crucial for database administrators to make educated choices about these data fields. If constraints aren’t placed at the database level, the information you have collected could be rendered useless. Smart choices in data type, accepted data length, and constraints can help ensure that your data is stored in its most sensible form.
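As an illustration of database-level constraints, here is a small sketch using Python’s built-in sqlite3 module; the table and column names are hypothetical, and the CHECK clauses mirror the entry-point rules above:

```python
import sqlite3

# Database-level constraints are the last line of defense: even if bad
# data slips past the form, the schema refuses to store it.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE contacts (
        address1 TEXT NOT NULL,
        city     TEXT NOT NULL,
        state    TEXT NOT NULL CHECK (length(state) = 2),
        zip      TEXT NOT NULL CHECK (zip GLOB '[0-9][0-9][0-9][0-9][0-9]')
    )
""")

# A well-formed record is accepted...
conn.execute("INSERT INTO contacts VALUES "
             "('27 E Cota St STE 500', 'Santa Barbara', 'CA', '93101')")

# ...while the "Cali" / "93101abc" record from the bad examples is rejected.
try:
    conn.execute("INSERT INTO contacts VALUES "
                 "('27 E Cota St', 'SB', 'Cali', '93101abc')")
except sqlite3.IntegrityError as e:
    print("Rejected:", e)
```

Other database engines offer the same idea with richer syntax (domains, enumerated types, regular-expression checks), but the principle is identical.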

As with most things in life, an ounce of prevention is worth a pound of cure when it comes to collecting and maintaining accurate contact data. The techniques described above, in conjunction with Service Objects web services, will help provide you with the most genuine, accurate, and up-to-date data possible.

Who Is Service Objects?

Who is Service Objects? We are a data validation company that cleanses, appends and enhances our customers’ contact data. We are fanatical about data quality, customer service and our 99.999% uptime guarantee (that’s about five minutes of downtime a year, if you are doing the math). We are also committed to reducing wasted paper, time and resources through data quality excellence. We strive to provide the industry’s top contact validation tools to our customers.

What does Service Objects offer?

We offer 24 different services that our customers can use. These include APIs that validate email addresses, IP addresses, physical mailing addresses, phone numbers and BIN numbers. Our tools can also provide tax rates based on addresses, validate domestic and international lead information, append missing contact information, or even provide demographic data or location intelligence about your contacts.

You can embed these real-time APIs in your own contact data platforms, or even use our convenient batch services to clean and validate your databases with no programming required. Either way, we provide you with better data along with a host of associated information.

Why would I use Service Objects?

There are many reasons! One of the reasons that we are really passionate about is that validated, verified and corrected data helps eliminate waste. For example, take our flagship Address Validation 3 service: if addresses are run and verified through our CASS certified system, this will reduce the number of mailings that are undeliverable due to bad or incorrect addresses – along with all of the costs and human effort that go along with this.

Other key reasons revolve around ROI and profitability. We can validate the quality of your marketing leads, boost sales and target marketing better by offering a more complete view of your customers, reduce shipment errors and service failures, and help you avoid compliance penalties in a world that is increasingly regulating marketing contact activities. Our services can also fight fraud by helping identify high-risk or bad contact information.

How do I call a Service Objects web service?

Great question! Our services are generally very simple to call. For starters, you will need a trial key. Sign up for a free API trial key here! Once you have a trial key, you can test any of our web services. For example, if you have signed up for a trial key for our DOTS Address Validation 3 service, you can test the service by putting your key into the URL below:

https://trial.serviceobjects.com/av3/api.svc/GetBestMatches?BusinessName=SERVICE+OBJECTS&Address=27+E+Cota+St+STE+500&Address2=&City=Santa+Barbara&State=CA&PostalCode=93105&LicenseKey=YourKeyHere

This GET request will return an XML response; however, most of our services also offer JSON as an available output type. Requests can be made through POST or SOAP protocols as well.
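As a quick sketch, here is how you might construct that same GET request in Python using only the standard library. The parameter names come straight from the URL above; YourKeyHere must be replaced with a valid trial key before the request will succeed:

```python
from urllib.parse import urlencode

# Build the GetBestMatches request shown above. urlencode() handles the
# escaping (spaces become "+", matching the sample URL).
BASE = "https://trial.serviceobjects.com/av3/api.svc/GetBestMatches"
params = {
    "BusinessName": "SERVICE OBJECTS",
    "Address": "27 E Cota St STE 500",
    "Address2": "",
    "City": "Santa Barbara",
    "State": "CA",
    "PostalCode": "93105",
    "LicenseKey": "YourKeyHere",  # replace with your real trial key
}
url = f"{BASE}?{urlencode(params)}"
print(url)

# With a valid key, fetching the XML response takes one more step:
# from urllib.request import urlopen
# with urlopen(url) as resp:
#     print(resp.read().decode())
```

The same parameters work equally well with any HTTP client or with the POST and SOAP interfaces.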

How do I deal with the response?

This depends on your application. For example:

  • If you are using a Service Objects subscription to cleanse a database, then you would likely want to update old records with the standardized and validated data.
  • If you are looking to delete old bad data, then you may want to remove any record that receives an error from our service.
  • If you are looking to enhance data, then you will likely want to add notes that we provide in many of our services to your records.
  • If our services are being used in real time, then the results can be used to relay information back to the user inputting it.

Potential things to highlight or notify a user about include an address that is not deliverable, a phone number that is not valid, or an email address that is not deliverable. Each use case is unique!
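The branches above can be sketched as a simple dispatch function. Note that the keys used here (“error”, “standardized”, “notes”) are placeholders for this sketch, not the actual field names of any Service Objects response:

```python
from typing import Optional

def handle_result(record: dict, result: dict, mode: str) -> Optional[dict]:
    """Apply one validation result to a record, per use case."""
    if mode == "cleanse" and not result.get("error"):
        record.update(result["standardized"])  # overwrite with validated data
        return record
    if mode == "purge" and result.get("error"):
        return None                            # drop the bad record entirely
    if mode == "enhance":
        record["notes"] = result.get("notes", [])  # attach service notes
        return record
    return record  # real time: relay the result back to the user

result = {"error": False,
          "standardized": {"address1": "27 E Cota St STE 500"},
          "notes": ["Address validated"]}
print(handle_result({"address1": "27 east cota"}, result, "cleanse"))
```

In practice you would map your chosen service’s real response fields onto whichever of these behaviors fits your application.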

Which leads to one of the best features about working with Service Objects: us. We love to work with our customers to help them get the most out of the services and data that we provide. Contact us anytime to learn more about what we can do for you!

Customer Expectations are Getting…Younger

Being based in the college town of Santa Barbara, California, we notice something interesting: the students seem to get younger every year. Of course, it is actually our own ages that continue to change. But this illusion contains a valuable marketing lesson for all of us.

The Rise of the Generational Customer

According to the latest State of the Connected Customer survey from Salesforce.com, consumers really are getting younger, as markets shift over time from older customers such as Baby Boomers to Generations X/Y and the Millennials. In fact, this year marks the first time that adult consumers exist who have never lived in the 20th century.

This trend means a lot more than having customers who don’t remember the 9/11 attacks, or realize that Paul McCartney was in a band before Wings. Some of the key points from this survey include:

  • Millennials and Generation Z live in an omnichannel world, using an average of 11 digital channels versus nine for traditional/Baby Boomer customers.
  • Nearly twice as many Millennials prefer to use mobile channels versus traditional/Baby Boomer customers (61% versus 31%), with 90% of Millennials using this channel versus 72% of older customers.
  • Traditional and Baby Boomer customers use less technology than their younger counterparts, but they aren’t dead yet: over 70% of them use channels such as mobile, text/SMS, and online portals and knowledge bases. However, usage falls off sharply with age for newer channels such as social media and voice-activated personal assistants like Siri and Alexa.
  • Between 77% and 86% of survey respondents believe that technologies such as chatbots, voice-activated assistants, and the Internet of Things (IoT) will transform their expectations of companies. The most important ones? AI and cybersecurity, at 87% each.
  • Over two-thirds of all consumers surveyed (67%) prefer to purchase through digital channels.

Overall, one of the key takeaways from this survey was the growing importance of customer experience. Eighty percent of respondents stated that the experience provided by a company was every bit as important as its products and services. This in turn involves greater connectivity between companies and their customers, with 70% of customers noting that connected processes are very important to winning their business.

It All Comes Down to Data

What does this mean for the future of marketing? For one thing, it is clearly becoming more data-driven. While your oldest consumers still remember ordering from catalogs, your youngest ones expect to engage you on their tablets and smartphones, with little tolerance for error. This also means that both your marketing and your customer service are increasingly becoming electronic.

We welcome this trend at Service Objects: our company was originally founded in 2001 around reducing the waste stream from direct mail. But this trend also creates a mandate for us – and for you – to keep looking beyond simple contact data validation, into a world of data analyses that range from demographic screening to compliance with growing privacy laws. It is a major challenge, but also an opportunity for all of us – and frankly a big part of what keeps us young.