Posts Tagged ‘Business Solutions’

2018 Year End Message from Our CEO

A Year End Message From Our CEO

With 2018 coming to a close in just a few short weeks, we enter our busiest time of year validating millions of deliveries for our customers every day. Just like Santa’s elves, we’ve been preparing for the season all year long.

2018 has been a good year at Service Objects, with strong growth and record-setting validations for our customers. Our mission continues: helping businesses weed out fraud and operate more efficiently through data quality.

We made substantial improvements in many facets of our services. We achieved certification for GDPR Privacy Shield. We introduced a new version of Address Insight. We implemented ambitious security policies. Most importantly, we achieved a customer satisfaction score (NPS) of 66. Although “66” may sound low, it ranks us among the top 1% of all technology companies nationwide. Our NPS score proves we have a personal connection with our customers.

Of course, these wonderful results are only possible due to the work of each and every team member here, and the core values we share in our work together. This year we welcomed Naoko Rico, Richard Delsi, Samantha Haentjens, Joy Adelente, Cary Carlton, and Sophia Graniela to the company. We are SO happy to have them onboard. Our team, and the values we foster, underpins all the good we do. It’s through our collective efforts that we make the greatest difference.

I’m very excited about 2019. We have so many promising projects on deck, such as global geocoding, AI-based name validation, and much more.

The success we have had in 2018 is a credit to you, our customers. Thank you for allowing us to participate in your business. It is our privilege to be a part of your work.

From all of us at Service Objects, we wish you a very happy and healthy holiday and a bright and hopeful new year.

Happy holidays!

Geoffrey W. Grow
Founder and CEO

Delivering excellent and responsive customer service is a top priority for Service Objects. From assigning a dedicated Customer Success Specialist to proactively monitoring and analyzing every account, the Service Objects team ensures maximum uptime while also providing fast and responsive service.

Service Objects Proves Customer Service Reigns Supreme with NPS Score of 65

Hitting a new all-time high for customer service, Service Objects recently earned a Net Promoter Score (NPS) of 65, placing the company shoulder-to-shoulder with other tech industry leaders. With a score seven points higher than the industry average, this achievement marks the fifth consecutive year Service Objects has attained a “best in class” ranking.

“We are proud to hit high marks for the fifth consecutive year, but we won’t rest on our laurels,” said Service Objects Founder and CEO Geoff Grow. “Addressing our customers’ needs and wants will always be a top priority, and we look forward to helping our clients grow and thrive in all their future endeavors.”

Best in class for 5th Year

The Net Promoter Score measures customer feedback to create a numeric value representing brand loyalty. Difficult to achieve, a top NPS score means a company is held in high regard by clients and usually indicates a strong commitment to customer satisfaction and proactive, solution-focused service.

NPS measures customer satisfaction based on a 200-point scale ranging between -100 and +100, with scores over 50 considered “best in class.” Service Objects has maintained its outstanding rating for the past five years because of its continued commitment to 24/7/365 support, 99.999% server uptime, experience in validating over 3 billion contact records, and in-depth knowledge of the contact validation industry.
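For readers unfamiliar with the mechanics, the arithmetic behind the score is simple: NPS is the percentage of promoters (ratings of 9 or 10 on a 0-10 survey) minus the percentage of detractors (ratings of 0 through 6). The function and sample data below are our own illustration, not part of any Service Objects product:

```python
def nps(ratings):
    """Compute a Net Promoter Score from 0-10 survey ratings.

    Promoters rate 9-10, detractors 0-6; passives (7-8) count
    toward the total but neither add nor subtract.
    """
    if not ratings:
        raise ValueError("no ratings")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# 70% promoters, 25% passives, 5% detractors -> NPS of 65
sample = [9] * 70 + [8] * 25 + [3] * 5
print(nps(sample))  # 65
```

Because detractors subtract one-for-one, a score above 50 requires that promoters outnumber detractors by a wide margin, which is why such scores are rare.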

NPS and customer service: A powerful combination

In 2014, Service Objects launched its Customer Success Program with the goal of establishing a set of benchmarks and strategies designed to make sure clients get answers quickly and maximize their investment. Featuring intensive customer support, the Customer Success Program allows customers to become acquainted with its services and get up and running quickly.

From assigning a dedicated Customer Success Specialist to scheduling regular check-ins and initiating proactive monitoring and analysis of every account, Service Objects can ensure maximum uptime while also providing fast and responsive service. From day one, customers have access to ongoing customer support that includes quarterly check-ins, 24/7/365 emergency response, courtesy testing keys, and more.

Client satisfaction a top priority

Service Objects continually measures its NPS score to gain a better understanding of the current level of satisfaction amongst its client base. With a 2018 NPS score of 65, the company has achieved a score higher than the industry average of 58. This latest NPS ranking once again places Service Objects in the top tier of the technology industry, above other well-known companies including Intel (52), Cisco (38), and Spotify (24).

“Delivering excellent and responsive customer service is a top priority for Service Objects,” Grow said. “We instill this belief into every customer touch point. It is rewarding to see our hard work pay off with continued positive customer feedback and a five-year streak of NPS scores that place us amongst the best in the industry.”

Automated address validation comes with many benefits, but the variability of the data businesses receive requires a responsive, flexible, and customized verification solution.

Difficulties in the Trivial: Diving Deeper into the Intricacies of Address Validation

“Can’t you just add that feature today!?”

“Can you add a simple update for this!?”

“It’s obvious, it really shouldn’t take very long to implement!”

Software developers hear statements like these countless times over their careers. It raises my eyebrows every time I hear it, and typically gives me a little chuckle inside. In all honesty, it is no surprise that these kinds of inquiries come up, or that they come up as often as they do. After all, they come from people who do not code, so from their perspective the questions seem legitimate.

Errors, Intent and Responsive Address Validation

Fundamentally, these are legitimate questions. The misconceptions that trigger this type of confusion often stem from how well humans can quickly find patterns, errors and solutions. So rapidly, in fact, many problems appear to have “fast” or “obvious” solutions.

For example, to us, it may seem obvious that in our Service Objects address the typo “Cta Street” should be “Cota Street” or “Santa Barbra” should be “Santa Barbara.”  These errors are relatively easy to identify, fix, and classify, right? For humans, the answer is yes. For a computer, on the other hand, the answer is, well…maybe.

At its heart, the issue involves underlying questions about conceptual intent and programming capabilities. For example, when the name of a city is corrected, what does that mean for the data related to the corresponding physical address? Does St. Louis being changed to Saint Louis maintain the fidelity of the locational intent? What about changing Santa Barbara to nearby Goleta? Would these solutions fall within what would be expected behavior?

While both actions result in changes to a piece of information, each solution is based on a different procedure. The first follows a standardized correction of the name of a specific location, which is a type of change to the input. The second, on the other hand, signifies a change in the place itself.
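The first kind of change can be sketched as a simple lookup. The alias table and function below are purely illustrative, not an excerpt from any actual validation engine:

```python
# Illustrative alias table: each entry maps a variant spelling to
# the standardized name of the SAME place.
CITY_ALIASES = {
    "st. louis": "Saint Louis",
    "st louis": "Saint Louis",
    "santa barbra": "Santa Barbara",
}

def standardize_city(city):
    """Return the standardized spelling, or the input unchanged."""
    return CITY_ALIASES.get(city.strip().lower(), city)

print(standardize_city("St. Louis"))  # Saint Louis
print(standardize_city("Goleta"))     # Goleta (left alone)
```

Note what this deliberately does not do: map Santa Barbara to nearby Goleta. That would change the place itself rather than the spelling, and so falls outside a standardization step.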

Variations on a Standard: Address Line 2

When it comes to address validation, there are more than enough variations to examine in what constitutes a standard address format, and that’s just Address Line 1.  Though Address Line 2 is not a standard address field, we have grown accustomed to seeing it, in one of its many variations, on the forms we fill out.

Address Line 2 has often left users confused as to what information should be included in that field. While many argue Address Line 2 is designed for apartment numbers, suites, and similar secondary information, there is no consensus. In fact, the USPS does not recognize Address Line 2 as a standard address field.

Reactive Address Verification

In practice, the field is often used for apartment and suite numbers, but also for other details like ‘care of’ or to give additional information to the mail delivery person.  Almost anything can go into Address Line 2, and increasingly, people expect Address Line 2 data to be handled as part of the entire address validation protocol.

Ultimately, this field adds a significant layer of complexity to an address validation solution. Take, for instance, the scenario where someone enters “Apt 2 A C O Sally” on Address Line 2. Typically, someone would not enter the data that way, but you may be surprised how often something like this does come up. Visually, we can easily identify the intention of the data. An address validation process, on the other hand, will find identification and categorization of this information difficult. For example, what does “Apt 2 A C O Sally” signify? Is it Apt 2A, care of Sally? Or is it just Apt 2? Furthermore, once we identify the “Care Of” details, does that information need to be preserved in the output?
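To see why this is hard, consider what even a simple rule-based parser has to commit to. The sketch below is hypothetical and not Service Objects code; it picks one reading of the ambiguous input, which is exactly the judgment call described above:

```python
import re

def parse_line2(text):
    """Best-effort split of a free-form Address Line 2 into a unit
    part and an optional 'care of' recipient.

    Purely illustrative: it treats "C O", "C/O", or "care of" as a
    care-of marker and everything before it as the unit. For the
    ambiguous "Apt 2 A C O Sally" it commits to one reading.
    """
    parts = re.split(r"\bc\s*/?\s*o\b|\bcare\s+of\b",
                     text, maxsplit=1, flags=re.IGNORECASE)
    unit = parts[0].strip()
    care_of = parts[1].strip() if len(parts) > 1 else None
    return unit, care_of

print(parse_line2("Apt 2 A C O Sally"))  # ('Apt 2 A', 'Sally')
```

Even this toy rule must decide whether the stray “A” belongs to the unit number, and whether the care-of recipient survives to the output; multiply that by every variation customers type and the complexity becomes clear.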

Standardized Solutions and a Custom Fix

In some situations, Service Objects can create a solution capable of handling the complexities of multiple address data inputs beyond the standard format. In other instances, the best result is a happy medium: a solution tailored to a particular client based on their individual needs. In every case, our updates fine-tune operations and benefit all of our clients.

We use our extensive validation knowledge, along with industry standards and expectations, to govern which approach we take. Situations like these are what our teams examine all the time, and we are always finding ways to make improvements. Our services are built on almost 20 years of experience.

When we update logic in the system, we often walk a fine line between what the code should do and what it already does.  We always want to improve our processes without negatively impacting clients accustomed to the current behavior. Improvements capable of changing the expected results for current users will often be bundled into a new version, but usually we can find ways to update current versions, so clients always have the latest and greatest option without needing to make any changes.

As we know, the devil is in the details. By drilling down to the smallest and most incremental element of an address input, Service Objects provides the highest level of data verification and address validation services. In fact, our focus on details allows us to be experts in the validation services field, so your organization doesn’t have to be.


Recognizing the vital role contact data quality plays in GDPR compliance, Service Objects is offering affected businesses a free data quality assessment.

Free Data Quality Assessment Helps Businesses Gauge GDPR Compliance Ahead of May Deadline

As the May 25, 2018, deadline looms, Service Objects, the leading provider of real-time global contact validation solutions, is offering a GDPR Data Quality Assessment to help companies evaluate whether they are prepared for the new set of privacy rules and regulations.

“Our goal is to help you get a better understanding of the role your client data plays in GDPR compliance,” says Geoff Grow, CEO and Founder, Service Objects. “With our free GDPR Data Quality Assessment, companies will receive an honest, third-party analysis of the accuracy of their contact records and customer database.”

Under the GDPR, personal data includes any information related to a natural person or ‘Data Subject’ that can be used to identify the person directly or indirectly. It can be a name, a photo, an email address, bank details, posts on social networking websites, medical information, or a computer IP address.

Even if an organization is not based in the EU, it may still need to observe the rules and regulations of GDPR. That’s because the GDPR not only applies to businesses located in the EU but to any companies offering goods or services within the European Union. In addition, if a business monitors the behavior of any EU data subjects, including the processing and holding of personal data, the GDPR applies.

Recognizing the vital role contact data quality plays in GDPR compliance, Service Objects decided to offer a free data quality assessment to help those industries affected by the regulation measure the accuracy of their contact records and prepare for the May 2018 deadline.

The evaluation will include an analysis of up to 500 records, testing for accuracy across a set of inputs including name, phone, address, email, IP, and country. After the assessment is complete, a composite score will be provided, giving businesses an understanding of how close they are to being compliant with GDPR’s Article 5.
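To illustrate what a composite score across those fields might look like, here is a hypothetical sketch; the formula and the toy validity checks are our own invention, and the actual assessment methodology is not described in this article:

```python
FIELDS = ["name", "phone", "address", "email", "ip", "country"]

def composite_score(records, validators):
    """Average per-field validity across a sample of records,
    scaled to 0-100. `validators` maps each field name to a
    pass/fail check applied to that field's value.
    """
    checks = [
        bool(validators[f](r.get(f, "")))
        for r in records
        for f in FIELDS
    ]
    return 100 * sum(checks) / len(checks)

# Toy check: a field passes if it is non-empty.
validators = {f: bool for f in FIELDS}
recs = [{"name": "Ana", "phone": "", "address": "1 Main St",
         "email": "ana@x.com", "ip": "1.2.3.4", "country": "US"}]
print(composite_score(recs, validators))  # ~83.3 (5 of 6 fields pass)
```

A real assessment would of course use far stronger checks per field (deliverability of an address, validity of a phone number, and so on) rather than a simple non-empty test.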

Article 5 of the GDPR requires organizations collecting and processing personal information of individuals within the European Union (EU) to ensure all current and future customer information is accurate and up-to-date. Not adhering to the rules and regulations of the GDPR can result in a fine of up to 4% of annual global turnover or €20 Million (whichever is greater).

“To avoid the significant fines and penalties associated with the GDPR, businesses are required to make every effort to keep their contact data accurate and up-to-date,” Grow added. “Service Objects’ data quality solutions enable global businesses to fulfill the regulatory requirements of Article 5 and establish a basis for data quality best practices as part of a broader operational strategy.”


For more information on how to get started with your free GDPR Data Quality Assessment, please visit our website today.

When that data is incomplete, poorly defined, or wrong, there are immediate consequences: angry customers, wasted time, and difficult execution of strategy. Employing data quality best practices presents a terrific opportunity to improve business performance.

The Unmeasured Costs of Bad Customer and Prospect Data

Perhaps Thomas Redman’s most important recent article is “Seizing Opportunity in Data Quality.”  Sloan Management Review published it in November 2017, and it appears below.  Here he expands on the “unmeasured” and “unmeasurable” costs of bad data, particularly in the context of customer data, and why companies need to initiate data quality strategies.

Here is the article, reprinted in its entirety with permission from Sloan Management Review.

The cost of bad data is an astonishing 15% to 25% of revenue for most companies.

Getting in front on data quality presents a terrific opportunity to improve business performance. Better data means fewer mistakes, lower costs, better decisions, and better products. Further, I predict that many companies that don’t give data quality its due will struggle to survive in the business environment of the future.

Bad data is the norm. Every day, businesses send packages to customers, managers decide which candidate to hire, and executives make long-term plans based on data provided by others. When that data is incomplete, poorly defined, or wrong, there are immediate consequences: angry customers, wasted time, and added difficulties in the execution of strategy. You know the sound bites — “decisions are no better than the data on which they’re based” and “garbage in, garbage out.” But do you know the price tag to your organization?

Based on recent research by Experian plc, as well as by consultants James Price of Experience Matters and Martin Spratt of Clear Strategic IT Partners Pty. Ltd., we estimate the cost of bad data to be 15% to 25% of revenue for most companies (more on this research later). These costs come as people accommodate bad data by correcting errors, seeking confirmation from other sources, and dealing with the inevitable mistakes that follow.

Fewer errors mean lower costs, and the key to fewer errors lies in finding and eliminating their root causes. Fortunately, this is not too difficult in most cases. All told, we estimate that two-thirds of these costs can be identified and eliminated — permanently.

In the past, I could understand a company’s lack of attention to data quality because the business case seemed complex, disjointed, and incomplete. But recent work fills important gaps.

The case builds on four interrelated components: the current state of data quality, the immediate consequences of bad data, the associated costs, and the benefits of getting in front on data quality. Let’s consider each in turn.

Four Reasons to Pay Attention to Data Quality Now

The Current Level of Data Quality Is Extremely Low

A new study that I recently completed with Tadhg Nagle and Dave Sammon (both of Cork University Business School) looked at data quality levels in actual practice and shows just how terrible the situation is.

We had 75 executives identify the last 100 units of work their departments had done — essentially 100 data records — and then review that work’s quality. Only 3% of the collections fell within the “acceptable” range of error. Nearly 50% of newly created data records had critical errors.

Said differently, the vast majority of data is simply unacceptable, and much of it is atrocious. Unless you have hard evidence to the contrary, you must assume that your data is in similar shape.

Bad Data Has Immediate Consequences

Virtually everyone, at every level, agrees that high-quality data is critical to their work. Many people go to great lengths to check data, seeking confirmation from secondary sources and making corrections. These efforts constitute what I call “hidden data factories” and reflect a reactive approach to data quality. Accommodating bad data this way wastes time, is expensive, and doesn’t work well. Even worse, the underlying problems that created the bad data never go away.

One consequence is that knowledge workers waste up to 50% of their time dealing with mundane data quality issues. For data scientists, this number may go as high as 80%.

A second consequence is mistakes, errors in operations, bad decisions, bad analytics, and bad algorithms. Indeed, “big garbage in, big garbage out” is the new “garbage in, garbage out.”

Finally, bad data erodes trust. In fact, only 16% of managers fully trust the data they use to make important decisions.

Frankly, given the quality levels noted above, it is a wonder that anyone trusts any data.

When Totaled, the Business Costs Are Enormous

Obviously, the errors, wasted time, and lack of trust that are bred by bad data come at high costs.

Companies throw away 20% of their revenue dealing with data quality issues. This figure synthesizes estimates provided by Experian (worldwide, bad data cost companies 23% of revenue), Price of Experience Matters ($20,000/employee cost to bad data), and Spratt of Clear Strategic IT Partners (16% to 32% wasted effort dealing with data). The total cost to the U.S. economy: an estimated $3.1 trillion per year, according to IBM.

The costs to businesses of angry customers and bad decisions resulting from bad data are immeasurable — but enormous.

Finally, it is much more difficult to become data-driven when a company can’t depend on its data. In the data space, everything begins and ends with quality. You can’t expect to make much of a business selling or licensing bad data. You should not trust analytics if you don’t trust the data. And you can’t expect people to use data they don’t trust when making decisions.

Two-Thirds of These Costs Can Be Eliminated by Getting in Front on Data Quality

“Getting in front on data quality” stands in contrast to the reactive approach most companies take today. It involves attacking data quality proactively by searching out and eliminating the root causes of errors. To be clear, this is about management, not technology — data quality is a business problem, not an IT problem.

Companies that have invested in fixing the sources of poor data — including AT&T, Royal Dutch Shell, Chevron, and Morningstar — have found great success. They lead us to conclude that the root causes of 80% or more of errors can be eliminated; that up to two-thirds of the measurable costs can be permanently eliminated; and that trust improves as the data does.

Which Companies Should Be Addressing Data Quality?

While attacking data quality is important for all, it carries a special urgency for four kinds of companies and government agencies:

Those that must keep an eye on costs. Examples include retailers, especially those competing with, Inc.; oil and gas companies, which have seen prices cut in half in the past four years; government agencies, tasked with doing more with less; and companies in health care, which simply must do a better job containing costs. Paring costs by purging the waste and hidden data factories created by bad data makes far more sense than indiscriminate layoffs — and strengthens a company in the process.

Those seeking to put their data to work. Companies include those that sell or license data, those seeking to monetize data, those deploying analytics more broadly, those experimenting with artificial intelligence, and those that want to digitize operations. Organizations can, of course, pursue such objectives using data loaded with errors, and many companies do. But the chances of success increase as the data improves.

Those unsure where primary responsibility for data should reside. Most businesspeople readily admit that data quality is a problem, but claim it is the province of IT. IT people also readily admit that data quality is an issue, but they claim it is the province of the business — and a sort of uneasy stasis results. It is time to put an end to this folly. Senior management must assign primary responsibility for data to the business.

Those who are simply sick and tired of making decisions using data they don’t trust. Better data means better decisions with less stress. Better data also frees up time to focus on the really important and complex decisions.

Next Steps for Senior Executives

In my experience, many executives find reasons to discount or even dismiss the bad news about bad data. Common refrains include, “The numbers seem too big, they can’t be right,” and “I’ve been in this business 20 years, and trust me, our data is as good as it can be,” and “It’s my job to make the best possible call even in the face of bad data.”

But I encourage each executive to think deeply about the implications of these statistics for his or her own company, department, or agency, and then develop a business case for tackling the problem. Senior executives must explore the implications of data quality given their own unique markets, capabilities, and challenges.

The first step is to connect the organization or department’s most important business objectives to data. Which decisions and activities and goals depend on what kinds of data?

The second step is to establish a data quality baseline. I find that many executives make this step overly complex. A simple process is to select one of the activities identified in the first step — such as setting up a customer account or delivering a product — and then do a quick quality review of the last 100 times the organization did that activity. I call this the Friday Afternoon Measurement because it can be done with a small team in an hour or two.
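The measurement itself can be sketched as a simple tally. The function and toy data below are illustrative only; in practice the "error check" may be nothing more than a person eyeballing each record:

```python
def friday_afternoon_measurement(records, has_error):
    """Tally what fraction of the last N work records contain an
    error. `records` is the last ~100 units of work; `has_error`
    is any check you can apply by eye or by rule.
    """
    flawed = sum(1 for r in records if has_error(r))
    return flawed / len(records)

# Toy example: flag customer records missing an email address.
sample = [{"email": "a@x.com"}, {"email": ""}, {"email": "b@y.com"}]
rate = friday_afternoon_measurement(sample, lambda r: not r["email"])
print(f"{rate:.0%} of records flawed")  # 33% of records flawed
```

The point of the exercise is speed, not precision: a rough error rate over the last 100 units of work is enough to establish the baseline.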

The third step is to estimate the consequences and their costs for bad data. Again, keep the focus narrow — managers who need to keep an eye on costs should concentrate on hidden data factories; those focusing on AI can concentrate on wasted time and the increased risk of failure; and so forth.

Finally, for the fourth step, estimate the benefits — cost savings, lower risk, better decisions — that your organization will reap if you can eliminate 80% of the most common errors. These form your targets going forward.

Chances are that after your organization sees the improvements generated by only the first few projects, it will find far more opportunity in data quality than it had thought possible. And if you move quickly, while bad data is still the norm, you may also find an unexpected opportunity to put some distance between yourself and your competitors.


Service Objects spoke with the author, Tom Redman, and he gave us an update on the Sloan Management article reprinted above, particularly as it relates to the subject of the costs associated with bad customer data.

Please focus first on the measurable costs of bad customer data.  Included are items such as the cost of the work Sales does to fix up bad prospect data it receives from Marketing, the costs of making good for a customer when Operations sends him or her the wrong stuff, and the cost of work needed to get the various systems which house customer data to “talk.”  These costs are enormous.  For all data, it amounts to roughly twenty percent of revenue.

But how about these costs:

  • The revenue lost when a prospect doesn’t get your flyer because you mailed it to the wrong address.
  • The revenue lost when a customer quits buying from you because fixing a billing problem was such a chore.
  • The additional revenue lost when he or she tells a friend about the experience.

This list could go on and on.

Most items involve lost revenue and, unfortunately, we don’t know how to estimate “sales you would have made.”  But they do call to mind similar unmeasurable costs associated with poor manufacturing in the 1970s and 80s.  While expert opinion varied, a good first estimate was that the unmeasured costs roughly equaled the measured costs.

If the added costs in the Seizing Opportunity article above don’t scare you into action, add in a similar estimate for lost revenue.

The only recourse is to professionally manage the quality of prospect and customer data.  It is not hyperbole to note that such data are among a company’s most important assets and demand no less.

©2018, Data Quality Solutions


Study after study has shown that investing in employee experience impacts the customer experience and can generate a high ROI for the company.

The Un-Ignorable Link Between Employee Experience And Customer Experience

Engaged employees lead to happy customers.

There is an undeniable link between employee experience and customer experience. Companies that lead in customer experience have 60% more engaged employees, and study after study has shown that investing in employee experience impacts the customer experience and can generate a high ROI for the company. Here are 10 companies that have seen the benefit of engaging their employees to build customer experience.

“Take care of associates and they’ll take care of your customers.” – J.W. Marriott

Marriott International founder J.W. Marriott said, “Take care of associates and they’ll take care of your customers.” It still holds true at the company—employees are valued, which makes them want to share that experience with guests. Marriott publicly rewards employees for a job well done, celebrates diversity and inclusion, values loyalty, and offers a wide variety of training programs. It has been regularly rated a top place to work and a top company for customer experience.

Chick-fil-A encourages employees to build relationships with customers

With its chicken and waffle fries, Chick-fil-A generates more revenue per restaurant than any other chain in the country. But it’s not just the food that sets the restaurant apart—it’s the employees. Franchise owners are given thorough training but also have bandwidth to explore creative ideas. Employees are encouraged to build relationships with customers because they have strong relationships with each other and with the company.

The Zappos contact center calls its team customer loyalty team members

E-commerce site Zappos is known for connecting with its customers and for responding to issues quickly. That’s likely because the company also has a great reputation for connecting with its employees. Every employee plays a role in the company’s customer-first culture—even call center employees are referred to as customer loyalty team members. When employees feel connected to and valued by the brand, they want to bring customers into the circle.

Nordstrom only asks employees to use their best judgment

Employees at Nordstrom are given just one rule in their employee handbook: “Use best judgment in all situations. There will be no additional rules.” Instead of being bogged down with corporate guidance, empowered employees know they are trusted and valued. That translates to their interactions with customers and is a large reason why the “Nordstrom Way” of doing customer service is well respected.

Taco Bell provides an easy way for employees to ask for help

Fast food giant Taco Bell puts employees first by always providing them a way to contact management. The company has a network of 1-800 numbers to field complaints, answer questions, and alert management of potential red flags for its 175,000-plus employees. It also holds regular employee roundtable meetings and company-wide surveys to gauge employee satisfaction. With their needs met and questions answered, employees can focus on helping customers.

JetBlue employees are allowed to go the extra mile for customers

JetBlue is consistently rated one of the best airlines, and a large part of that is the great customer experience. JetBlue’s employees are given the freedom to go the extra mile to help customers. Instead of being constrained by red tape and bureaucracy, employees have the power to solve problems themselves, which means they often consider customers’ problems to be their own. JetBlue also fosters a spirit of collaboration and teamwork with employees that extends to customers.

Starbucks provides extensive training on how to interact with customers

Starbucks knows that happy employees lead to happy customers. The company is consistently at the top of every customer experience “best” list, and this recognition comes from taking care of its employees. Starbucks provides employees competitive wages, health benefits, and stock options. Each employee is trained not only on how to make the drinks but also how to interact with customers. The welcoming atmosphere of a Starbucks coffee shop is echoed in the company, where every employee knows they are welcomed and included.

Airbnb helps employees focus on personal growth

Airbnb’s mission statement of “Belong Anywhere” extends beyond customers to also include employees. Airbnb is invested in every aspect of its employees’ lives, not just what they do at the office. The company works to create a culture that sets employees up for success in their personal and professional lives, from having a flexible, open office space to being transparent with the goals of the company. Employees can focus on their personal growth and the mission of the company, which allows them to create better customer experiences.

Adobe ties employee compensation to customer experience

Instead of viewing customers and employees as separate entities, Adobe brings them together to drive positive, connected experiences. Employees are trained on customer experience metrics and how each person’s role impacts the overall customer experience. It also encourages employees to be advocates for customers’ needs and jump in when they see a problem instead of waiting for something to run its course. At Adobe, employee compensation is tied to customer experience. When employees are connected with customers and see the role they can each play individually, they want to create a better experience (disclosure: Adobe is a client).

GE uses root cause analysis to improve customer satisfaction

It takes an innovative HR department to drive employee experience at General Electric. Employees are involved in the process to make sure they have the physical space and technological tools to do their best work and that training programs keep employees moving forward. When a division of GE saw it had low customer satisfaction scores, it worked to find the root cause and streamline internal processes. Cutting red tape keeps employees happier and allows them to be more productive, which helped the customer satisfaction score jump more than 40% in two years.

Your employees are often your most untapped resource when it comes to building powerful customer experiences. I hope you are just as inspired by the companies highlighted here as I was.

The article above is reprinted with permission. View original post here.

About the author: Blake Morgan is a customer experience futurist, author of More Is More, and keynote speaker. You can read more of Blake’s articles by visiting her website.

A Daisy Chain of Hidden Customer Data Factories

I published the provocatively titled article, Bad Data Costs the United States $3 Trillion per Year, at Harvard Business Review in September 2016. It is of special importance to those who need prospect/customer/contact data in the course of their work.

First read the article.

Consider this figure: $136 billion per year. That’s the research firm IDC’s estimate of the size of the big data market, worldwide, in 2016. This figure should surprise no one with an interest in big data.

But here’s another number: $3.1 trillion, IBM’s estimate of the yearly cost of poor quality data, in the US alone, in 2016. While most people who deal in data every day know that bad data is costly, this figure stuns.

While the numbers are not directly comparable, and there is considerable variation around each, one can only conclude that right now, improving data quality represents the far larger data opportunity. Leaders are well-advised to develop a deeper appreciation for the opportunities that improving data quality presents and to take fuller advantage of them than they do today.

The reason bad data costs so much is that decision makers, managers, knowledge workers, data scientists, and others must accommodate it in their everyday work. And doing so is both time-consuming and expensive. The data they need has plenty of errors, and in the face of a critical deadline, many individuals simply make corrections themselves to complete the task at hand. They don’t think to reach out to the data creator, explain their requirements, and help eliminate root causes.

Quite quickly, this business of checking the data and making corrections becomes just another fact of work life.  Take a look at the figure below. Department B, in addition to doing its own work, must add steps to accommodate errors created by Department A. It corrects most errors, though some leak through to customers. Thus Department B must also deal with the consequences of those errors that leak through, which may include such issues as angry customers (and bosses!), packages sent to the wrong address, and requests for lower invoices.

The hidden data factory

Visualizing the extra steps required to correct the costly and time consuming data errors.

I call the added steps the "hidden data factory." Companies, government agencies, and other organizations are rife with hidden data factories. Salespeople waste time dealing with error-ridden prospect data; service delivery people waste time correcting flawed customer orders received from sales. Data scientists spend an inordinate amount of time cleaning data; IT expends enormous effort lining up systems that "don't talk." Senior executives hedge their plans because they don't trust the numbers from finance.

Such hidden data factories are expensive. They form the basis for IBM's $3.1 trillion per year figure. But quite naturally, managers should be more interested in the costs to their own organizations than to the economy as a whole.

There is no mystery in reducing the costs of bad data — you have to shine a harsh light on those hidden data factories and reduce them as much as possible. The aforementioned Friday Afternoon Measurement and the rule of ten help shine that harsh light. So too does the realization that hidden data factories represent non-value-added work.

To see this, look once more at the process above. If Department A does its work well, then Department B would not need the added steps of finding, correcting, and dealing with the consequences of errors, obviating the need for the hidden factory. No reasonably well-informed external customer would pay more for these steps. Thus, the hidden data factory creates no value. By taking steps to remove these inefficiencies, you can spend more time on the more valuable work customers will pay for.

Note that in the very near term, you probably have to continue to do this work. It is simply irresponsible to use bad data or pass it on to a customer. At the same time, all good managers know that they must minimize such work.

It is clear enough that the way to reduce the size of the hidden data factories is to quit making so many errors. In the two-step process above, this means that Department B must reach out to Department A, explain its requirements, cite some example errors, and share measurements. Department A, for its part, must acknowledge that it is the source of added cost to Department B and work diligently to find and eliminate the root causes of error. Those that follow this regimen almost always reduce the costs associated with hidden data factories by two thirds and often by 90% or more.

I don’t want to make this sound simpler than it really is. It requires a new way of thinking. Sorting out your requirements as a customer can take some effort, it is not always clear where the data originate, and there is the occasional root cause that is tough to resolve. Still, the vast majority of data quality issues yield.

Importantly, the benefits of improving data quality go far beyond reduced costs. It is hard to imagine any sort of future in data when so much is so bad. Thus, improving data quality is a gift that keeps giving — it enables you to take out costs permanently and to more easily pursue other data strategies. For all but a few, there is no better opportunity in data.

The article above was originally written for Harvard Business Review and is reprinted with permission.

In January 2018, Service Objects spoke with the author, Tom Redman, and he gave us an update on the article above, particularly as it relates to the subject of data quality.

According to Tom, the original article anticipated people asking, “What’s going on?  Don’t people care about data quality?”

The answer is, “Of course they care.  A lot.  So much that they implement ‘hidden data factories’ to accommodate bad data so they can do their work.”  And the article explored such factories in a generic “two-department” scenario.

Of course, hidden data factories take a lot of time and cost a lot of money, both contributing to the $3T/year figure.  They also don’t work very well, allowing lots of errors to creep through, leading to another hidden data factory.  And another and another, forming a sort of “daisy chain” of hidden data factories.  Thus, when one extends the figure above and narrows the focus to customer data, one gets something like this:

I hope readers see the essential truth this picture conveys and are appalled.  Companies must get in front on data quality and make these hidden data factories go away!

©2018, Data Quality Solutions

Is Your Data Quality Strategy Gold Medal Worthy?

A lot of you – like many of us here at Service Objects – are enjoying watching the 2018 Winter Olympics in Pyeongchang, Korea this month. Every Olympics is a spectacle where people perform incredible feats of athleticism on the world stage.

Watching these athletes reminds us of how much hard work, preparation, and teamwork go into their success. Most of these athletes spend years behind the scenes perfecting their craft, with the aid of elite coaches, equipment, and sponsors. And the seemingly effortless performances you see are increasingly becoming data-driven as well.

Don’t worry, we aren’t going to put ourselves on the same pedestal as Olympic medalists. But many of the same traits behind successful athletes do also drive reliable real-time API providers for your business. Here are just a few of the qualities you should look for:

The right partners. You probably don’t have access to up-to-the-minute address and contact databases from sources around the world. Or a database of over 400 million phone numbers that is constantly kept current. We do have all of this, and much more – so you can leverage our infrastructure to assure your contact data quality.

The right experience. The average Olympic skater has invested at least three hours a day in training for over a decade by the time you see them twirling triple axels on TV, according to Forbes. Likewise, Service Objects has validated nearly three billion transactions since we were founded in 2001, with a server uptime reliability of 99.999 percent.

The right strategy. In sports where success is often measured in fractions of a second, gold medals are never earned by accident: athletes always work against strategic objectives. We follow a strategy as well. Our tools are purpose-built for the needs of over 2500 customers, ranging from marketing to customer service, with capabilities such as precise geolocation of tax data, composite lead quality scores based on over 130 criteria, or fraud detection based on IP address matching. And we never stop learning and growing.

The right tools. Olympic athletes need the very best equipment to be competitive, from ski boots to bobsleds. In much the same way our customers’ success is based around providing the best infrastructure, including enterprise-grade API interfaces, cloud connectors and web hooks for popular CRM, eCommerce and marketing automation platforms, and convenient batch list processing.

The right support. No one reaches Olympic success by themselves – every athlete is backed by a team of coaches, trainers, sponsors and many others. We back our customers with an industry-leading support team as well, including a 24×7 Quick Response Team for urgent mission-critical issues.

The common denominator between elite athletes and industry-leading data providers is that both work hard to be the best at what they do and aren’t afraid to make big investments to get there. And while we can’t offer you a gold, silver, or bronze medal, we can give you a free white paper on how to make your data quality hit the perfect trifecta of being genuine, accurate and up-to-date. Meanwhile, enjoy the Olympics!

How to Convince Your Boss Your Business Needs a Data Quality Solution

Many developers come to Service Objects because they recognize their company has a need for a data quality solution. Maybe you were tasked by a manager to help find a solution, perhaps someone in Marketing mentioned one of their pain points, or maybe you just naturally saw opportunities for improvement. Whatever the reason was that got you looking for a data quality solution, you most likely need to get someone in management to sign off on your solution. Often, especially in larger organizations, this can be a challenge. So how are others accomplishing this?

Service Objects has been in business for over 16 years and in that time frame, we have helped potential customers just like you get sign off for our services. In addition to being armed with information from your Account Executive on how you can achieve ROI for the service you have chosen, the following recommendations are the ones we have found to be the most useful.

Test out our services

If you haven’t already, get a free trial key for the service you are interested in.  We allow you to try out as many of our 23 data quality solutions as you would like.  My blog, Taking Service Objects for a Test Drive, can help you figure out which method of testing our services is best for you.

One of the values in testing our services is that it reinforces your own understanding of our products. Testing allows you to quickly visualize and show others in your organization the benefits of utilizing a data quality solution. Showing your boss an actual test integration you did with our API is more powerful than just explaining how it works. Or, if you decide to run a test batch with us, you will see firsthand how we can improve your contact data.

You can also leverage our quick lookup page: set up a screen share with a larger team and run live examples for them. Let your team choose some data to run live and see the results firsthand.

Data summaries

Data summaries are a great visual aid for seeing how our services help. Take a subset of any of your data sets and send it over for a batch test. Each batch test contains a comprehensive summary, showing how we validated your data and what the results mean. Your Account Executive can review this and guide you through the results. Having a detailed report that clearly shows the current state of your data is a great tool to have when you are ready to go forward with any type of presentation to your team. A recent blog shows what one of our reports includes. All data has a story, and with our batch summaries we try to tell that story.

Integration plan

Having an integration plan is an asset when getting sign off for our services.  You are bound to get some questions after showing your plan, such as “who?” “how?” and “how long?”.  It is a good idea to be prepared to answer these questions.

Integrating with us is not a complicated task, and we have several resources to guide you. Our sample code can quickly get you up and running, and we can even customize it for your specific use case. We also have best practices available to help you with your integration. For instance, you may need to check whether there are any network or firewall issues your IT team needs to address. If you are switching from another solution or vendor, you may need to sync your integration timeline with turning off the old service and turning ours on. And never forget about testing, which is one of the most important parts of any integration. Finally, you may need to account for any training you want to provide to your team on how to work with our data or your new system. Having answers to these questions can arm you with everything you need to keep the ball rolling.

Customer service and support

Support. It is just one word, but it is key. Sometimes it is even the most important word. You can purchase a top-notch product or service, but if you can't get adequate customer support when you need it, your purchase loses most of its value to you. With Service Objects, this is never the case. In fact, customer service is one of our core values. We care about your success and truly want you to succeed. If you want to get buy-in from your team, it is very important to discuss our customer service expertise. Having a dedicated customer support resource, staffed by engineers, will ease your transition to our offerings and, in the end, save your organization significant time and money.

Value proposition

At Service Objects, our value proposition is very clear. Our data quality solutions ensure your data is as genuine, accurate and up-to-date as it can possibly be. A large percentage of our customers have been with us for many years, and this is due to the effort we put into the quality, accuracy, speed, reliability and expertise we deliver.

Getting buy-in from colleagues

It is always helpful to get people on your side when you want to present a new service to your company. Talking it over with colleagues and managers is a great start, and many of the points in this blog can help win their support for your idea. It also goes a long way in educating your team on the data quality issues you are facing and how a Service Objects solution can alleviate many of those problems.

I hope some of these ideas or tips can help you along the way in presenting our services to your team or manager.  The last thing you want to do is have a conversation with your boss without adequate preparation.  Your ideas are good, so take your time and plan the right action in getting them implemented.  In the end, the results our services will provide for your company will be impactful and worth your time.

Visit our product page to get started testing any of our services.

To Be Customer-Centric, You Have To Be Data-Centric

In today’s fast-paced world, customers have become more demanding than ever before. Customer-centric organizations need to build their models after critically analyzing their customers, and this requires them to be data-centric.

Today, customers expect companies to be available 24/7 to answer their queries. They expect companies to provide seamless information about their products and services. Failing to provide this can seriously impact their buying decisions. Customer-centric organizations need to adopt a data-centric approach not just for targeted customer interactions, but also to survive competition from peers who are already data-centric.

Customer-centric organizations need to go data-centric


Today, customers enquire a lot before making a decision, and social media enquiries are the most widely used. A satisfactory experience is not limited to prompt answers to customer queries alone. Customers need a good experience before making the purchase, during the installation of a product or deployment of a service, and even after the purchase. Thus, to retain customer loyalty and drive regular profits, companies need to be customer-centric. And that can only happen if companies adopt a data-centric approach.

Big data comes in handy in developing a model that delivers an optimum customer experience. It helps build a profound understanding of the customer, such as what they like and value most and their lifetime value to the company. Besides, every department in a data-centric organization can have the same view of its customers, which ensures efficient customer interactions. Companies like Amazon and Zappos are the best examples of customer-centric organizations that depend heavily on data to provide a unique experience to their customers. This has clearly helped them come a long way.


Companies can collect a lot of information that can help them become customer-centric. Here are some ways in which they can do so:

  • Keep a close eye on any kind of new data that could help them stay competitive, such as their product prices and the money they invest in logistics and in-product promotion. They need to constantly monitor the data that tells them about the drivers of these income sources.
  • Reach out to customers from different fields with varying skill sets to derive as many insights as possible so as to help make a better product.
  • Develop a full-scale data project that will help them collect data and apply it to make a good data strategy and develop a successful business case.

Today, there is no escaping customer expectations. Companies need to satisfy customers for repeat business. Customer satisfaction is the backbone of selling products and services, maintaining a steady flow of revenue, and ensuring business continuity. And for all of that to happen, companies need to gather and democratize as much information about their customers as possible.

Reprinted with permission from the author. View original post here.

Author: Naveen Joshi, Founder and CEO of Allerin

The Difference Between Customer Experience And User Experience

There are a lot of buzzwords thrown around in the customer sphere, but two of the big ones relate to experiences—customer and user. Although CX and UX are different and unique, they must work together for a company to have success.

User experience deals with customers’ interaction with a product, website, or app. It is measured in things like abandonment rate, error rate, and clicks to completion. Essentially, if a product or technology is difficult to use or navigate, it has a poor user experience.

Customer experience on the other hand focuses on the general experience a customer has with a company. It tends to exist higher in the clouds and can involve a number of interactions. It is measured by net promoter score, customer loyalty, and customer satisfaction.

Both customer experience and user experience are incredibly important and can’t truly exist and thrive without each other. If a website or mobile app has a bad layout and is cumbersome to navigate, it will be difficult for customers to find what they need and can lead to frustration. If customers can’t easily open the mobile app from an email on their phone, they likely won’t purchase your product. Likewise, if the product layout is clunky, customers likely won’t recommend it to a friend no matter how innovative it is. User experience is a huge part of customer experience and needs to play a major role when thinking like a customer.

Although UX and CX are different, they need to work closely together to truly be successful. Customer experience representatives should be working alongside product engineers to make sure everything works together. By taking themselves through the entire customer journey, they can see how each role plays into a customer’s overall satisfaction with the product and the company. The ultimate goal is a website or product that beautifully meshes the required elements of navigation and ease with the extra features that will help the brand stand out with customers.

When thinking about customer experience, user experience definitely shouldn’t be left behind. Make both unique features an essential part of your customer plan to build a brand that customers love all around.

Reprinted with permission from the author. View original post here.

Author Bio: Blake Morgan is a customer experience futurist, author of More Is More, and keynote speaker.

Go farther and create knock-your-socks-off customer experiences in your organization by enrolling in her new Customer Experience School.

Leverage Service Objects’ Industry Expertise to Reach Your Data Quality Goals

At Service Objects, we are fully committed to our customers’ success, which is a main factor in why over 90% of our business is from repeat customers. And with over 16 years of experience in contact validation, we have accumulated a broad base of industry expertise, created numerous best practices and are considered thought leaders in global data validation.

It is because of this knowledge that some of our customers turn to us when they lack the internal resources to carry out their data quality projects. Whether it is assistance implementing a data quality initiative, customization of our products to meet specific business needs, or help integrating our solutions into marketing or sales automation platforms, Service Objects' Professional Services can help your business achieve optimal results quickly and efficiently.

Here are just three of the ways we can help:

Integration programming and code support

If your team is overwhelmed or lacks the technical resources to integrate data quality solutions into your existing systems, Service Objects can step in and quickly get your project moving. We provide your team with the technical knowledge, support, and best practices needed to implement your chosen solution in a timely fashion and within your budget.

CRM or marketing automation platform integration

We have created cloud connectors for the leading sales and marketing platforms and have developed extensive knowledge on how these systems work with our data quality solutions. We enable your organization to implement best practices, allowing your business to verify, correct and append contact data at the point of entry. The result is your contact database contains records that are as genuine, accurate and up-to-date as they can possibly be.

Custom services

Our engineers have years of experience creating, implementing and supporting data quality services in many different programming languages. As a result, we can customize our existing services to solve a challenge that is specific to your business. Our proactive support services team will work with your technical team to refine, test and implement the custom service to work for your business’ specifications.

These are just some of the ways we can help. For more information about how you can leverage our industry expertise and technical knowledge, contact us.

Three Things I am Thankful for This Thanksgiving

Thanksgiving in the United States is about much more than taking a couple of days off from work, eating too much turkey, or starting your holiday shopping. This holiday has always had a higher purpose: to reflect and give thanks every year. So what am I thankful for?

There are three things on my list. First and foremost, I am thankful for you, our customers. Obviously every CEO is thankful for paying customers, but I also mean it from a different perspective: you are a very diverse and talented group of people, and we enjoy working with you. Our customers run the whole gamut of applications: increasing museum attendance through demographic analysis, creating precise address lists to expand rural broadband coverage, geocoding maps of hurricane refugees, and much more. It is truly never boring to come to work here.

Which leads me to the next thing on my list: I am thankful that Service Objects is a great place to work. We began as a small startup in 2001, but today we are a growing organization whose management team has over a century of combined experience in their respective fields. And we hire really cool people! We have a strong culture that emphasizes teamwork, healthy living, and environmental awareness. And who wouldn't be thankful for coming to work in sunny Santa Barbara, California every morning?

More important, everyone here is extremely good at what they do, whether it is technology, service, or operations. We operate in a very competitive space, and quite frankly the talent of our people – and the clear service culture people experience when they work with us – is our biggest competitive differentiator. If you look at the many customer success stories on our website, you will find clients raving about our technology AND our people, and I am very proud of that.

Finally, one of the biggest things I am thankful for is the chance to make a real difference in the environment we share on this planet. Many people know that I first started this company in response to the problem of excessive junk mail in the waste stream, developing mathematical models for ways to mitigate this massive waste of resources. Our focus on real-time contact validation, and all the products and services that follow from it, has always had its roots in resource conservation.

Today, we are proud to have processed nearly three billion transactions. This translates directly to less waste and more efficient use of our precious resources, saving 23 million gallons of water and 11 thousand cubic yards of landfill space each and every year. Since Service Objects’ inception in 2001 this adds up to 12 million trees and 150 million pounds of paper saved – and counting. This is a legacy we frankly cherish as much as any quarterly sales figures.

SO what are you thankful for?

Hope you have a very happy Thanksgiving holiday!

Service Objects Launches Newly Redesigned Website

Service Objects is excited to announce that we have launched a newly redesigned website. The redesign effort was undertaken to enhance the user experience and features a new graphical feel, enhanced content, and improved technical functionality. Visitors can now more quickly find information on how Service Objects' contact validation solutions solve a variety of challenges in global address validation, phone validation, email validation, eCommerce and lead validation. Free trial keys for all 23 data quality services can also be readily accessed.

As part of the launch, Service Objects also made significant updates to its data quality and contact validation blog, which contains hundreds of posts on topics such as fraud protection, address validation and verification, data quality best practices, eCommerce, marketing automation, CRM integration and much more. New content is published weekly and visitors can subscribe to have new content and updates sent to them directly.

“The recent launch of DOTS Address Validation International and DOTS Lead Validation International has firmly established Service Objects as the leader in global intelligence,” said Geoff Grow, CEO and Founder, Service Objects. “We redesigned our website to more prominently communicate Service Objects’ expertise in the global intelligence marketplace and continue to reinforce what is most important to our customers: in-depth developer resources, guaranteed system availability, 24/7/365 customer support and bank grade security.”

New features also include three ways to connect with our services: API integration, Cloud Connectors or sending us a list.  We hope you will take a look at our new website and blog and send us your feedback at

Service Objects integrations can help improve your contact data quality, help with data validation, and enhance your business operations.

API Integration: Where We Stand

Application programming interfaces, or APIs, continue to be one of the hottest trends in application development, growing in usage by nearly 800% between 2010 and 2016, according to a 2017 survey from API integration vendor Cloud Elements. Understandably, this growth is fueling increased demand for API integration, in areas ranging from standardized protocols to authentication and security.

API integration is a subject near and dear to our hearts at Service Objects, given how many of our clients integrate our data quality capabilities into their application environments. Using these survey results as a base, let’s look at where we stand on key API integration issues.

Web service communications protocols

This year’s survey results bring to mind the old song, “A Little Bit of Soap” – because even though the web services arena has become dominated by representational state transfer (REST) interfaces, used by 83% of respondents, a substantial 15% still use the legacy Simple Object Access Protocol (SOAP) – a figure corroborated by the experiences of our own integrators.

This is why Service Objects supports both REST and SOAP across most if not all of our services. We want our APIs to be flexible enough for all needs, we want them to work for a broad spectrum of clients, and we want the client to be able to choose what they want, whether it is SOAP or REST, XML or JSON. And there are valid arguments for both in our environment.

SOAP is widely viewed as more cumbersome to implement than REST; however, tools like C# in Visual Studio can do most of the hard work of SOAP for you. Conversely, REST – being URL HTTP/GET focused – carries a higher risk of creating broken requests if care is not taken. Addresses, a key component in many of our services, often contain URL-breaking special characters. SOAP inherently protects these values, while a REST GET call will produce broken URLs if the values are not properly encoded. For many clients, it is less about preference and more about the tools available.
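To make the encoding point concrete, here is a minimal sketch in Python (the address and parameter names are made up for illustration, not our actual API's) of how a client should percent-encode values before building a REST GET request:

```python
from urllib.parse import urlencode

# An address containing URL-breaking characters: '#', '&', and spaces.
params = {
    "Address": "123 Main St #4 & 1/2",
    "City": "Santa Barbara",
    "State": "CA",
}

# urlencode percent-encodes each value, so the resulting query string
# cannot break the request the way a raw '#' or '&' would.
query = urlencode(params)
print(query)
# Address=123+Main+St+%234+%26+1%2F2&City=Santa+Barbara&State=CA
```

Without this step, a raw '#' would be read as a URL fragment delimiter and a raw '&' as a parameter separator, silently truncating or splitting the address.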

Webhooks: The new kid on the block

Webhooks are the new approach that everyone wants, but few have implemented yet. Based on posting messages to a URL in response to an event, they represent a straightforward and modular alternative to polling for data. Citing figures from Wufoo, the survey notes that over 80% of developers would prefer this approach to polling. We agree that webhooks are an important trend for the future, and we have already created custom ones for several leading marketing automation platforms, with more in the works.
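Conceptually, a webhook consumer is just an HTTP endpoint that reacts to pushed events instead of polling for changes. A minimal sketch of the handler logic (the event type and field names here are hypothetical, not any specific platform's schema):

```python
import json

def handle_webhook(body: bytes) -> str:
    """React to a JSON event payload POSTed to our URL by the platform."""
    event = json.loads(body)
    # Instead of polling for changed records, we act only when told.
    if event.get("type") == "contact.updated":
        return "revalidate " + event["record"]["email"]
    return "ignored"

# Simulated POST body a marketing platform might push when a contact changes:
print(handle_webhook(
    b'{"type": "contact.updated", "record": {"email": "a@example.com"}}'))
# revalidate a@example.com
```

The modularity the survey respondents prefer comes from this shape: the producer only needs a URL to POST to, and the consumer only needs to parse the payload, with no shared polling schedule between them.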

Ease of integration

In a world where both applications and interfaces continue to proliferate, there is growing pressure toward easier integration between tools: using figures cited from SmartBear’s State of the APIs Report 2016, Cloud Elements notes that this is a key issue for a substantial 39% of respondents.

This is a primary motivation for us as well, because Service Objects’ entire business model revolves around having easy-to-integrate APIs that a client can get up and running rapidly. We address this issue on two fronts. The first is through tools and education: we create sample code for all major languages, how-to documents, videos and blogs, design reference guides and webhooks for various CRM and marketing automation platforms. The second is a focus on rapid onboarding, using multiple methods for clients to connect with us (including API, batch, DataTumbler, and lookups) to allow easy access while APIs are being integrated.

Security and Authentication

We mentioned above that ease of integration was a key issue among survey respondents – however, this was their second-biggest concern. Their first? Security and authentication. Although there is a move toward multi-factor and delegated authentication strategies, we use API keys as our primary security.

Why? The nature of Service Objects' applications lends itself well to using API keys for security because no client data is stored. Rather, each transaction is "one and done" in our system: once our APIs validate the provided data, it is immediately purged. And of course, Service Objects supports and promotes SSL over HTTPS for even greater protection. In the worst-case scenario, a fraudster who gained someone's key could run transactions on that client's behalf, but they would never have access to the client's data and certainly could not connect the dots between the client and their data.

Overall, there are two clear trends in the API world – explosive growth, and increasing moves toward unified interfaces and ease of implementation. And for the business community, this latter trend can’t come soon enough. In the meantime, you can count on Service Objects to stay on top of the rapidly evolving API environment.

The Talent Gap In Data Analytics

According to a recent blog by Villanova University, the amount of data generated annually has grown tremendously over the last two decades due to increased web connectivity, as well as the ever-growing popularity of internet-enabled mobile devices. Some organizations have found it difficult to take advantage of the data at their disposal due to a shortage of data-analytics experts. Small-to-medium businesses (SMBs), which struggle to match the salaries offered by larger companies, are the most affected. This shortage of qualified and experienced professionals is creating a unique opportunity for those looking to break into a data-analysis career.

Below is some more information on this topic.

Data-analytics career outlook

Job openings for computer and research scientists are expected to grow by 11 percent from 2014 to 2024. In comparison, job openings for all occupations are projected to grow by 7 percent over the same period. Besides this, 82 percent of organizations in the US say that they are planning to advertise positions that require data-analytics expertise. This is in addition to 72 percent of organizations that have already hired talent to fill open analytics positions in the last year. However, up to 78 percent of businesses say they have experienced challenges filling open data-analytics positions over the last 12 months.

Data-analytics skills

The skills that data scientists require vary depending on the nature of data to be analyzed as well as the scale and scope of analytical work. Nevertheless, analytics experts require a wide range of skills to excel. For starters, data scientists say they spend up to 60 percent of their time cleaning and aggregating data. This is necessary because most of the data that organizations collect is unstructured and comes from diverse sources. Making sense of such data is challenging, because the majority of modern databases and data-analytics tools only support structured data. Besides this, data scientists spend at least 19 percent of their time collecting data sets from different sources.

Common job responsibilities

To start with, 69 percent of data scientists perform exploratory data-analytics tasks, which in turn form the basis for more in-depth querying. Moreover, 61 percent perform analytics with the aim of answering specific questions, 58 percent are expected to deliver actionable insights to decision-makers, and 53 percent undertake data cleaning. Additionally, 49 percent are tasked with creating data visualizations, 47 percent leverage data wrangling to identify problems that can be resolved via data-driven processes, and 43 percent perform feature extraction, while 43 percent have the responsibility of developing data-based prototype models.

In-demand programming-language skills

In-depth understanding of SQL is a key requirement cited in 56 percent of job listings for data scientists. Other leading programming-language skills include Hadoop (49 percent of job listings), Python (39 percent), Java (36 percent), and R (32 percent).

The big-data revolution

The big-data revolution witnessed in the last few years has substantially changed the way businesses operate. In fact, 78 percent of corporate organizations believe big data is likely to fundamentally change their operational style over the next three years, while 71 percent of businesses expect the same resource to spawn new revenue opportunities. Yet only 58 percent of executives believe that their employer has the capability to leverage the power of big data. Nevertheless, 53 percent of companies are planning to roll out data-driven initiatives in the next 12 months.

Recruiting Trends

Companies across all industries are facing a serious shortage of experienced data scientists, which means they risk losing business opportunities to firms that have found the right talent. Common responsibilities among these professionals include developing data visualizations, collecting data, cleaning and aggregating unstructured data, and delivering actionable insights to decision-makers. Leading employers include the financial services, marketing, corporate and technology industries.

View the full infographic created by Villanova University’s Online Master of Science in Analytics degree program.

Reprinted with permission.

Getting the Most Out of Data-Driven Marketing

How well do you know your prospects and customers?

This question lies at the heart of what we call data-driven marketing. Because the more you know about the people you contact, the better you can target your offerings. Nowadays smart marketers are increasingly taking advantage of data to get the most bang from their marketing budgets.

Suppose that you offer a deal on a new razor, and limit the audience to adult men. Or take people who already eat fish at your restaurant on Tuesdays, and promote a Friday fish fry. Or laser-target a new lifestyle product to the exact demographic group that is most likely to purchase it. All of these are examples where a little bit of data analytics can make a big difference in the success and response rate of a marketing campaign.

According to UK data marketing firm Jaywing, 95% of marketers surveyed personalize their offerings based on data, although less than half currently measure the ROI of these efforts, and less than 10% take advantage of full one-to-one cross-channel personalization. But these efforts are poised to keep growing, notes their Data Management Practice Director Inderjit Mund: “Data availability is growing exponentially. Adopting best practice data management is the only way marketers can maintain a competitive advantage.”

Of course, data-driven marketing can also go sideways. For example, bestselling business author and television host Carol Roth once found herself peppered with offers for baby merchandise – including an unsolicited package of baby formula – even though she is not the least bit pregnant. Her suspicion? Purchasing baby oil regularly from a major chain store, which she uses in the shower, made their data wonks mistakenly think that she was a new mother. Worse yet, this kind of targeted marketing also led the same chain to unwittingly tip off a father that his daughter was pregnant.

This really sums up the promise, and the peril, of using data to guide your marketing efforts. Do it wrong, and you not only waste marketing resources – you risk appearing inept, or worse, offending a poorly targeted segment of your market base. But when you do it right, you can dramatically improve the reach and efficiency of your marketing for a minimal cost.

This aligns very closely with our view of a marketing environment that is increasingly fueled by data. Among the best practices recommended by Jaywing for data-driven marketing, data quality is front and center with guidelines such as focusing on data management, having the right technology in place, and partnering with data experts. And they are not alone: according to a recent KPMG CEO survey, nearly half of respondents are concerned about the integrity of the data on which they base decisions.

There is a clear consensus nowadays that powering your marketing with data is no longer just an option. This starts with ensuring clean contact data, at the time of data entry and the time of use. Beyond that, smart firms leverage this contact data to gain customer insight in demographic areas such as location, census and socioeconomic data, to add fuel to their address or email-based marketing. With cost-effective tools that automate these processes inside or outside of your applications, the days of scattershot, data-blind marketing efforts are quickly becoming a thing of the past.


How to Use DOTS Email Validation 3

The DOTS Email Validation 3 (EV3) service has been designed to be robust enough to accommodate the particular needs of a detail-oriented programmer and simple enough to be used by a marketing assistant who needs to run an email campaign. The service can meet various needs that essentially narrow down to two use cases: form validation, and post-processing jobs such as batches and database hygiene. Before we discuss those two cases, we will first go over the recommended service operation and review some of the important result fields.

Which operation should I use?

The recommended service operation for EV3 is the ValidateEmailAddress method. This operation performs real-time server-to-server email verification. It lets the user specify a timeout value, in milliseconds, for how long the service may take to perform real-time server checks. A minimum value of 200 milliseconds is required; however, results depend on the network speed of an email's host, which may require several seconds to verify. Average mail server response times are approximately 2-3 seconds, but some slower mail servers may take 15 seconds or more to verify.

Please note that the above information is also available in the service developer guide.
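As a sketch, a ValidateEmailAddress request might be assembled like this. The input names (EmailAddress, AllowCorrections, Timeout, LicenseKey) match those covered later in this article; the endpoint URL is a placeholder assumption.

```python
from urllib.parse import urlencode

# Placeholder endpoint; the real service path is in your Welcome email.
ENDPOINT = "https://example.serviceobjects.com/EV3/api.svc/ValidateEmailAddress"

def build_validation_request(email: str, license_key: str,
                             timeout_ms: int = 2500,
                             allow_corrections: bool = True) -> str:
    """Build a GET request URL for the ValidateEmailAddress operation."""
    if timeout_ms < 200:
        raise ValueError("service requires a timeout of at least 200 ms")
    params = {
        "EmailAddress": email,
        "AllowCorrections": str(allow_corrections).lower(),
        "Timeout": str(timeout_ms),  # integer as a string, in milliseconds
        "LicenseKey": license_key,
    }
    return ENDPOINT + "?" + urlencode(params)
```

The 2.5-second default suits form validation; a batch job would pass a much longer timeout, as discussed later.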

Understanding the results

The service returns many results that can be used to meet a programmer’s particular email validation needs, but the easiest way to determine if an email should be accepted or rejected is by looking at either the IsDeliverable value or the Score value.


For most cases it is recommended to use the Score along with other output values to cater to your particular needs. Here are the possible score values.

0 (Email is Good): Indicates with high confidence that the email address is deliverable and good. The email address was verified with the host mail server and no malicious warnings were found.

1 (Email is Probably Good): Indicates that the email is deliverable, but one or more lesser warnings were found. For example, the email may be a potential alias or a role address, which are sometimes used as disposable addresses.

2 (Unknown): Indicates that not enough information was available to determine deliverability and integrity. Unknowns most commonly occur for slow mail servers that do not respond to the web service in time. They also occur for catch-all mail servers and greylists.

3 (Email is Probably Bad): Indicates that one or more warnings were found, such as a potential vulgarity or a string of garbage-like characters.

4 (Email is Bad): Indicates with high confidence that the email address is bad and/or undeliverable. Occurs for email addresses that fail critical checks such as syntax validation and DNS verification, most commonly when the host mail server verified that the email does not exist. Also occurs for deliverable email addresses that are known spam traps or bots.


The simplest way to use the service is to look at the IsDeliverable field. This field will return true, false or unknown. If your primary concern is to be able to send out email with the lowest possible chance of a hard bounceback then this field alone will suffice. However, this field does not take spamtraps, vulgarities, bots or other factors into consideration. It simply indicates if the service was able to verify the deliverability of an email address with the host mail server. It does not measure the overall integrity of the email address.

If you choose to only look at one result value then it is our recommendation that you use the Score value instead of the IsDeliverable value. The Score evaluates the overall integrity of the email address and not just its deliverability. Either one of these fields can be used in conjunction with other result values to more intelligently evaluate an email address if the need arises. For example, if an email comes back as unknown in either the Score or in IsDeliverable, then we can refer to the following outputs to help us decide if we should accept, reject or retry the email address.
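The decision logic described above can be sketched as a small triage function. The thresholds follow this article's guidance; treat them as a starting point, not a definitive policy.

```python
def triage(score: int, is_catch_all: str = "false") -> str:
    """Map the documented 0-4 Score values to a simple disposition."""
    if score in (0, 1):
        return "accept"  # good, or probably good
    if score == 2:
        # Unknowns from greylists, busy servers or timeouts often resolve on
        # a later pass; unknowns from catch-all servers will not, so review.
        return "review" if is_catch_all == "true" else "retry"
    if score == 3:
        return "review"  # probably bad: set aside or reject, per strictness
    return "reject"      # score 4: verified bad
```

A stricter policy might also send score 1 to review; the point is that Score plus one or two supporting fields is enough to drive the whole decision.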


Returns true, false or unknown to indicate if the email’s host mail server was responsive at the time of the check. This is one of the service’s critical checks. If this value comes back false, it will be reflected in the IsDeliverable value and in the score. Refer to this value if the email is unknown. If the value for this field is also unknown, then the service most likely did not have enough time to finish verifying the email address with its host mail server. In these cases the service will continue trying to verify the email in a background process even after the request has finished. Chances are high that if you wait one or more hours and check the email again, the service will have finished verifying the email address with the host mail server.


Returns true, false or unknown to indicate if the email’s host mail server is a catch-all. A catch-all mail server will say that an email address is deliverable even if it is not, because catch-all servers do not reject email addresses during the initial SMTP session. This means a catch-all mail server cannot be trusted to verify deliverability: it may not reject an address until after a message is sent. If an email address is unknown and this value is false, then chances are good that the service will be able to verify its deliverability if the email is checked again later. If the value is true and there are no warnings, then we know the mail server is good and the email does not appear to be bad; in general this scenario gives roughly a 55% chance that the email is deliverable and won’t result in a hard bounce.


Returns true, false or unknown to indicate if the service was able to verify the email address with its host mail server. This value can be treated similarly to the IsDeliverable value. A true value indicates that the email address is deliverable. If the value comes back false, the mail server verified that the email is undeliverable; a false will be accompanied by the warning flag ‘Email is Bad – Subsequent checks halted.’ Some common reasons why this value returns unknown: the mail server is a catch-all, the service ran out of time communicating with the host mail server, or the host mail server used a defensive tactic such as a greylist.

A complete list of the output fields and values is available in the service developer guide.

The result fields given above are useful when it comes to sorting, grouping and filtering all of your validated email addresses. This is useful when working on a post-processing email job, which we will discuss later. Next, we will look at some of the descriptive flags that the service will return. These flags can be used programmatically or at a glance to determine the status of an email address.

Warning Codes & Descriptions:

There are many warning flags that the service may return but we will look at some of the more common and critical ones.

DisposableEmail, SpamTrap, KnownSpammer and Bot

An email address may be deliverable but if one or more of these warning flags is returned then it is highly recommended to reject it.

Alias, Bogus and Vulgar

If one of these warning flags is returned then you may want to either reject the email or set it aside for later review, depending on how strict you want to be.

InvalidSyntax, InvalidDomainSpecificSyntax and InvalidDNS

These are warnings for critical checks that failed. If one of these flags appears, it will be immediately followed by the warning flag ‘Email is Bad – Subsequent checks halted.’

Email is Bad – Subsequent checks halted

This warning indicates that the email failed a critical check and is undeliverable. If the flag is not preceded by one of the critical warning flags then it simply means that the email’s host mail server verified that the email address is undeliverable.

A complete list of warning codes and their descriptions is available in the developer guide.
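A simple way to act on these warning flags programmatically is to bucket them, as in this sketch. The groupings mirror the recommendations above; how the flags arrive in your response is up to your integration.

```python
# Warning flags named in this article, grouped by recommended handling.
HARD_REJECT = {"DisposableEmail", "SpamTrap", "KnownSpammer", "Bot"}
NEEDS_REVIEW = {"Alias", "Bogus", "Vulgar"}

def classify_warnings(flags):
    """Decide what to do with an email based on its warning flags (a set)."""
    if flags & HARD_REJECT:
        return "reject"  # deliverable or not, these are not worth keeping
    if flags & NEEDS_REVIEW:
        return "review"  # reject or set aside, depending on strictness
    return "ok"
```

Critical-check failures need no special handling here, since they already drive the Score to 4.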

Note Codes & Descriptions:

The note flags will return descriptive information about the email, not all of which will affect the score, but we will focus on the ones that will explain why some email addresses came back as unknown.


The service is good at detecting greylist behavior from mail servers and has procedures in place to avoid it, but not all greylists are avoidable. If the service encounters a greylist, it is temporarily unable to verify the email address with its host mail server. In that case, chances are good that you will get a better response if you validate the email again a couple of hours later.


This flag indicates that the service was able to connect to the email’s host mail server, but the server was temporarily busy or unavailable and could not verify the email for us. If you encounter this flag, try validating the email again a few hours later to see if the server is more responsive.


This flag indicates that the service was unable to establish a connection with a host mail server. Possible reasons for the connection failure include the mail server being completely offline or responding too slowly to answer in time. Some mail servers are configured to respond slowly, taking as long as 60 seconds to accept a connection; this behavior is rare but not unheard of. If an email returns this flag, try entering a longer timeout to allow the service the time it needs to verify the email.


This flag indicates that the service was unable to finish verifying the email address with the host mail server in the time allowed. The mail server could be responding very slowly, or the timeout given to the service was too short. If an email returns this flag, try entering a longer timeout to allow the service the time it needs to verify the email.

A complete list of note codes and their descriptions is available in the developer guide.

Use case 1 – using ValidateEmailAddress for form validation

The ValidateEmailAddress method has four input fields that are all required.

EmailAddress: The email address you wish to validate.

AllowCorrections: Accepts true or false. If true, the service will attempt to correct the email address; if false, the address is left unaltered. The majority of corrections are performed on the domain; the local part of the address (the portion before the @ symbol) is generally left untouched.

Timeout: Accepts an integer as a string, representing milliseconds. Do not include commas or non-numeric values. This value specifies how long the service is allowed to wait for all real-time network-level checks to finish; these consist primarily of DNS and SMTP verification. A minimum value of 200 ms is required. For form validation it is recommended to use a timeout short enough not to keep your user impatiently waiting, but long enough to allow the server-to-server communication to finish. A relatively short timeout of 2 to 4 seconds is generally recommended.

LicenseKey: Your license key to use the service.

Accept, reject or review & retry


Accept: Emails with a score of 0, 1 or 2. In general it is recommended not to be too strict when accepting emails in a form, because you do not want to lose a potential end user. Also, an end user may become agitated if they have to wait more than 5 seconds for the validation process to complete, and some slow mail servers cannot respond in that short amount of time.


Reject: Emails with a score of 3 or 4. If you do not want to be too strict, you can set a 3 aside for review, but you should always reject an email that receives a score of 4.


Review & retry: Depending on how strict or cautious you want to be, you can choose not to initially accept emails with a score of 2 and instead set them aside for review. If the IsCatchAllDomain field is not true, you can try validating the email again later. Email addresses that return a score of 3 can also be set aside for review if you do not want to reject all of them outright. An email is commonly given a score of 3 if a potential vulgarity or string of garbage characters is found.

In form validation the programmer is sometimes allowed some luxuries while others are taken away. For example, a programmer may have the opportunity to communicate a result back to the end user, but is usually restricted to a shorter timeout so that the end user is not kept waiting too long. If you can communicate back to the end user, ask them to check for a typo and try again, or to try a different email address. If you don’t want to accept a role or alias type email address, because these are commonly not accepted by mass email marketers, you can check for that and ask the user to try again with a different address.
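Putting the form-validation guidance together, a sketch of the accept/reject decision with a user-facing message might look like this. The messages and thresholds are illustrative, not prescribed by the service.

```python
def form_feedback(score: int, warnings) -> tuple:
    """Turn a validation result into (accept, message) for a signup form.
    Score thresholds follow this article; warnings is a set of flag names."""
    if score == 4:
        return (False, "That email address appears undeliverable. "
                       "Please check for typos and try again.")
    if score == 3 or warnings & {"Alias", "Bogus", "Vulgar"}:
        return (False, "We could not accept that address. Please try another.")
    return (True, "")  # scores 0-2: accept rather than risk losing the user
```

Because the user is waiting, this flow pairs naturally with the short 2-4 second timeout recommended above.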

Use case 2 – using ValidateEmailAddress for batches, email campaigns and data hygiene

The ValidateEmailAddress method has four input fields that are all required.

EmailAddress: The email address you wish to validate.

AllowCorrections: Accepts true or false. If true, the service will attempt to correct the email address; if false, the address is left unaltered. The majority of corrections are performed on the domain; the local part of the address (the portion before the @ symbol) is generally left untouched. Since you are unable to ask a user to re-enter the address and try again, you can set this value to true and allow the service to make corrections.

Timeout: Accepts an integer as a string, representing milliseconds. Do not include commas or non-numeric values. This value specifies how long the service is allowed to wait for all real-time network-level checks to finish; these consist primarily of DNS and SMTP verification. A minimum value of 200 ms is required. For non-form validation it is recommended to give the service plenty of time to verify an email address with its host mail server. Most mail servers take only about 2 seconds on average, but for the occasional slow mail server that requires more time it is recommended to set the timeout to 65 seconds. The number of mail servers that require this much time is generally minimal, so the long timeout should not have a big impact on the overall batch job.

LicenseKey: Your license key to use the service.

Accept, reject or review & retry


Accept: Emails with a score of 0 or 1.


Reject: Emails with a score of 3 or 4. If you do not want to be too strict, you can set a 3 aside for review, but you should always reject an email that receives a score of 4.


Review & retry: Emails with a score of 2, unless the IsCatchAllDomain field value is true. An email that receives an unknown score due to a greylist, timeout or temporarily busy server should be checked again a couple of hours later.
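For a batch pass, the accept/reject/retry rules above can be sketched as follows, assuming a validate() callable that wraps the service and returns a (score, is_catch_all) pair — an assumption for illustration.

```python
def run_batch(emails, validate):
    """Sort emails into dispositions using a validate(email, timeout_ms)
    callable that returns (score, is_catch_all) from the service."""
    accepted, rejected, review, retry_later = [], [], [], []
    for email in emails:
        score, is_catch_all = validate(email, timeout_ms=65000)  # generous timeout
        if score in (0, 1):
            accepted.append(email)
        elif score == 2:
            # Greylists, busy servers and timeouts often resolve on a later
            # pass; catch-all domains will not, so review those instead.
            (review if is_catch_all == "true" else retry_later).append(email)
        elif score == 3:
            review.append(email)
        else:
            rejected.append(email)
    return accepted, rejected, review, retry_later
```

The retry_later bucket would be re-run a couple of hours later, exactly as the guidance above suggests.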

If you would like to discuss your particular use case for recommendations and best practices contact us!

Getting Started with Service Objects

Service Objects has worked hard to make testing our APIs as simple as possible, and this in-depth guide to getting started will have you prepped for whenever you are ready. To get the ball rolling, simply fill out the “Free API Trial Key” form for the service you are interested in testing. This form is located on the right side of each of our product pages.

If you are an Engineer/Programmer and it’s your first time signing up, you will receive an email confirming your registration. Shortly after, you will receive your Welcome email with the Trial Key and testing information. The Welcome email can be broken down into four main parts: the sample code downloads section, our detailed developer guides, sample input data downloads, and the service’s endpoint. All this information will help you get started testing quickly and smoothly.

Sample Code – We have made it our mission to provide sample code in a majority of the most widely used programming languages. This includes Ruby on Rails, Java, Python, NodeJS, C#, and many others. If your desired programming language is missing from our repository, please feel free to reach out to us. We are more than happy to provide integration advice and impart our best practices and procedures.

Within each set of sample code you will find our recommended methods of obfuscating your license key, setting request timeouts, response/error handling, and failover logic. Applying these methodologies to your code will help to ensure security and service uptime.
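As an example of the failover logic mentioned above, a client can try a primary endpoint and fall back to a backup on any error. The URLs would be placeholders; the pattern, not the addresses, is the point of this sketch.

```python
import urllib.request

def _http_get(url, timeout=5.0):
    """Default fetcher: plain HTTPS GET with a request timeout."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read()

def call_with_failover(urls, fetch=_http_get):
    """Try each endpoint in order (primary first, then backup) and return
    the first successful response."""
    last_error = None
    for url in urls:
        try:
            return fetch(url)
        except Exception as exc:  # network error, timeout, HTTP error
            last_error = exc      # remember it and try the next endpoint
    raise RuntimeError("all endpoints failed") from last_error
```

The fetch parameter is injectable so the failover path can be exercised without touching the network.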

Developer Guide – As the name implies, this is where developers (and others) can go to get into the nitty gritty of the service. This is where you can find detailed explanations for each of the inputs and outputs. The fastest way to understand the service outputs is to approach the developer guide with a clear understanding of your business logic. With your goal in mind you can make note of the various note codes, description codes, scores, and other outputs then handle the service response accordingly.

Sample Input Data – Need a data set to test with? We provide input files with records that match the operations input parameters. Running these records will result in varying service responses. These responses can be used to gain an understanding of what will be returned by the service and how the fields can be leveraged to fit your business’s needs.

Service Endpoint – The Service Objects DOTS web services allow you to make both GET and SOAP/POST requests. Clicking the service path link in your Welcome email takes you to the main landing page for the particular service you signed up for. From there you can click on your preferred operation, plug in data, add your license key and click Invoke. These landing pages act as both a quick lookup tool and an informative page showing the various methods of calling the service; the query string and path parameter endpoints are described there. If you prefer to consume a file and have all your classes and clients auto-generated, we also provide a WSDL.

Additionally, if you prefer to have us run the results for you, you can also upload your list (up to 500 records) and we will send the results back to you.

Now that you’ve read how easy it is to get started with Service Objects’ APIs, we look forward to assisting with your data needs!

How Millennials Will Impact Your Data Quality Strategy

The so-called Millennial generation now represents the single largest population group in the United States. If they don’t already, they will soon represent your largest base of customers, and a majority of the work force. What does that mean for the rest of us?

It doesn’t necessarily mean that you have to start playing Adele on your hold music, or offering free-range organic lattes in the company cafeteria. What it does mean, according to numerous social observers, is that expectations of quality are changing radically.

The Baby Boomer generation, now dethroned as the largest population group, grew up in a world of amazing technological and social change – but also a world where wrong numbers and shoddy products were an annoying but inevitable part of life. Generation X and Y never completely escaped this either:  ask anyone who ever drove a Yugo or sat on an airport tarmac for hours. But there is growing evidence that millennials, who came of age in a world where consumer choices are as close as their smartphones, are much more likely to abandon your brand if you don’t deliver.

This demographic change also means you can no longer depend on your father’s enterprise data strategy, with its focus on things like security and privacy. For one thing, according to USA Today, millennials couldn’t care less about privacy. The generation that grew up oversharing on Instagram and Facebook understands that in a world where information is free, they – and others – are the product. Everyone agrees, however, that what they do care about is access to quality data.

This also extends to how you manage a changing workforce. According to this article, which notes that millennials will make up three quarters of the workforce by 2020, dirty data will become a business liability that can’t be trusted for strategic purposes, whether it is being used to address revenues, costs or risk. Which makes them much more likely to demand automated strategies for data quality and data governance, and push to engineer these capabilities into the enterprise.

Here’s our take: more than ever, the next generation of both consumers and employees will expect data to simply work. There will be less tolerance than ever for bad addresses, mis-delivered orders and unwanted telemarketing. And when young professionals are launching a marketing campaign, serving their customers, or rolling out a new technology, working with a database riddled with bad contacts or missing information will feel like having one foot on the accelerator and one foot on the brake.

We are already a couple of steps ahead of the millennials – our focus is on API-based tools that are built right into your applications, linking them in real time to authoritative data sources like the USPS as well as a host of proprietary databases. They help ensure clean data at the point of entry AND at the time of use, for everything from contact data to scoring the quality of a marketing lead. These tools can also fuel their e-commerce capabilities by automating sales and use tax calculations, or ensure regulatory compliance with telephone consumer protection regulations.

In a world where an increasing number of both our customers and employees will have been born in the 21st century, and big data becomes a fact of modern life, change is inevitable in the way we do business. We like this trend, and feel it points the way towards a world where automated data quality finally becomes a reality for most of us.

What Does API Really Mean?

API stands for Application Programming Interface. It is a way for software applications to communicate with each other based on specific inputs and outputs. APIs are everywhere; nearly any application in existence can use them, and one API can even utilize one or more other APIs. What an API is has been defined and explained in any number of Google search results. What I want to talk about is what an API really is.

It is a way to reuse existing code. Writing reusable code is often a high priority when developers write code. If you are going to need a specific piece of functionality over and over again, why reinvent the wheel? Well, that is just what APIs help developers avoid, as well as making code more understandable and enabling them to be more productive, resulting in saved time and money.

The interchangeable nature of APIs makes switching one API out for another relatively simple. Improving a complex segment of code becomes almost as simple as unplugging a broken toaster and plugging in a new one.
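
To make the toaster analogy concrete, here is a minimal sketch in Python. The provider names and response fields are invented for illustration; the point is that application code written against a stable interface can swap one implementation for another with a one-line change.

```python
# Two interchangeable "validators" behind one shared interface.
# Both providers and their responses are hypothetical examples.

def validate_with_provider_a(address):
    """Pretend call to provider A's validation API."""
    return {"address": address.upper(), "valid": True}

def validate_with_provider_b(address):
    """Pretend call to provider B's validation API."""
    return {"address": address.title(), "valid": True}

def clean_address(address, validator=validate_with_provider_a):
    """Application code depends only on the interface, not the provider."""
    return validator(address)

# Swapping providers is a one-line change at the call site:
result = clean_address("27 e. cota st", validator=validate_with_provider_b)
```

Because the calling code never touches provider internals, replacing a broken or outdated implementation does not ripple through the rest of the application.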

The beauty of APIs is that they allow your organization’s developers to do what they do best. They don’t need to be experts in everything. Often organizations are sent on wild goose chases trying to figure out solutions in areas where they are not experts. At Service Objects, we want your business to continue being an expert at what you do and let us be your expert in the field of contact validation. Why attempt the heavy lifting of solving a problem that is not in your wheelhouse? Without spending an enormous amount of time and money, there is little chance that you will be able to reproduce the code that another API can give you out of the box.

At Service Objects, we have been developing our services since 2001, so when someone is purchasing any of our 23 APIs, they are using over 16 years of cumulative knowledge and expertise. And on top of that, we are going to keep learning and improving on our services so that our customers don’t have to. What this means is that your organization can be using the best code available in your applications and leverage the best practices we have developed without being an expert in the field of data validation.

For more information, or to obtain a free trial key for any of Service Objects’ Data Validation APIs, click here.

New CRM or ERP? Reduce Your Migration Risk

Birds and data have one thing in common: migration is one of the biggest dangers they face. In the case of our feathered friends, their annual migration subjects them to risks ranging from exhaustion to unfamiliar predators. In the case of your data, moving it to a new CRM or ERP system carries serious risks as well. But with the right steps, you can mitigate these risks, and preserve the asset value of your contact database as it moves to a new system.

In general, there are two key flavors of data migration, each with their own unique challenges:

The big bang approach. This involves conducting data migration within a small, defined processing window during a period when employees are not actively using the system – for example, over a long weekend or holiday break.

This approach sounds appealing to many sites because it is the quickest way to complete the data migration process. However, its biggest challenge involves data verification and sign-off: businesses seldom conduct a dry run before going live with migration, so the quality of migrated data is often compromised.

One particular issue is the interface between a new enterprise system and internal corporate systems. According to TechRepublic, enterprise software vendors still suffer from a lack of standardization across their APIs, with the result that every integration requires at least some custom configuration, leading to concerns about both data integrity and follow-on maintenance.

The trickle approach. Done with real-time processes, this approach is where old and new data systems run in parallel and are migrated in phases. Its key advantage is that this method requires zero downtime.

The biggest challenge with this approach revolves around what happens when data changes, and how to track and maintain these changes across two systems. When changes occur, they must be re-migrated between the two systems, particularly if both systems are in use. This means that it is imperative for the process to be overseen by an operator from start to finish, around the clock.

Beyond these two strategies, there is the question of metadata-driven migration versus content-driven migration – another major hurdle in the quest to migrate genuine, accurate, and up-to-date data. IT might be more focused on the location of the source and the characteristics of each column, whereas marketing depends upon the accuracy of the content within each field. According to Oracle, this often leads to content that does not match up with its description, and underscores the need for close inter-departmental coordination.

Above all, it is critical that a data validation and verification system be in place before moving forward with or signing-off on any data migration process. The common denominator here is that you must conduct data validation and verification BEFORE, DURING, and AFTER the migration process. This is where Service Objects comes into play.

Service Objects offers a complete suite of validation solutions that provide real-time data synchronization and verification, running behind the scenes and keeping your data genuine, accurate, and up to date.

One particular capability that is useful for data migration is our Address Detective service, which uses fuzzy logic to fill in the gaps of missing address data in your contact records, validates the result against current USPS data, and returns a confidence score – perfect for cleaning contact records that may have been modified or lost field data during the migration process.

Taking steps to validate all data sources will save your company time and extra money. With Service Objects data validation services, we’ll help you avoid the costs associated with running manual verifications, retesting, and re-migration. And then, like the birds, it will be much easier for you and your data to fly through a major migration effort.

Service Objects Lands on CIOReview’s Top 20 Most Promising API Solutions

Service Objects is very proud to have been recently selected as one of CIOReview’s Top 20 Most Promising API Solution Providers for 2016, judged by a distinguished panel comprised of CEOs, CIOs and VPs of IT, including CIOReview’s editorial board.

Now if you are reading this, you probably have one of two reactions: “Wow, that’s cool!” Or perhaps, “What’s an API?”

If it is the latter, allow us to explain. An API, short for an Application Programming Interface, is code that allows our data validation capabilities to be built into your software. Which means that applications ranging from marketing automation packages to CRM systems can reach into our extensive network of contact validation databases and logic, without ever leaving the application.

What this means for them is seamless integration, real time results and better data quality. Their databases have correct, validated addresses. Their leads are scored for quality, so they are mailing to real people instead of “Howdy Doody.” Their orders are scanned for potential fraud, ranging from BIN validation on credit cards to geolocation for IP addresses, so that you know when an order for someone in Utah is originating in Uzbekistan.

What this means for you is that the applications you use are powered by the hundreds of authoritative data sources available through Service Objects – even if you never see it. Of course, we have many other ways to use our products, including real-time validation of lists using our PC-based DataTumbler application, batch FTP processing of lists, and even the ability to quickly look up specific addresses via the Web. But we are proud of our history of providing world-class data validation tools to application developers and systems integrators.

Now, if APIs are old hat to you, this award represents something important to you too: it recognizes our track record within the developer community of providing SaaS tools with superior commercial applicability, data security, uptime and technical support. As a companion article in CIOReview points out, “Service Objects is the only company to combine the freshest USPS address data with exclusive phone and demographic data. Continuous expansion of their authoritative data sets allows Service Objects to validate billions of addresses and phone numbers from around the world, making their information exceptionally accurate and complete.”

There is much more coming in the future, for systems integrators and end users alike. Our CEO Geoff Grow shared with CIOReview that one key focus is “more international data, as many of our clients are doing business outside the United States and Canada … The European and Asian markets are becoming increasingly important places (and) it is important for us to expand our product offerings and our expertise in more regions of the world.” And of course, our product offerings continue to grow and expand for clients in each of the markets we serve.

If you are a developer, we make it easy to put the power of Service Objects’ data validation capabilities in your own applications. Visit our website for complete documentation and sample code, or download a free trial API key for one of our 25 data quality solutions. We know you will see why our peers rank us as one of the best in the industry!

Service Objects Provides Customized Sample Code

One of our primary goals as Application Engineers at Service Objects is to do whatever we can to ensure that clients and prospective clients get up and running with their DOTS validation service and programming language of choice. That’s why we have over 250 different pieces of sample code available to those who want to test our services!

But what if you are interested in integrating multiple services in your application?

Lucky for you, this commitment to getting the data hungry masses up and running with testing our services goes even further. We are dedicated to ensuring that you get the most out of the service(s) that you are testing and assisting with any integration related questions. One of the ways we do this is by writing custom sample code to help our clients and prospective clients integrate our services into their business logic.

What are some examples of custom sample code?

Well, I am glad you asked! Need some sample code that will run our NCOA service against 500,000 addresses in a couple of hours? No problem. Do you want to get geocode coordinates from the contact address that comes back from our DOTS Geophone Plus 2? We’ll write you some sample code that will get that done. Does a portion of your address data include a PO Box number reflected as the unit or suite? We can help you leverage the results from our DOTS Address Validation 3 service to programmatically identify those records. Need to use any of our DOTS validation products with asynchronous calls? We can certainly help with that as well.

There are a multitude of other ways our services can be combined to get you your desired result! If you’re interested in any DOTS validation products and need some assistance getting the intended result, please reach out to us here! We will gladly provide a consultation on how to best integrate your service (or services) of choice into your application, or we’ll go ahead and write a piece of sample code for you to illustrate best practices when calling a DOTS validation web service.

We’ve Raised the Bar to a 99.999% Uptime Guarantee

For many years, we’ve provided a Service Level Agreement with 99.995% availability guaranteed. This equates to less than 26 minutes of downtime annually, or less than 2 minutes and 11 seconds monthly. We’ve consistently achieved and exceeded this promise to our customers year after year, but wanted to take this commitment up a notch…

We’re excited to announce that we increased our Service Level Agreement to a 99.999% uptime guarantee, equating to less than 5 minutes of service downtime annually, or less than 26 seconds monthly!

What is a Service Level Agreement (SLA)?

SLAs make use of the knowledge of enterprise capacity demands, peak periods, and standard usage baselines to compose the enforceable and measurable outsourcing agreement between vendor and client. As such, an effective SLA will reflect goals for greater performance and capacity; productivity; flexibility and availability; and standardization.

At the same time, an SLA should set the stage for meeting or surpassing business and technology service levels, while identifying any gaps currently being experienced in the achievement of service levels.

SLAs capture the business objectives and define how success will be measured, and are ideally structured to evolve with the customer’s foreseeable needs. The right approach to SLAs results in agreements that are distinguished by clear, simple language, a tight focus on business objectives, and consideration of the dynamic nature of business to ensure evolving needs will be met.

How we do it


Redundancy and resiliency

Multiple data centers provide redundancy by using redundant components, systems, subsystems, or facilities to counter inevitable failures or disruptions. Our servers operate in a virtualized environment, each utilizing multiple power supplies and redundant storage arrays. Our firewalls and load-balancing appliances are configured in pairs, leveraging proven high-availability protocols, allowing for instantaneous fail-over.


Compliance

Compliance is an important benefit of professional data centers. In today’s business climate, data often falls under government or industry protection and retention regulations such as SSAE 16 standards, the Health Insurance Portability and Accountability Act, and the Payment Card Industry Data Security Standard. Compliance is challenging without dedicated staff and resources. With the third-party data center model, you can take advantage of the data center’s existing compliance and audit capabilities without having to invest in technology, dedicated staff, or training.

Data security & management

We’ve invested in “bank grade” security. Several of our data centers are guarded by five layers of security, including retinal scanners. All systems are constantly monitored and actively managed by our data center providers — both from a data security and a performance perspective. In addition, we operate our own in-house alerting and monitoring suites.

Geographic load balancing

Another key factor for ensuring uptime has to do with geographic load balancing and fail-over design. Geographic load balancing involves directing web traffic to different servers or data centers based on users’ geographic locations. This can optimize performance, allow for the delivery of custom content to users in a specific region, or provide additional fail-over capabilities.
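
As an illustration of the routing decision involved, here is a minimal sketch. The region names and endpoints are hypothetical, and real geographic load balancing is usually handled at the DNS or appliance layer rather than in application code like this.

```python
# Hypothetical datacenters and a preferred failover order.
DATACENTERS = {
    "us-west": "ws-west.example.com",
    "us-east": "ws-east.example.com",
}
FAILOVER_ORDER = ["us-west", "us-east"]

def pick_endpoint(user_region, healthy=None):
    """Prefer the user's own region; otherwise fail over to the
    next healthy datacenter in the preferred order."""
    healthy = healthy if healthy is not None else set(DATACENTERS)
    if user_region in healthy:
        return DATACENTERS[user_region]
    for region in FAILOVER_ORDER:
        if region in healthy:
            return DATACENTERS[region]
    raise RuntimeError("No healthy datacenter available")
```

The same decision, made at the DNS layer, is what lets traffic transparently shift to another facility when one datacenter goes dark.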

Ensuring a high level of uptime comes down to: redundancy and resiliency, compliance, geographic load balancing, great data security, and 24/7 monitoring. All of these factors are equally important and contribute to our 99.999% uptime results — guaranteed!

Python Tutorial

Python is a versatile, robust scripting language that can be used for a variety of projects and implementations. One thing that separates Python from other programming languages is its use of white space and indentation to separate different blocks of code, rather than the curly braces that languages like Java or C# use. This can be a polarizing issue for many developers, but it is simple enough to adapt to once you have some experience with it. For this tutorial, we’re going to look into making a RESTful web service call to our DOTS Phone Exchange 2 service to validate an international phone number. You will need the following to participate in this tutorial.

What you’ll need

  • Python installed on your test machine (2.7.11 is used in this tutorial)
  • IDE or text editor of choice. We’re using the community edition of PyCharm.
  • A DOTS product license key of choice. We’re using DOTS Phone Exchange 2 for this example.

Once you have all the necessary prerequisites installed, launch PyCharm and create a new project in your directory of choice. Once you have a project created, right-click on the project location in the solution and select “New” and then “Python File.”

For this project, we will use the following modules that we are importing at the top of the .py file. The “Tkinter” module will allow us to have a simple GUI where the user can enter phone number information to be validated. The “Tix” module allows us to use some handy labels, scrollbars and buttons to properly display the outputs from the service, and the “requests” and “xmltodict” modules will allow us to make an HTTP GET call and parse the results into a Python dictionary, respectively.

To add these modules in PyCharm, you will need to go into Settings from the File menu. Select Project, and then select Project Interpreter. Click the small green plus symbol in the upper right-hand corner; from there you can search for and install the necessary modules to run this project.

Now that we have all the necessary modules in place, let’s create a method that will eventually make the HTTP Get call. For now we will leave it empty as shown below.
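
A minimal version of that empty method might look like the following (reconstructed here, since the tutorial’s original code appears as a screenshot):

```python
# Method that will eventually make the HTTP GET call to the
# DOTS Phone Exchange 2 service. Left empty for now, as described above.
def PE2Int():
    pass
```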

There is nothing very exciting happening in that method at the moment, but we’re certainly going to change that. We’ll now create the elements necessary to take in the values that will make a successful call to the GetInternationalExchangeInfo operation of the DOTS Phone Exchange 2 web service. The service takes in three values as input: PhoneNumber, Country and LicenseKey. To create the necessary input elements, add the following bits of code to your Python file below the method definition.

This will give our GUI a title and 3 text boxes so that we can enter the necessary information to validate an international phone number. Notice at the bottom that once the button gets pressed, it will call the PE2Int method which we defined right above this code. Since we have the user interface all set up, we can go ahead and enter the code that will make the actual call to the web service and then display the results to the user.
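
The GUI setup described above might look roughly like this. The widget layout is an assumption based on the text, and it is shown with Python 3’s tkinter naming (the tutorial itself uses Python 2.7’s Tkinter/Tix); it is wrapped in a function here so the pieces stay self-contained.

```python
import tkinter as tk

def build_gui(on_submit):
    """Build a window with a title, three labeled text boxes for the
    service inputs, and a button wired to the PE2Int-style handler."""
    root = tk.Tk()
    root.title("DOTS Phone Exchange 2 - GetInternationalExchangeInfo")

    tk.Label(root, text="PhoneNumber").grid(row=0, column=0)
    phone_entry = tk.Entry(root)
    phone_entry.grid(row=0, column=1)

    tk.Label(root, text="Country").grid(row=1, column=0)
    country_entry = tk.Entry(root)
    country_entry.grid(row=1, column=1)

    tk.Label(root, text="LicenseKey").grid(row=2, column=0)
    key_entry = tk.Entry(root)
    key_entry.grid(row=2, column=1)

    # The button calls the handler defined earlier in the tutorial.
    tk.Button(root, text="Validate", command=on_submit).grid(row=3, column=1)
    return root, (phone_entry, country_entry, key_entry)
```

Calling `build_gui(PE2Int)` followed by `root.mainloop()` would display the window and hand control to the event loop.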

Web service call and failover configuration

For starters, we’ll take the inputs that a user will enter in the text boxes in the GUI and also instantiate the beginning part of the URL that we’ll be using to make a web service call. Additionally, we’ll utilize a feature of the “requests” module that allows the user to format the query string items in the URL in a more readable way. See below for the example. Our new code will also have a “primaryURL” and “backupURL” string, which we’ll talk about more when we get to implementing proper failover into the project. Add the following code to the program under the primary method.
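
A sketch of that setup is below. The endpoint URLs shown are placeholders, not verified Service Objects endpoints; substitute the URLs from your service documentation.

```python
# Placeholder endpoints for the primary and backup data centers; replace
# these with the URLs from the DOTS Phone Exchange 2 documentation.
primaryURL = "https://primary.example.com/pe2/web.svc/xml/GetInternationalExchangeInfo"
backupURL = "https://backup.example.com/pe2/web.svc/xml/GetInternationalExchangeInfo"

def build_inputs(phone_number, country, license_key):
    """Collect the three service inputs as a dict; the "requests" module
    formats a dict like this into a readable query string for us."""
    return {
        "PhoneNumber": phone_number,
        "Country": country,
        "LicenseKey": license_key,
    }
```

In the GUI handler, the three values would come from the text boxes (e.g. `build_inputs(phone_entry.get(), country_entry.get(), key_entry.get())`).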

The project will now implement a few try/except blocks that will call the DOTS Phone Exchange 2 web service and handle any potential errors that come up during the call. In this basic bit of code, we show how to call the service using the “requests” module and how to properly fail over to another data center in the event that the primary Service Objects data center is not responding as intended.
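
A sketch of that failover pattern follows. The response shape (an “Error” element carrying a “TypeCode”) is taken from the tutorial’s description; treat the exact field names as assumptions to confirm against the service documentation.

```python
def is_fatal_error(result):
    """A TypeCode of "3" signals an unhandled service error: fail over."""
    error = result.get("Error") or {}
    return error.get("TypeCode") == "3"

def call_service(primary_url, backup_url, params):
    # Third-party modules from the tutorial's prerequisites; imported here
    # so the pure error check above can be exercised on its own.
    import requests
    import xmltodict
    try:
        response = requests.get(primary_url, params=params, timeout=10)
        data = xmltodict.parse(response.content)
        # xmltodict wraps the payload in the XML root element; unwrap it.
        result = next(iter(data.values()))
        if is_fatal_error(result):
            raise RuntimeError("Unhandled error from primary data center")
        return result
    except Exception:
        # Primary failed or returned a fatal error: try the backup.
        response = requests.get(backup_url, params=params, timeout=10)
        return next(iter(xmltodict.parse(response.content).values()))
```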

As shown above, the code will call the primaryURL and then process the response depending on the results returned from the service. If an error is returned with a TypeCode of “3”, the code will throw an exception and then call the backupURL to receive a valid response from the service. A TypeCode of “3” indicates that something has gone wrong with the service and that it is throwing an “Unhandled Error.”

Proper error checking and failover implementation should be used to ensure that your business logic goes uninterrupted and can continue to be used in the event that the primary Service Objects data center is offline or not behaving as expected.

Displaying the results to the user

Now that our failover configuration is properly set up, we can work on displaying the results to the user. To do this, we’ll create Labels using the Tix module that will simply allow us to show the values that are returned from the service. Add statements in the primary-call and backup-call sections of the code resembling the following.
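
Those display statements might look something like the following sketch, shown with plain tkinter Labels rather than Tix and wrapped in a helper for clarity; adapt it to the fields your operation actually returns.

```python
import tkinter as tk

def show_results(parent, result, start_row=4):
    """Create one Label per field returned from the service, placed
    below the input widgets in the existing grid."""
    for offset, (field, value) in enumerate(sorted(result.items())):
        tk.Label(parent, text="%s: %s" % (field, value)).grid(
            row=start_row + offset, column=0, columnspan=2, sticky="w")
```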

Now it’s time for testing! For this example, we’re using the phone number of a hotel in Germany, but feel free to test any phone number you would like. This operation will also validate phone numbers from the US and Canada. Here is an example of the sample output from the service.

This concludes our Python tutorial. Please contact support with any questions or tutorial requests!

Data Quality in Marketing: Trends and Directions for 2017

Asking whether data quality is important to your marketing efforts is a little like asking if apple pie and motherhood are important – of course, the answer will always be “yes.” Recently, however, some interesting quantitative research was published that shed light on just *how* important it has become.

Marketing research firm Ascend2 performed a survey of 250 mostly senior people in marketing to see what they thought about data quality. Over 80% of the respondents were in management roles, with more than a quarter of the sample holding C-level positions. Fully half were associated with large companies with over 500 employees, and more than 85% were with companies of over 50 employees. Respondents were also equally split between short versus complex sales cycles.

The results showed very clearly that data quality has risen to become a critical consideration in marketing success. Here are some of their key findings:

Improving data quality is their most important strategic objective. With 62% of respondents rating this as their top objective, data quality now ranks far above more traditional marketing objectives such as improving marketing data analytics (45%), improving user experience (43%), optimizing the lead funnel (26%), and even acquiring an adequate budget (20%).

Data quality is also their biggest challenge. Respondents also ranked data quality as currently being their most critical challenge, in smaller numbers (46%) but in similar proportions to the other factors such as those mentioned above.

But things are getting better. Fully 83% of respondents feel that their marketing data strategy is at least somewhat successful at achieving objectives, with over one-third (34%) rating their own efforts as “very successful (best-in-class).” Similar numbers also feel that their tactical effectiveness is improving as well. While 14% feel that they have been unsuccessful in achieving objectives to some degree, only 3% consider themselves to be very unsuccessful.

Data quality is a downstream process. Respondents clearly favored cleaning up contact data versus constraining how it is collected. Nearly half (49%) felt that validating contact data was the most important tactic for improving marketing data quality, while less than a quarter (24%) felt that standardizing lead capture forms was important. Other upstream measures, such as standardizing the data upload process (34%) and developing segmentation criteria (33%), were also in the minority.

Call in the experts. An overwhelming majority of respondents (82%) outsource either some or all of the resources they use to improve marketing data quality, with over a quarter (26%) using no in-house resources at all.

The results of the survey clearly show that data quality is one of the largest challenges that marketers are currently dealing with. Whether you are frustrated with incomplete or inaccurate sales lead data, tired of bad contact data causing customer service issues, or wasting money on marketing campaigns with results negatively impacted by poor contact data, understanding the quality of your data is the first step in identifying the true costs that poor data quality is having on your organization.

Connecting the DOTS between Address Validation and Address Detective

Service Objects, Inc. has been standardizing and cleansing even the messiest of USPS-valid addresses for years. The core purpose of our DOTS Address Validation-US service is determining whether addresses are valid based on their ability to receive mail. However, we’ve occasionally come across addresses that were known to be good but just not known by the USPS. In recent months, we’ve been working on alternative data sets to identify addresses that may not be mailable but are in fact known to exist. DOTS Address Detective was pegged as the perfect landing spot for these hard-to-find addresses.

Address Detective is a sleuth service that can do many things to help with the messiest addresses by using other data points like name, business name, and phone number. One of its operations, FindAddressLines, doesn’t even require the normal clean address1, address2, city, state, zip format. Rather, it analyzes client input and determines the best match for each element. Clients with very messy, unknown, or even corrupted data sets can repair and validate addresses with this operation.

One of our most recently added operations, FindOutlyingAddresses, was created to help with those hard-to-find addresses that are not known by the USPS. They may be in general delivery areas where only the Post Office is known, or they might just be way out of the way, disconnected from most communication. Mammoth Lakes, CA is a well-known major General Delivery area: mail is not delivered to individual houses but to the community Post Office.

Although Address Detective is great for these hard-to-find addresses, DOTS Address Validation is still the best choice for most users and most addresses, returning the most robust data and needed data points. That is why we also needed a clean way to help our Address Validation clients know when they should check the FindOutlyingAddresses operation. It’s likely they will only be doing so a very small percentage of the time.

In order for our current clients to gain this insight without having to reintegrate a new operation or service, we have added an error type that will return only if the client wishes to see it. This new error type of ‘5’ will be linked to the client’s key and will only activate if requested.
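
In practice, a client might route on the new error type with a check like this sketch. The response shape is an assumption based on the description above; confirm the field names against the service documentation.

```python
def should_try_outlying(av_response):
    """Return True when Address Validation signals (via error type '5')
    that FindOutlyingAddresses may know this address."""
    error = av_response.get("Error") or {}
    return error.get("TypeCode") == "5"

# Hypothetical response for an address USPS doesn't know:
response = {"Error": {"TypeCode": "5", "Desc": "Address not found"}}
if should_try_outlying(response):
    pass  # call DOTS Address Detective's FindOutlyingAddresses here
```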

The address 3 Oak Tree Way, Mammoth Lakes, CA, 93546 is one of these addresses. In Address Validation, it returns an error indicating that the address could not be found.

But from our new operation, we can see that the house exists and is known. It’s a “Premise”-level match, meaning we found an exact match for the house.

With the new error message, a client will still get an error but it will be of type “5” meaning we know there is some sort of match in OutlyingAddresses that will provide more information about the address in question.

The normal error messaging will still apply for any address that we do NOT think is a good match for FindOutlyingAddresses. For example, the address 200 Greenwell Lane, Santa Barbara, CA, 93105 is out of range, but we also know that it will not be found by Address Detective.

This new operation is a good way to see what we may know about an address if it can’t be found by normal means.

If there are any addresses you have in question contact us, we’re always interested in researching what we can find to help improve your business processes!

Tech Support in the Age of Instant Gratification

We live in an age where an overabundance of information and resources is just a few clicks away. Even physical goods can be delivered to your front door the very same day you order. People want and expect similar convenience and response times when they need technical support.

The tech support experts here at Service Objects completely understand that. One of our core values is to offer outstanding customer support to our clients. We have a good day at the office when we can quickly and effectively answer questions about our services, resolve issues, and get the data-hungry masses up and running with their validation service of choice. To help ensure that our customers can get back to using their validation service for their business, we offer several avenues where people can seek support.

24/7 Phone Support

Do you have a pressing issue after hours? We understand that this can be exceedingly stressful and frustrating, and we want to help you get it resolved as quickly as possible. If you ever run into an after-hours support issue, call our office phone number (1.805.963.1700) and follow the prompts. Once directed, leave a message with a detailed description of the issue you are encountering and the best way to contact you, and a member of our team will typically contact you within 20 minutes.

LiveChat Through our Website

Have a quick question that you want answered right away? For example: What URL should you be using? What does this particular response from the service mean? Is this an expected response from the service? Are there any current issues occurring with the service? Is there a different endpoint I should hit for a different operation? Questions like these are ones we would be happy to answer in the LiveChat on our website. Simply navigate to our website during business hours and an agent will start a LiveChat with you as soon as one is available. Once they do, simply state the question or issue you are experiencing, along with pertinent account information, and we will happily assist in any way we can.

Support Tickets

The primary method we use to address and keep track of all our support inquiries is our support ticketing system. Whether you call in, use LiveChat or send us an email, most technical support issues will get sent to our ticketing system, and we’ll use it to quickly and effectively address any issues or questions you may have. To create a ticket, simply email us or click here and fill out the form. Feel free to use any of the above channels to contact us, and we’ll be glad to offer any support that we can!

Looking Beyond Simple Blacklists to Identify Malicious IP Addresses

Using a blacklist to block malicious users and bots that would cause you aggravation and harm is one of the oldest and most common methods around (according to Wikipedia, the first DNS-based blacklist was introduced in 1997).

There are various types of blacklists available. Blacklists exist for IP addresses, domains, email addresses and usernames. The majority of the time, these lists concentrate on identifying known spammers. Other lists serve a more specific purpose, such as IP lists that help identify known proxies, Tor nodes and VPNs, email lists of known honeypots, or lists of disposable domains.

There are many different types of malicious activity that occur on the internet and there are various types of lists out there to help identify and prevent it; however, there are also various problems with lists.

The problem with lists

To identify malicious activity with a list, the activity must first occur and then be reported and propagated. It is not uncommon for malicious activity to have stopped by the time it is reported and propagated. Not all malicious activity is reported, and if you encounter malicious activity before it is reported, you won’t be able to act on it preemptively.

IPs, domains, email addresses and usernames are dynamic and disposable. If malicious users or bots get blocked, they can easily switch to a different IP, domain and so on.

Some lists even warn that blocking a given IP address could affect thousands of legitimate users who depend on it to obtain crucial information they would otherwise not have access to. So block responsibly.

Aggregating data to more effectively identify malicious activity

Instead of looking at one list to perform a simple, straightforward lookup, we can take advantage of multiple datasets to uncover patterns and relationships between seemingly disparate values. A simple example would be relating usernames to email addresses, email addresses to domains, and domains to IP addresses, which allows us to view the activity of one value and compare it to the behavior of others. Using complex algorithms with machine learning to process large samples of data, we can intelligently discern whether a value is directly or indirectly related to malicious activity.
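As a toy illustration of this relationship idea (the data and structure here are invented for the sketch; a production system would use far larger datasets and machine-learned scoring rather than a hand-built graph):

```ruby
require 'set'

# Relate identifiers (username -> email -> domain -> IP) in a simple
# graph and flag any value directly or transitively linked to a
# known-malicious one. All data below is invented.
LINKS = {
  "spammer99"                  => ["spammer99@mailhost.example"],
  "spammer99@mailhost.example" => ["mailhost.example"],
  "mailhost.example"           => ["203.0.113.7"]
}
KNOWN_BAD = Set["203.0.113.7"]

def related_to_bad?(value, seen = Set.new)
  return true  if KNOWN_BAD.include?(value)
  return false if seen.include?(value)
  seen << value
  (LINKS[value] || []).any? { |neighbor| related_to_bad?(neighbor, seen) }
end
```

Here, “spammer99” gets flagged because it links, through its email address and domain, to a known-malicious IP, even though the username itself appears on no list.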

How Service Objects keeps it simple for the user

The DOTS IP Address Validation service currently has two flags to help its users deal with malicious IPs: ‘MaliciousIP’ and ‘PotentiallyMaliciousIP’. The ‘MaliciousIP’ flag indicates that the IP address recently displayed malicious activity and should be treated as such. The ‘PotentiallyMaliciousIP’ flag indicates that the IP address recently displayed one or more strong relationships to malicious activity and has a high likelihood of being malicious. Both flags should be treated as warnings, with the ‘MaliciousIP’ flag being scrutinized more severely.
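In practice, a caller might map the two flags to actions along these lines (a sketch only; the exact response fields your integration reads will depend on the service’s documented output):

```ruby
# Map the service's warning flags to an action for the caller.
# :block  -- recently observed malicious activity, treat most severely
# :review -- strong relationships to malicious activity
# :allow  -- neither flag present
def ip_risk_action(flags)
  if flags.include?("MaliciousIP")
    :block
  elsif flags.include?("PotentiallyMaliciousIP")
    :review
  else
    :allow
  end
end
```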

The warning signs of online fraud are out there, but you need a means of discovering them. Our DOTS IP Address Validation service encompasses many of the identification strategies necessary to make split-second decisions on would-be attackers before any harm is done.

Avoid the Cost of Inaccurate and Incorrect Sales and Use Tax Rates

There are two unavoidable consequences in life: death and taxes. And taxes are far and away the more complex of the two, particularly if you are in ecommerce or run a business making sales to others. Here are some of the issues you face:

  • Tax rates not only vary across municipalities, but by district or even address. For example, in one case in Arizona, one side of a particular street has a different tax rate than the other side!
  • Tax rates change constantly. According to CFO Magazine, there were nearly 800 changes to sales and use tax in the US in 2014.
  • Once upon a time, your concerns about collecting sales taxes ended at the state border. Not any more. In Colorado, the so-called “Amazon tax” law now compels businesses nationwide to collect sales tax on purchases by Colorado residents. It survived a recent Supreme Court challenge, and other states are now taking similar steps. Amazon, for example, now collects sales taxes for deliveries to 32 US states as well as the District of Columbia.

Within a given area, the applicable tax rates may include state, city, county, county district and/or city district taxes. The calculation of these rates becomes complicated when you add the geography of the buyer and seller to the mix.  Does the seller live in an unincorporated area? What tax rate(s) do you use for ecommerce? How do you know you are using the correct, current rate?
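To make the layering concrete, the total rate at a given location is essentially a roll-up of every jurisdiction layer that applies there. A quick sketch (the rates below are invented and are not the actual rates for any real location):

```ruby
# Invented example rates -- real rates vary by exact location and change often.
jurisdiction_rates = {
  state:         0.029,
  county:        0.010,
  city:          0.035,
  city_district: 0.001
}

total_rate  = jurisdiction_rates.values.sum.round(4)  # => 0.075
sale_amount = 100.00
tax_due     = (sale_amount * total_rate).round(2)     # => 7.5
```

Get any one layer wrong, or miss a district boundary, and every sale at that location is taxed incorrectly.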

The consequences of charging incorrect sales taxes are greater than ever. And not just from the authorities, but from John Q. Public. Nowadays anyone can check tax rates on their smartphone, which has led to expensive and embarrassing class-action lawsuits accusing major firms of overcharging on tax. Suits that have reached the court system in recent years have named high-profile defendants including Wal-Mart, Whole Foods, Papa John’s Pizza and Costco.

Meanwhile, state tax authorities have always asserted their authority in the case of tax errors. Since sales and use taxes represent a large portion of state revenue, states often dedicate resources to enforcement and can impose substantial penalties: California, for example, levies a 40 percent penalty on failures to pay sales or use tax. There are also costs associated with preparing for an audit and administering its results. Even when businesses err on the side of over-collecting sales and use taxes, the aftermath of an audit can involve time, manpower expense and the cost of refunding sales tax overpayments to customers, not to mention the customer service issues associated with the errors.

Given the complexity of sales and use taxes, as well as the sheer volume of tax jurisdictions and annual changes to tax law, automated tools are a must for most businesses – including small businesses. Given the diversity of these jurisdictions, it is particularly important to use tools that compute these taxes based on geolocation, rather than just coarse metrics such as ZIP codes. This includes geocoding specific addresses, and determining unincorporated areas that may fall outside normal municipal tax boundaries.

The best tax rate tools do the work for you by maintaining and updating sales and use tax databases, and integrating with your processes to determine the geolocation to apply the correct tax rates. For example, Service Objects’ DOTS FastTax takes over the hard work of tax validation and compliance, including accurate sales and use tax computation based on geolocation derived from street addresses as well as postal code information. FastTax then uses this location data to identify tax jurisdictions and eliminate problems associated with different rates in incorporated vs. unincorporated areas.  The tax rates are synchronized with the states and updated in real-time throughout the year.

For geolocation accuracy, FastTax also incorporates Service Objects’ flagship Address Validation capabilities for US and Canadian addresses.  Addresses are accurately resolved to provide a precise tax jurisdiction and total roll-up tax rate.  DOTS FastTax is available in four ways: as a real-time API integration, PC-based list processing, automated FTP-based list processing, and web-based Quick Lookups.

We all work in a challenging and complex environment for sales tax compliance, whose rate of change continues to accelerate. By putting this task in the hands of a good automation partner, you not only reduce your own workload and labor efforts, but protect yourself from the costs, penalties and reputation issues associated with compliance problems. The end result is a process that makes an inevitable part of your sales process a little less taxing.

A Commitment to Fanatical Customer Service Leads to “FindAddressLines”

Service Objects runs an agile Engineering team that likes to be ready to run with a great new idea at any given moment. We view it as one of the cornerstones of our fanatical customer service plan.

As soon as we learn about a challenge that a prospect or client is experiencing we’re excited to find a customized solution. A recent example of this is the release of a new operation for our Address Validation-US service called FindAddressLines.

DOTS Address Validation-US is one of our core services: it takes two address lines, city, state and postal code as inputs, and does an excellent job of cleaning and standardizing even grossly misspelled addresses. In a perfect world, data is collected and properly placed where it needs to be to facilitate validation. But we know the world isn’t perfect, and one particular client had lists of addresses with extra address lines (sometimes up to 5), or even key pieces of data entered into the wrong columns altogether. FindAddressLines was born initially to help this client – and many others moving forward – clean up these types of problematic issues.

Let’s take a look at how it works:

In the example above, the first three rows work as expected using Address1 and Address2 as the inputs. However, in row 5 the address returns a missing secondary number because the suite number fell into Address3.  By row 6, there is nothing to go on unless you are looking at Address4 and Address5 specifically.  Our FindAddressLines operation allows you to submit up to 10 lines (columns), including city, state and ZIP, and we do the work to make sure the right data makes it into the right locations.
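As a rough sketch of what preparing such a row for submission looks like (the input field names here are illustrative assumptions; consult the FindAddressLines documentation for the exact names):

```ruby
# A messy spreadsheet row: the suite number has drifted into a third
# address column. FindAddressLines accepts up to 10 lines plus city,
# state and postal code, and sorts out which line is which.
row = ["123 Main St", "", "Suite 500", "", "", "Anytown", "CA", "93101"]

payload = {}
row[0..4].each_with_index do |line, i|
  payload["Address#{i + 1}"] = line.to_s   # Address1..Address5
end
payload["City"], payload["State"], payload["PostalCode"] = row[5], row[6], row[7]
```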

Here’s an even messier example:

As we can see in row 4, the data was pushed out past the zip column. This can easily happen when importing data from a database to a spreadsheet if care isn’t taken for potential delimiters like commas.  With FindAddressLines, we are able to assign the extra columns as inputs and let the service figure it out.  The example in row 5 has a completely jumbled address that might have occurred from a corrupted database or just extremely messy data collection.  Again, we can use FindAddressLines to solve this one as well.

We enjoy talking to both current clients and prospects alike to determine what their needs are and what new features and services we can put together to help them improve their unique processes.  Our team is 100% committed to our customers’ success and can often rapidly put together a new solution to solve almost any problem you’re experiencing.

ERP: Data Quality and the Data-Driven Enterprise

Enterprise resource planning, or ERP for short, integrates the functions of an organization around a common database and applications suite. A brainchild of the late 20th century – the term was coined by the Gartner Group in the 1990s – the concept of ERP has grown to become ubiquitous for organizations of all sizes, in what is now an industry worth over US $25 billion annually.

ERP systems often encompass areas such as human resources, manufacturing, finance, supply chain management, marketing and customer relationships. Their integration not only automates many of the operations of these functions, but provides a new level of strategic visibility about your business. In the ERP era, we can now explore questions like:

  • Where your most productive facilities are
  • How much it costs to manufacture a part
  • How best to optimize your delivery routes
  • The costs of your back office operations
  • And many more

Its functions often interface with customer relationship management or CRM (discussed in a previous blog post), which provides visibility on post-sale customer interactions. CRM is often integrated within ERP product suites, adding market intelligence to the business intelligence of ERP.

ERP data generally falls into one of three categories:

Organizational data, which describes the infrastructure of the organization, such as its divisions and facilities. For most firms, this data changes very slowly over time.

Master data, which encompasses entities associated with the organization such as customers, employees and suppliers. This data changes periodically with the normal flow of business.

Transactional data, based on sales and customer interactions. This data, which is the lifeblood of your revenue pipeline, is constantly changing.

Note that two out of three of these key areas involve contact information, which in turn can come into the system from a variety of sources – each of which is a potential source of error. Causes of these errors can range from incorrect data entry to intentional fraud, not to mention the natural process of changing addresses, phone numbers and email addresses. And this bad data can propagate throughout the system, causing consequences that can include wasted manpower, incorrect shipments, missed sales and marketing opportunities, and more.

According to one research paper, data quality issues are often a key driver for moving to ERP, and yet remain a concern following ERP implementation as well. This leads to a key concept for making ERP work for you: automated systems require automated solutions for data quality. Solutions such as Service Objects’ data verification tools ensure that good data comes into the system in the first place, leveraging constantly updated databases from sources such as the USPS and others. The end result is contact data quality that doesn’t depend on human efforts, in a chain that has many human touch points.

ERP is part of a much larger trend in business computing towards centralized databases that streamline information flow, automate critical operations, and, more importantly, have strategic value for business intelligence. With the advent of inexpensive, cloud-based software, the use of these systems is spreading rapidly to businesses of all sizes. The result is a world that depends more than ever on good data quality – and the need for tools that ensure this quality automatically.

The Role of a Chief Data Officer

According to a recent article in Information Management, nearly two-thirds of CIOs want to hire Chief Data Officers (CDOs) over the next year. Why is this dramatic transformation taking place, and what does it mean for you and your organization?

More than anything, the rise of the CDO recognizes the growing role of data as a strategic corporate asset. Decades ago, organizations were focused on automating specific functions within their individual silos. Later, enterprise-level computing like CRM and ERP helped them reap the benefits of data interoperability. And today, trends such as big data and data mining have brought the strategic value of data front and center.

This means that the need is greater than ever for a central, C-level resource who has both a policy-making and advocacy role for an organization’s data. This role generally encompasses data standards, data governance, and the oversight of data metrics. A CDO’s responsibilities can be as specific as naming conventions and standards for common data, and as broad as overseeing enterprise data management and business intelligence software. They are ultimately accountable for maximizing the ROI of an organization’s data assets.

A key part of this role is oversight of data quality. Bad data represents a tangible cost across the organization, including wasted marketing efforts, misdirected product shipments, reduced customer satisfaction, and fraud, tax and compliance issues, among other factors. More importantly, without a consistent infrastructure for data quality, the many potential sources of bad data can fall through the cracks without insight or accountability. It is an exact analogy to how quality assurance strategies have evolved for manufacturing, software and other areas.

A recent report from the Gartner Group underscored the uphill battle that data quality efforts still face in most organizations: while those surveyed believed that data quality issues were costing each of them US $9.7 million annually on average, most are still seeking justification to address data quality as a priority. Moreover, Gartner concludes that many current efforts to remediate data quality simply encourage line-of-business staff to abandon their own data responsibilities. Their recommendations include making a business case for data quality, linking data quality and business metrics, and above all shifting the mindset of data quality practitioners from being “doers” to being facilitators.

This, in turn, is helping fuel the rise of the central CDO – a role that serves as both a policymaker and an evangelist. In the former role, their job is to create an infrastructure for data quality and deploy it across the entire organization. In the latter role, they must educate their organizations about the ROI of a consistent, measurable approach to data, as well as the real costs and competitive disadvantage of not having one – particularly as more and more organizations add formal C-level responsibility for data to their boardrooms.

Service Objects has long focused on this transition by creating interoperable tools that automate the process of contact data verification, for functions ranging from address and email validation to quantitative lead scoring. We help organizations make data quality a seamless part of their infrastructure, using API and web-based interfaces that tap into global databases of contact information. These efforts have quickly gained acceptance in the marketplace: last year alone, CIO Review named us as one of the 20 most promising API solution providers. And nowadays, in this new era of the Chief Data Officer, our goal as a solutions provider is to support their mission of overseeing data quality.

Ruby on Rails Integration Tutorial

Ruby on Rails (or more colloquially, “Rails”) is a server-side web application framework that provides its users a powerful arsenal to create web pages, web services and database structures. Rails utilizes a model-view-controller (or MVC for short) framework that provides an easy way to separate the user interface, database model and controller.

One of Service Objects’ highest priorities is getting our clients up and running with our DOTS Validation web services as quickly as possible.  One way we do this is by providing sample code and, occasionally, step-by-step instructions.  So if you are partial to using Rails, this tutorial will provide everything needed to get up and running.

What you will need

  • A DOTS license key; click here for a free trial key. We’re using DOTS Email Validation 3 for this tutorial.
  • Your favorite text editor (Sublime, Notepad++, etc.)
  • Ruby On Rails installed on your test machine.
  • Familiarity with the Command Line

Creating a default project

First, navigate via the command line to the directory in which you would like the project created.  Once the command line is in the desired directory, run the command “rails new EV3Tutorial”. You may name the project whatever you prefer, but this is the name we’ll use for this tutorial.

After the project creation is finished, you should have a brand new Ruby on Rails project.  You can launch the server by entering “rails server” into the command line from within the project directory.  The default port the project runs on is 3000; we’ll be using port 3001 for this project, which can be specified by launching the server with the command “rails server -p 3001”.

If you run the server and navigate to the applicable localhost URL where it has been launched, you’ll see the following page:

Ruby on Rails Tutorial Image One

This page is the default page for any new Rails project. From here, you can find more documentation about working with Rails.

Generating a model and controller

We’ll need a controller to make the actual call to the DOTS web service, and a model to pass information to our controller.  Simply type “rails generate controller request new show” and the necessary controller and views will be created: one view titled “new.html.erb” and the other “show.html.erb”.  The command line output should look similar to the following.

Ruby on Rails Tutorial Image 1

After this command is run, the controller folder and the views folder will have some new files created. It should look like the following:

The “new.html.erb” file will be our default page, so to make the application load it automatically open the routes.rb file in the config folder of the project and make the routes page look like the following.
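The screenshot for this step is not reproduced here, but a minimal routes.rb matching this description might look like the following sketch (adjust to the controller and action names your generator created):

```ruby
# config/routes.rb -- make the request form the default page
Rails.application.routes.draw do
  root "request#new"
  get  "request/new"
  get  "request/show"
end
```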

For this example, we’re using the “ValidateEmailAddress” operation, which has four separate inputs: EmailAddress, AllowCorrections, Timeout, and LicenseKey.   These, in turn, will be part of our model, which will pass values from the view to the controller.  So before we fill in the HTML page, we’ll create the model and include these values in its definition.

Enter the command “rails generate model request” in the command line. The values in the model can be specified via the command line, but it is usually easier to enter them in the model file. Locate the model schema in the following folder:

To add values to the model, make the model schema on your file look like the following.
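The screenshot is omitted here, but a migration with the four inputs added as string columns might look like this sketch (the column names are assumptions mirroring the service inputs; your file’s timestamp and Rails version will differ):

```ruby
# db/migrate/<timestamp>_create_requests.rb
class CreateRequests < ActiveRecord::Migration[5.2]
  def change
    create_table :requests do |t|
      t.string :email_address
      t.string :allow_corrections
      t.string :timeout
      t.string :license_key

      t.timestamps
    end
  end
end
```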

In order to use the model, it will need to be migrated. To do this enter “rake db:migrate” into the command line to commit the model.

Creating the views

Now that our model has been created and committed to the database, we can pass values from the view to the controller. We’re going to create a simple web form with the inputs to send to the DOTS Email Validation service. To do this, add the following code to the “new.html.erb” file.
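The original post shows this form as a screenshot; a minimal version along the lines described might look like the following ERB (the path helper and field names are assumptions):

```erb
<%# app/views/request/new.html.erb %>
<h1>DOTS Email Validation 3</h1>
<%= form_with url: request_show_path, method: :get, local: true do |f| %>
  <%= f.label :EmailAddress %>     <%= f.text_field :EmailAddress %><br>
  <%= f.label :AllowCorrections %> <%= f.text_field :AllowCorrections %><br>
  <%= f.label :Timeout %>          <%= f.text_field :Timeout %><br>
  <%= f.label :LicenseKey %>       <%= f.text_field :LicenseKey %><br>
  <%= f.submit "Validate" %>
<% end %>
```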

Now that our input page is set, we’ll make sure our “show.html.erb” file is ready to display some output values from the service. We will include a brief if statement that determines whether or not an error object is present in the response from the service. If one is present, we’ll display it; if not, we’ll display the valid results. Make your show.html.erb file look like the following.

These output fields haven’t been instantiated or set to any values yet but that will happen in our controller.

Integrating logic into the controller

Now that our views are set up, we’ll need some code in the controller to make the actual call to the Service Objects web service. To make this call, we’re going to use the gem “httparty”, which allows us to make a RESTful web service call. Be sure to add this gem to your project’s Gemfile and run “bundle install” from the command line.

We won’t go over all the elements of the controller file in this tutorial, but rather just touch on some important parts of the logic to be integrated.  The screenshot below highlights some important aspects of calling the web service.

The code above has a primary and a backup URL.  Currently, both point to the Service Objects trial environment. If a production key is purchased, the primary and backup URLs should be set to the production endpoints provided with your license key. These two URLs, along with the accompanying logic that checks for a fatal error (TypeCode 3 for DOTS EV3), will ensure that your application continues to run in the event that the primary Service Objects data center is offline or experiencing issues.  If you have any questions about the failover logic, please don’t hesitate to contact Service Objects and we will be glad to assist further.
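The heart of that failover check can be reduced to a small predicate (the JSON hash layout below is an assumption for illustration; TypeCode 3 signifying a fatal error comes from the service behavior described above):

```ruby
# Returns true when a parsed DOTS EV3 response contains a fatal error
# (TypeCode 3), signaling the caller to retry against the backup URL.
# The hash layout here is assumed for this sketch.
def fatal_error?(parsed)
  error = parsed.dig("ValidateEmailResponse", "Error")
  !error.nil? && error["TypeCode"].to_s == "3"
end
```

The controller calls the primary URL, runs this check on the parsed response, and repeats the request against the backup URL whenever it returns true (or whenever the request itself raises a network exception).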

The code above calls an internal function, “processresults,” to display the values to the user. Here is the accompanying screenshot illustrating some of the logic of this function.

Depending on what response comes back from the service, this code will display the values to the user.  For your application, you will likely want to create a class that will automatically serialize the response from the service into something a bit easier to use; but we are showing the above logic as an example of how to parse the values from the service. Note: the @ev3response, @ev3info and @everror are present to help parse the response hash that httparty automatically creates for the response.

Final Steps and Testing

Our project is ready to test out. For this example, we’ll use a known-good email address to get a valid response from the service.

Entering these values into the text boxes, along with a valid license key, should result in the following output.

That completes our Ruby on Rails tutorial.  If you have any questions about this or any other tutorials or sample code, please don’t hesitate to contact us! We would love to help you get up and running with Service Objects web services.

The Importance of Data Accuracy in Machine Learning

Imagine that someone calls your contact center – and before they even get to “Hello,” you know what they might be calling about, how frustrated they might be, and what additional products and services they might be interested in purchasing.

This is just one of the many promises of machine learning: a form of artificial intelligence (AI) that learns from the data itself, rather than from explicit programming. In the contact center example above, machine learning uses inputs ranging from CRM data to voice analysis to add predictive logic to your customer interactions. (One firm, in fact, cites call center sales efforts improving by over a third after implementing machine learning software.)

Machine learning applications nowadays range from image recognition to predictive analytics. One example of the latter happens every time you log into Facebook: by analyzing your interactions, it makes more intelligent choices about which of your hundreds of friends – and what sponsored content – ends up on your newsfeed. And a recent Forbes article predicts a wealth of new and specialized applications, including helping ships to avoid hitting whales, automating granting employee access credentials, and predicting who is at risk for hospital readmission – before they even leave the hospital the first time!

The common thread between most machine learning applications is deep learning, often fueled by high-speed cloud computing and big data. The data itself is the star of the process: for example, a computer can often learn to play games like an expert, without programming a strategy beforehand, by generating enough moves by trial-and-error to find patterns and create rules. This mimics the way the human brain itself often learns to process information, whether it is learning to walk around in a dark living room at night or finding something in the garage.

Since machine learning is fed by large amounts of data, its benefits can quickly fall apart when this data isn’t accurate. A humorous example of this was when a major department store chain decided (incorrectly) that CNBC host Carol Roth was pregnant – to the point where she was receiving samples of baby formula and other products – and Google targeted her as an older man. Multiply examples like this by the amount of bad data in many contact databases, and the principle of “garbage in, garbage out” can quickly lead to serious costs, particularly with larger datasets.

Putting some numbers to this issue, statistics from IT data quality firm Blazent show that while over two-thirds of senior-level IT staff intend to make use of machine learning, 60 percent lack confidence in the quality of their data – and 45 percent of their organizations simply react to data errors as they occur. This is not only costly, but in many cases totally unnecessary: with modern data quality management tools readily available, their absence is too often a matter of inertia or lack of ownership rather than ROI.

Truly unlocking the potential of machine learning will require a marriage between the promise of its applications and the practicalities of data quality. Like most marriages, this will involve good communication and clearly defined responsibilities, within a larger framework of good data governance. Done well, machine learning technology promises to represent another very important step in the process of leveraging your data as an asset.

The Role of a Data Steward

If you have ever dined at a *really* fine restaurant, it may have featured a wine steward: a person formally trained and certified to oversee every aspect of the restaurant’s wine collection. A sommelier, as they are known, not only tastes wines before serving them but sets policy for wine acquisition and its pairings with food, among other responsibilities. Training for this role may involve as much as a two-year college degree.

This is a good metaphor for a growing role in technology and business organizations – that of a data steward. Unlike a database administrator, who takes functional responsibility for repositories of data, a data steward has a broader role encompassing policies, procedures, and data quality. In a very real sense, a data steward is responsible for managing the overall value and long-term sustainability of an organization’s data assets.

According to Dataversity, the key role of a data steward is that they own an organization’s data. This links to the historical definition of a steward, from the Middle Ages – one who oversees the affairs of someone’s estate. This means that an effective data steward needs a broad background including areas like programming and database skills, data modeling and warehousing expertise, and above all good communications skills and business visibility. In larger organizations, Gartner sees this role as becoming increasingly formalized as a C-level position title, either as Chief Data Officer or incorporated as part of another C-level IT officer’s responsibilities.

One of the key advantages of having a formal data steward is that someone is accountable for your data quality. Too often, even in large organizations, this job falls to no one. Frequently, individual stakeholders are responsible for data entry or data usage, and strategically addressing bad data would only add to their workload. This is an example of the tragedy of the commons, where no one takes responsibility for the common good, and the organization ultimately incurs costs in time, missed marketing opportunities or poor customer relations by living with subpar data quality.

Another advantage of a data steward is that someone is tasked with evaluating and acquiring the right infrastructure for optimizing the value of your data. For example, automated tools exist that not only flag or correct contact data for accuracy, but enhance its value by appending publicly available information such as phone numbers or geographic locations. Or help control fraud and waste by screening your contact data per numerous criteria, and then assigning a quantitative lead score. Ironically, these tools are often inexpensive and make everyone’s life easier, but having a data steward can prevent a situation where implementing these tools is no one’s responsibility.

Looking at a formal role of data stewardship in your own organization is a sign that you take data seriously as an asset, and can start making smart moves to protect and expand its value. It helps you think strategically about your data, and teach everyone to be accountable for their role in it. This, in turn, can become the key to leveraging your organization’s data as a competitive advantage.

These New Sales Tax Laws Might Affect Your Business

Colorado is a very difficult state to accurately calculate sales tax in. We estimate there are thousands of tax rates that vary based on location. As with most states, sales tax rates in Colorado are a mix of state, county, city, and special district tax rates. Thus, someone in Denver, Colorado pays a different sales tax rate than someone in Boulder or Fort Collins. To further complicate the sales tax situation in Colorado, some communities also impose a use tax and/or a service fee — and a host of exemptions exist.

Plus, sales tax rates fluctuate all the time. Tax rate changes in several Colorado communities such as Dillon, Johnstown, Grand Lake, Steamboat Springs, and Grand County (to name just a few) will become effective January 1, 2017.

As if all of the above weren’t enough to make calculating sales taxes in Colorado challenging, the U.S. Supreme Court just upheld a controversial law in Colorado that pressures online retailers to collect sales tax in the state. New sales tax rates and the upholding of the so-called “Amazon tax” law might make sales taxes even more difficult for some retailers and/or consumers.

So, what is this Amazon tax and what does it mean to out-of-state retailers? The 2010 law gets its nickname from the online retailer because it compels out-of-state retailers such as online businesses to collect sales taxes on purchases from Colorado residents. Typically, these businesses are exempt from collecting sales taxes in states in which they do not have a sufficient physical presence or “nexus.”

For example, if you run an online business in California with a physical store and a warehouse in the state and sell goods to customers in California, you’d be required to collect sales tax from your Californian customers due to your physical presence. Meanwhile, you would not need to do the same for your customers in other states (unless you have a physical presence in those states). The Colorado law, however, complicates transactions for customers in Colorado.

According to an article in the Denver Post, Colorado’s sales tax law gives “…businesses a tough choice: either collect the sales tax or deal with more red tape, including additional paperwork and the requirement they remind Coloradans that they owe sales tax to the state.”

Colorado’s 2010 tax law was immediately challenged by the Direct Marketing Association (now the Data & Marketing Association), which filed a lawsuit. Now that the Supreme Court has upheld the law, some believe that other states may follow Colorado’s lead and impose their own “Amazon” taxes.

So, for now, Colorado is likely the most challenging sales tax state in the nation, but these challenges could spread to other states. You have a choice when selling to customers in Colorado: navigate the complexities of calculating sales tax in Colorado — a tough task made much easier with Service Objects’ FastTax real-time sales tax API — or comply with Colorado’s use tax notification requirements (for businesses with at least $100,000 in gross annual sales).

Peer-to-Peer: The Next Frontier

How do you get millennials interested in a cause?

For starters, you don’t use traditional direct marketing techniques. Millennials won’t even answer a call from an unknown phone number more than 95 percent of the time. Email is something their grandparents used to use, with conversion rates hovering around 1 percent. And many of them don’t sit in front of a television every night passively watching advertising – they live within a broad web of individual human connections, fueled by smartphones and social media.

These are the kinds of numbers that motivated Bay Area startup Hustle to create a new paradigm: large-scale peer-to-peer communications via text messaging.

The Hustle platform is an enabling technology that allows text messages to be sent rapidly to people’s phones, using automated templates that can be personalized for each message. While still requiring human intervention to send messages, it dramatically increases the productivity of organizations trying to reach large numbers of people for an event, cause or campaign – and these people can text back and get responses from a real human being. The result is often a response rate in the 30-40% range.

As a result, Hustle has now attracted substantial venture funding, and its product was used to reach nearly 4 million people during the latest election season. More important, the concept of mass communication between individuals is now attracting a great deal of attention.

Of course, peer-to-peer communications are much more than a marketing technique. They are quickly becoming a revolution. You can see it in action when you use Uber to get a ride from a private car owner, or AirBnB to rent someone’s house for a week. Uber owns no vehicles, and AirBnB owns no real estate, but both companies connect people to other people on a massive scale. And in the future, respected prognosticators like Daniel Burrus and Donald Tapscott predict the same paradigm will transform banking, voting, education and many other industries that fuel our daily life.

So how can you prepare for the peer-to-peer revolution? By having better access to these peers. When you are blasting text messages to thousands of people, these numbers need to be correct. Otherwise, you face unintended consequences ranging from intrusive spamming to wasted human effort. Moreover, as you move from organizing to marketing, any one-to-one contact model needs verification tools to assess the legitimacy of your contacts and prevent fraud and waste.

Thankfully, effective tools exist for verifying phone contact information. These tools include reverse lookup capabilities that can verify wireless and other numbers against US and Canadian databases, including geocoded carrier information and phone type. You can also detect numbers such as VoIP or prepaid phones for use in lead validation or fraud prevention. Taking things a step further, qualified phone numbers can have other contact information appended to them, and entered phone numbers can be contacted via phone or text for active verification by the customer.
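As a sketch of how such a lookup result might be used to gate leads: the field names below (`is_valid`, `line_type`) are an illustrative assumption, not any specific vendor's schema.

```python
HIGH_RISK_LINE_TYPES = {"voip", "prepaid"}

def assess_phone(lookup_result):
    """Return 'reject', 'review', or 'accept' based on lookup fields."""
    if not lookup_result.get("is_valid", False):
        return "reject"                     # bad number: drop the lead
    if lookup_result.get("line_type", "").lower() in HIGH_RISK_LINE_TYPES:
        return "review"                     # possible fraud: manual review
    return "accept"

print(assess_phone({"is_valid": True, "line_type": "wireless"}))  # accept
```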

The world is increasingly moving away from centralized market models to a distributed peer-to-peer marketplace. This means that now, more than ever, the data quality of both your contact database and your inbound contacts are emerging as key business drivers for the future. With a small incremental investment in maintaining this quality, you can be prepared to grow in an increasingly interconnected world.

What Is Data Onboarding – And Why Is It Important?

What is the best marketing database of all?

Statistically, it is your own customers. It has long been common wisdom that existing customers are much easier to sell to than new prospects – but what you may not know is how valuable this market is. According to the Online Marketing Institute, repeat customers represent over 40 percent of online revenue in the United States, while being much less price-sensitive and much less costly to market to. Moreover, they are often your strongest brand advocates.

So how do you tap into these customers in your online marketing? They didn’t share their eBay account or their Facebook page with you – just their contact information. But the science of data onboarding helps you turn your offline data into online data for marketing. And then you can do content channel or social media marketing to people who are not just like your customers, but are your customers.

According to Wikipedia, data onboarding is the process of transferring offline data to an online environment for marketing purposes. It generally involves taking this offline data, anonymizing it to protect individual privacy, and matching components of it to online data sources such as social media or content providers. Beyond monetizing customer information such as your CRM data, it has a host of other applications, including:

  • Look-alike marketing, where you grow your business by marketing to people who behave like your customers
  • Marketing channel assessment, where you determine whether your ads were seen and led to increased sales across multiple channels
  • Personalization, where you target your marketing content to specific customer attributes
  • Benchmarking against customer behavior, where you test the effectiveness of your marketing efforts against actual customer purchasing trends
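The anonymization step typically reduces each offline record to a privacy-preserving match key that online partners can join on. A minimal sketch, assuming SHA-256 hashing of normalized email addresses (a common convention, but not the only one):

```python
import hashlib

def match_key(email):
    """Normalize an email address and hash it into a match key."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same address always yields the same key, so two datasets can be
# joined on hashes without ever exchanging raw email addresses.
print(match_key("Jane.Doe@Example.com") == match_key(" jane.doe@example.com"))  # True
```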

This leads directly to the question of data quality. The promise of marketing technologies such as data onboarding pivots around having accurate, up-to-date and verified data. Bad data always has a cost in time and resources for your marketing efforts, but this problem is magnified with identity-based marketing: you lose control of who you are marketing to and risk delivering inappropriate or confusing brand messages. Worse, you lose the benefits of customizing your message to your target market.

This means that data validation tools that verify customer data, such as email addresses, help preserve and enhance the asset value of your offline databases. Moreover, you can predictively assess the value of marketing leads through cross-validating data such as name, street address, phone number, email address and IP address, getting a composite score that lets you identify promising or high-value customer data at the point-of-entry.

Online marketers have always had many options for targeting people, based on their demographics, activity, or many other criteria. Now your existing customer database is part of this mix as well. As your data becomes an increasingly valuable marketing asset, taking a proactive approach to data quality is a simple and cost-effective way to guard the value of this information to your marketing – and ultimately, your bottom line.

Protecting Yourself from High Risk IP Fraud

With the holiday season upon us, online sales surge as customers place orders with retailers. But not all orders, form submissions, and lead generation efforts are legitimate. Fraud identification systems range from simple to complex, with the latter using methods such as tracking user behavior and performing layered authentication. Most, if not all, fraud identification strategies incorporate one fundamental step: IP Validation.

IP Validation identifies the origin of an IP, which is crucial for assessing whether the IP is legitimate or High Risk. An IP is categorized as High Risk based on multiple factors, including whether it originates from a TOR network exit node, sits behind an anonymous/elite proxy, has been blacklisted for suspicious or spam activity, or originates in a country considered high risk for fraudulent activity.

Anonymous Proxies

A typical HTTP request includes header information that identifies the origin of the request so that a response can be returned to it. Requests that emanate from an anonymous proxy hide the origin IP and include only the proxy's IP. Anonymous proxies are available over either the SOCKS or HTTP protocol: HTTP proxies handle general HTTP/HTTPS requests (and FTP in some cases), while SOCKS proxies support any type of network protocol.

TOR Network

While a request issued from behind a proxy may be detectable based on header information, this is not the case with a request emanating from a TOR client. The TOR network routes requests through a series of participating nodes, anonymizing the origin of the request.

VPN Service

A VPN, or Virtual Private Network, offers another method for fraudsters to conceal their identity. A VPN service provides a secure tunnel for users to connect to another host machine and execute requests that appear to emanate from the VPN host. A VPN adds the additional security of encrypting traffic between the user and the VPN host.

IP Blacklist

IP reputation services and DNS-based blacklists track and monitor suspicious and spamming activity. Users who violate a website or domain owner's terms of service can have their IP blacklisted, blocking future activity from that IP. Website owners also check their own IPs to ensure their sites have not been used in spamming attacks or other suspicious activities that could restrict their ability to operate. Accepting messages from an IP that has been blacklisted should be considered high risk.
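Checking an IP against a DNS-based blacklist follows a simple convention: reverse the IPv4 octets and query them as a hostname under the blacklist's zone. A sketch of that lookup (the Spamhaus zone is one example; any DNSBL works the same way, subject to its usage terms):

```python
import socket

def dnsbl_query_name(ip, zone="zen.spamhaus.org"):
    """Build the reversed-octet DNSBL query name, e.g. 2.0.0.127.<zone>."""
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_blacklisted(ip, zone="zen.spamhaus.org"):
    """True if the DNSBL returns any answer, meaning the IP is listed."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        return False   # NXDOMAIN: not listed (or the lookup failed)

print(dnsbl_query_name("127.0.0.2"))  # 2.0.0.127.zen.spamhaus.org
```

Note that public DNSBLs rate-limit or block high-volume and cloud-resolver queries, so production use typically requires a data feed or a commercial reputation service.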


Botnets

A botnet is another method fraudsters can use to conceal their identity: a network of machines under the control of an attacker. Hackers frequently use botnets for large-scale attacks in which a high number of concurrent requests are issued to take down a system. Botnets can originate from any network-connected device, as evidenced by a recent attack on a major DNS provider that was executed by a network of connected home devices.

How to Protect Yourself

With all of these methods of concealing identity available to fraudsters, the task of thwarting would-be thieves from disrupting your systems becomes much larger. Thankfully, DOTS IP Validation encompasses many of the identification strategies necessary to make split-second decisions about would-be attackers before any harm is done. From IP origin to proxy and TOR node detection, DOTS IP Validation has you covered.

Real-Time Email Validation and Your Sales Process

Have you ever been to gmial.com? Or gamil.com? Well, many of your prospects and customers have, without even knowing it. These are just a few of the misspellings of “Gmail” alone that pop up regularly when people enter their email addresses on your squeeze pages and signup forms. In fact, according to one direct marketer, roughly three percent of their leads provided addresses that bounced. (Believe it or not, many people don’t even spell “.com” correctly!)

Unfortunately, losses like these can be just the tip of the iceberg. When you follow your human nature and ask potential leads to try and validate their own addresses by re-typing them – or worse, ask them to respond to a validation email – many people will simply throw up their hands and not bother, with no way of tracking these losses. According to Lucidchart’s Derrick Isaacson, the more bandwidth you add to your signup process, the less likely someone is to complete it. And the one lead you can never sell to is the one who doesn’t respond in the first place.

Then there are people who intentionally try to game the system. For example, you are offering a free gift to potential qualified prospects, and someone wants to get the goodie without receiving the sales pitch. So they enter a bogus address directed to nowhere, or perhaps to Spongebob Squarepants. Or worse, your next customer transaction is a scam artist trying to defraud your company.

Is there any way around this lose-lose scenario? Yes. And it is simpler and less expensive than you might think – particularly when held up against the cost of lost leads, data errors and fraud. The answer is real-time email validation. By using an API that plugs right into your email data entry process on the Web, you create a smoother experience for customers and prospects while gaining several built-in benefits:

Accurate address verification: A real-time email verification service can leverage numerous criteria to ensure the validity of a specific address. For example, Service Objects’ email validation API performs over 50 specific verification tests to determine email address authenticity, accuracy, and deliverability.

Auto-correction: The right interface not only catches typical spelling and syntax errors but can also suggest a corrected address.
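As a simple sketch of the auto-correction idea, a fuzzy match against a list of known domains can catch common typos. The domain list and similarity cutoff here are illustrative assumptions, not a substitute for a full validation service:

```python
import difflib

KNOWN_DOMAINS = ["gmail.com", "yahoo.com", "outlook.com", "hotmail.com"]

def suggest_domain(email):
    """Suggest a corrected address when the domain looks like a typo."""
    local, _, domain = email.partition("@")
    if domain in KNOWN_DOMAINS:
        return email                 # already a known domain
    matches = difflib.get_close_matches(domain, KNOWN_DOMAINS, n=1, cutoff=0.8)
    return f"{local}@{matches[0]}" if matches else email

print(suggest_domain("pat@gmial.com"))  # pat@gmail.com
```

A production service would present the suggestion to the user for confirmation rather than silently rewriting the address.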

Improved lead quality: The very best tools not only check email address validity but can calculate a composite quality score based on its assessment criteria, which in turn lets you accept or reject a specific address.

Less human intervention: The cost of processing an incorrect or fraudulent email address goes far beyond lost sales or revenue. The time you spend pursuing unattainable leads and processing bad data in your sales process add up to a real, tangible human cost that affects your profit margin.

Blacklist protection: Automated email validation protects your mail servers from being blacklisted by verifying authentic email addresses while filtering out spammers, vulgar or bogus email addresses, and erroneous data.

Real-world numbers bear out the value of using automated email validation. For example, Lucidchart’s Isaacson noted that an A/B test showed a 34% increase in product re-use and a 44% increase in paid customers among the automated validation group. On top of sales results like these, you can also add in the cost savings from reduced database maintenance, manual processing, and fraud when you deploy these tools across each of your prospect and customer touch points.

We now live in an e-commerce world that competes on making the prospect and customer’s experience as easy as possible. Automated email validation helps you compete better by reducing their bandwidth and your costs at the same time. It is a win-win situation for everyone, as well as your bottom line.

Service Not Available: USPS Mail Delivery is More Limited Than You May Think

Residents in many smaller towns and rural areas in the United States do not receive residential mail delivery from the USPS. They’re living off the grid, the postal grid, that is.

In communities such as Davidson, NC, Carmel-by-the-Sea, CA, and Jackson Hole, WY, USPS does not deliver mail to home addresses. Shippers like FedEx and UPS usually deliver packages to people's doors, but the USPS does not.

In some cases, like in Davidson and Carmel-by-the-Sea, the decision against mail delivery was made locally to encourage community building. With everyone checking their mail at the local post office, they’d have to interact with each other. Carmel-by-the-Sea has been doing this for more than 100 years. According to the city’s Chamber of Commerce, a newcomer attempted to get residential mail delivery service about 10 years ago and was promptly labeled an “agitator.”

In Davidson, a delivery route is available to disabled residents. Mail delivery routes are also available to homeowners associations and retirement communities outside of the original town’s boundaries, but only to clustered mailboxes.

In other cases, cities and towns lack delivery routes because of USPS policies. For example, in Jackson, WY and Mammoth Lakes, CA, regular and heavy snowfall make it virtually impossible to deliver mail.

Rural mail delivery frequently involves a central set of mailboxes located along a rural path. These mailboxes tend to cover a large area; thus, they are rarely close to residents’ actual homes.

In 2013, the USPS changed one of its residential delivery policies. For all newly established addresses, instead of delivering to the door or a curbside mailbox, mail will only be delivered to central mailbox clusters. These can be located far from a person’s actual residence. These clusters also make package delivery tricky since the mailbox address is different from the property address.

Here at Service Objects, we are constantly looking for new addresses that don’t fit nicely into a simple box. If the USPS doesn’t deliver to a given address, it doesn’t mean that the address isn’t real or valid for FedEx and other shipping services. Our competitors often ignore tricky outlier addresses such as those that technically don’t exist, at least according to the USPS.

We understand that a lot of people are living off the postal grid, either by choice or by USPS policy. We excel at finding these addresses and dealing with their intricacies. These are the types of challenges that push us to improve our address validation software for even better accuracy.

How to Hack Character Limitations in Shipping Address Fields

If you are using an Address Validation service for shipping labels, then you may occasionally run into character limitations with the Address1 field. Whether you are using UPS, FedEx or another shipping solution, most character limits tend to range between 30 and 35 characters. While most addresses fall under this limit, there are always outliers that your business solution should be ready to handle.

If you are using a DOTS Address Validation solution, you are in luck! The response from our API allows you to customize address lines to your heart’s content. Whether you are looking to have your address lines be under a certain limit, place apartment or unit information on a separate line, or customize the address line some other way, we will show you how to integrate the Address Validation response from a Service Objects API into your business logic.

Here is a brief example using our DOTS Address Validation 3 solution:

DOTS Address Validation 3 US returns the individual address fragments (house number, pre- and post-directionals, street name, suffix, unit, and private mailbox information) in a typical valid response.

If you are worried about exceeding a certain character limit, first, you can programmatically check the Address1 line to see if it exceeds that particular limit. If it does, then your application can go about splitting up the address in the way that would be the best for your particular application.

For example, let’s say you have a long address line like the following:

12345 W Fake Industrial St NE STE 130, #678

This is obviously a fake street, but it will help us show the different ways to handle long address lines. This address ends up being around 45 characters long, including spaces. The service would return the following fragments for this address:

FragmentHouse: 12345
FragmentPreDir: W
FragmentStreet: Fake Industrial
FragmentSuffix: St
FragmentPostDir: NE
FragmentUnit: STE
Fragment: 130
FragmentPMBPrefix: #
FragmentPMBNumber: 678

For this particular example, a solution to reduce the character limits would be to move the Suite and Mail Box information to a separate address line so it would appear like so:

12345 W Fake Industrial St NE
STE 130, #678

You may need to fine-tune the logic in your business application, but this basic algorithm can help you get started with tailoring your validated address information to different character limitations.
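Putting the pieces together, a minimal sketch of this splitting logic might look like the following, using the fragment fields from the example response above (the 30-character limit is an assumption for illustration):

```python
MAX_LEN = 30  # assumed carrier limit for the Address1 field

def split_address(fragments, max_len=MAX_LEN):
    """Return (line1, line2); line2 holds unit/mailbox info if needed."""
    street = " ".join(filter(None, [
        fragments.get("FragmentHouse"), fragments.get("FragmentPreDir"),
        fragments.get("FragmentStreet"), fragments.get("FragmentSuffix"),
        fragments.get("FragmentPostDir"),
    ]))
    unit = " ".join(filter(None, [
        fragments.get("FragmentUnit"), fragments.get("Fragment"),
    ]))
    pmb = (fragments.get("FragmentPMBPrefix") or "") + \
          (fragments.get("FragmentPMBNumber") or "")
    second = ", ".join(filter(None, [unit, pmb]))
    full = " ".join(filter(None, [street, second]))
    if len(full) <= max_len:
        return full, ""              # short enough for one line
    return street, second            # move unit/mailbox to line two

fragments = {
    "FragmentHouse": "12345", "FragmentPreDir": "W",
    "FragmentStreet": "Fake Industrial", "FragmentSuffix": "St",
    "FragmentPostDir": "NE", "FragmentUnit": "STE", "Fragment": "130",
    "FragmentPMBPrefix": "#", "FragmentPMBNumber": "678",
}
line1, line2 = split_address(fragments)
print(line1)  # 12345 W Fake Industrial St NE
print(line2)  # STE 130, #678
```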

If you have any questions about integrating Address Validation into your particular application, contact our support team and we will gladly provide any support that we can!

The Importance of Data Quality for CRM

What are your customers telling you about your business?

This question has always been the key argument for customer relationship management, or CRM for short. Capturing data about your customers can tell you how many people eat steak at your restaurant on Thursdays, or who buys polo shirts at your clothing store. It provides visibility about who is calling with customer issues, so you can improve your products and service delivery. And in an increasingly interconnected world, related tools such as identity graphs can now track customer behaviors across different vendors and channels – for example, who bought a product, with what credit card, after seeing it on a specific social media platform.

Perhaps most importantly, CRM does what its name implies: it helps you understand and manage the relationship between you and your customers. Good CRM also benefits the customer as well as your business. Done correctly, it represents a single view of the customer across all departments in the organization, to build a cohesive experience for these customers. Knowing who your customers are is strategic and personal at the same time, and its impact ranges from remembering their birthdays to driving customer growth and retention.

So how important is CRM data quality? Bad data isn’t just an annoyance – it is a real, make-or-break cost for many companies. According to a Gartner survey, users of one major CRM system disclosed that poor data quality cost their companies over US $8 million per year on average, with some respondents citing costs of over $100 million annually! These costs range from the time and money spent catching and managing these mistakes all the way to losing customers through poor service or missed opportunities. For companies of all sizes, inaccurate or outdated CRM data ranges from 10% to 40% of their data per year.

Where does bad CRM data come from? A number of sources. For example:

  • Fraudulently entered data: for example, customers who enter “Donald Duck” or key in a phony phone number to get a customer perk or avoid customer registration
  • Errors at the data entry level
  • Duplicate information
  • The natural moves and changes that take place in business every year
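Duplicates and entry-level errors can often be caught early by normalizing key fields before comparison. A minimal sketch of the idea (real deduplication pipelines add fuzzier matching on top of this):

```python
import re

def normalize(record):
    """Normalize name and phone so trivially different duplicates match."""
    name = re.sub(r"\s+", " ", record["name"].strip().lower())
    phone = re.sub(r"\D", "", record["phone"])   # keep digits only
    return (name, phone)

a = {"name": "Jane  Doe ", "phone": "(805) 963-1700"}
b = {"name": "jane doe",   "phone": "805-963-1700"}
print(normalize(a) == normalize(b))  # True
```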

Whatever the sources of bad data, simply waiting and letting it accumulate can quickly degrade the value of your CRM database, along with concomitant costs in human intervention as a result of invalid or incorrect customer records. And without a database management plan in place, and specific stakeholders taking ownership of it, the economic value of this data will continue to degrade over time.

While ensuring data accuracy is important, it is also one of the least favorite tasks for busy people – particularly for information such as CRM data. This is where companies like Service Objects come in: our focus is on automated validation and verification tools that can run through an integrated API, a batch process or a web-based lookup. These tools range from simple address and phone verification all the way to lead and order validation, a sophisticated multi-factor process that ranks a customer’s contact information with a validity score from zero to 100. All of these tools validate and cross-verify a contact’s name, location, phone, email address, and device against hundreds of authoritative data sources.

CRM data truly represents the voice of your customer, and it can serve as a valuable asset for strategic planning, sales growth, service quality, and everything in between. Using the right tools, you can painlessly make sure that this data asset maintains its economic value, now and in the future. In the process, you can leverage technology to get closer to your customers than ever.

3 Things to Consider When Signing a Cloud Computing Contract

Cloud computing entails a paradigm shift from in-house processing and storage of data to a model where data travels over the Internet to and from one or more externally located and managed data centers.

It is typically recommended that a Cloud Computing Contract:

  • Codifies the specific parameters and minimum levels required for each element of the service you are signing up for, as well as remedies for failure to meet those requirements.
  • Affirms your institution’s ownership of its data stored on the service provider’s system, and specifies your rights to get it back.
  • Details the system infrastructure and security standards to be maintained by the service provider, along with your rights to audit their compliance.
  • Specifies your rights and cost to continue and discontinue using the service.

In addition to the basic elements of the Contract listed above, here are three important points to consider before signing your Cloud Computing Contract.

1. Infrastructure & security

The virtual nature of cloud computing makes it easy to forget that the service is dependent upon a physical data center. All cloud computing vendors are not created equal. You should verify the specific infrastructure and security obligations and practices (business continuity, encryption, firewalls, physical security, etc.) that a vendor claims to have in place and codify them in the contract.

2. Disaster recovery & business continuity

To protect your institution, the contract should state the provider’s minimum disaster recovery and business continuity mechanisms, processes, and responsibilities to provide the ongoing level of uninterrupted service required.

3. Data processing & storage

  • Ownership of data: Since an institution’s data will reside on a cloud computing company’s infrastructure, it is important that the contract clearly affirm the institution’s ownership of that data.
  • Disposition of data: To avoid vendor lock-in, it is important for an institution to know in advance how it will switch to a different solution once the relationship with the existing cloud computing service provider ends.
  • Data breaches: The contract should cover the cloud service provider’s obligations in the event that the institution’s data is accessed inappropriately. The repercussions of such a data breach vary according to the type of data, so know what type of data you’ll be storing in the cloud before negotiating this clause. Of equal importance to the breach notification process, the service provider should be contractually obligated to provide indemnification should the institution’s data be accessed inappropriately.
  • Location of data: A variety of legal issues can arise if an institution’s data resides in a cloud computing provider’s data center in another country. Different countries, and in some cases even different states, have different laws pertaining to data. One of the key questions with cloud computing is: which law applies to my institution’s data, the law where I’m located or the law where my data is located?
  • Legal/Government requests for access to data: The contract should specify the cloud provider’s obligations to an institution should any of the institution’s data become the subject of a subpoena or other legal or governmental request for access.

The Cloud Computing Contract is for the benefit of both the consumer and the provider. While it can be highly technical and detailed, the Contract will ultimately establish the partnership between the parties, and following these steps should help mitigate any potential problems.

How To Have a Happy, and More Profitable, Holiday Season

The holidays are approaching. For many merchants, this is the busiest and most profitable time of year, especially if you sell high-ticket items that are popular as gifts. Unfortunately, the holidays are also the high season for incorrect and fraudulent orders.

First of all, both your staff and your customers are human – and it is easy for both to be more human than ever over the holidays, because of high transaction volumes, rush orders, and the stress of the season. A bungled address or a credit card problem can have effects ranging from time and human intervention to the possible loss of merchandise. And your valuable customer service reputation can also take a hit from order and delivery errors.

Merchants are also often targeted for intentional fraud over the holidays. Did you know that stolen credit card numbers sell on the black market “dark web” for as little as $5 each? And that cards with what hackers call “full info,” including name, address, expiration date, verification info and CVV can be had worldwide for $30-40 apiece? It is a small price to pay for someone who uses that card with a fraudulent name and mailing address – and a large cost to you when they use it to order expensive merchandise from your business.

The cost of fraud to unsuspecting businesses can sink your profit margins. Small businesses, in particular, can be on the hook for merchandise ordered by thieves who never intend to pay, and this kind of fraud costs US businesses $14 billion per year. Online merchants have become particularly vulnerable, with a 9-12% year-over-year increase in fraud levels as of 2016. Beyond chargebacks for stolen or fraudulent credit cards, there are costs in time and manpower involved – not to mention the aggravation of unexpected losses.

Thankfully, you can mitigate a great deal of this risk with a little planning. Here are some tips for putting a little more holiday cheer back into your busiest season:

Look for the red flags

Fraudsters often have a common modus operandi. Look for orders where the “Bill to” and “Ship to” addresses are different, unusually large orders from unknown sources, or international orders, particularly from developing countries. And particularly around the holidays, pay attention to last minute high-ticket orders with rush shipment, where merchandise can fall into the wrong hands before the fraud is discovered.
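These heuristics can be sketched as a simple rule-based check. The thresholds and the high-risk country list below are illustrative assumptions; flagged orders are candidates for manual review, not automatic rejection:

```python
HIGH_RISK_COUNTRIES = {"XX", "YY"}   # placeholder codes, not a real list
LARGE_ORDER = 1000                   # assumed review threshold in dollars

def red_flags(order):
    """Collect the red flags an order trips; an empty list means none."""
    flags = []
    if order["bill_to"] != order["ship_to"]:
        flags.append("bill/ship mismatch")
    if order["total"] > LARGE_ORDER and order.get("new_customer", True):
        flags.append("large order from unknown source")
    if order.get("country") in HIGH_RISK_COUNTRIES:
        flags.append("high-risk country")
    if order.get("rush_shipping") and order["total"] > LARGE_ORDER:
        flags.append("last-minute high-ticket rush")
    return flags

order = {"bill_to": "123 A St", "ship_to": "456 B Ave", "total": 2500,
         "new_customer": True, "country": "US", "rush_shipping": True}
print(red_flags(order))
```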

Have a policy

What procedures do you follow when you receive a questionable order? What procedures do you follow to make sure that orders are valid or accurate? Are there circumstances where it would be prudent to delay shipment or decline an order? Learn the norms for vendors in your business, and teach all of your employees to follow them.

Validate your orders

For most businesses, one of the most reliable solutions is to use an inexpensive online service to verify the authenticity of a customer. A validation service can compare your order information against existing databases to quickly answer critical questions like whether an address is valid, a name or its corresponding credit information is legitimate, whether the IP address of an online order matches the address you are shipping to, and much more.

One solution for this is Service Objects’ DOTS Order ValidationSM, a real-time API that verifies, standardizes and authenticates customer order information. It performs 200 proprietary tests including address, BIN, email and IP validation, giving you an overall order score of 0-100 to flag suspicious orders. Check it out today and make sure your business avoids the perils of fraudulent orders this holiday season.

1 Danielson, Tess, “Here’s exactly how much your stolen credit card info is worth to hackers,” Business Insider, Nov. 30, 2015.

2 Heggestuen, John, “The US Sees More Money Lost to Credit Card Fraud than the Rest of the World Combined,” Business Insider, Mar. 5, 2014.

3 LexisNexis, 2016 LexisNexis® True Cost of FraudSM Study

8 Tips to Build a Successful Service Level Agreement

A Service Level Agreement (SLA) makes use of the knowledge of enterprise capacity demands, peak periods, and standard usage baselines to compose the enforceable and measurable outsourcing agreement between vendor and client. As such, an effective SLA will reflect goals for greater performance and capacity, productivity, flexibility, availability, and standardization.

The SLA should set the stage for meeting or surpassing business and technology service levels while identifying any gaps currently being experienced in the achievement of service levels.

SLAs capture the business objectives and define how success will be measured, and are ideally structured to evolve with the customer’s foreseeable needs. The right approach to an SLA results in agreements distinguished by clear, simple language, a tight focus on business objectives, and consideration of the dynamic nature of the business, ensuring that evolving needs will be met.

1. Both the Client and Vendor Must Structure the SLA

Structuring an SLA is an important, multiple-step process involving both the client and the vendor. In order to successfully meet business objectives, SLA best practices dictate that the vendor and client collaborate to conduct a detailed assessment of the client’s existing applications suite, new IT initiatives, internal processes, and currently delivered baseline service levels.


2. Analyze Technical Goals & Constraints

The best way to start analyzing technical goals and constraints is to brainstorm or research technical goals and requirements. Technical goals include availability levels, throughput, jitter, delay, response time, scalability requirements, new feature introductions, new application introductions, security, manageability, and even cost. Then prioritize these goals, or lower expectations to levels that still meet business requirements.

For example, you might have an availability level of 99.999% or 5 minutes of downtime per year. There are numerous constraints to achieving this goal, such as single points of failure in hardware, mean time to repair (MTTR), broken hardware in remote locations, carrier reliability, proactive fault detection capabilities, high change rates, and current network capacity limitations. As a result, you may adjust the goal to a more achievable level.
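The arithmetic behind availability targets like this is straightforward: allowed downtime equals (1 − availability) multiplied by the length of the period. A quick sketch:

```python
# Convert an availability target into allowed downtime per year.
MINUTES_PER_YEAR = 365.25 * 24 * 60  # ≈ 525,960 minutes

def downtime_minutes_per_year(availability: float) -> float:
    """Minutes of downtime per year permitted by an availability fraction."""
    return (1.0 - availability) * MINUTES_PER_YEAR

# 99.999% ("five nines") allows roughly 5.26 minutes of downtime per year;
# 99.9% ("three nines") allows roughly 526 minutes, i.e. almost 9 hours.
```

Running the numbers this way makes it easy to sanity-check whether a proposed target is realistic given your constraints.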

3. Determine the Availability Budget

An availability budget is the expected theoretical availability of the network between two defined points. Accurate theoretical information is useful in several ways, including:

  • The organization can use this as a goal for internal availability and deviations can be quickly defined and remedied.
  • The information can be used by network planners in determining the availability of the system to help ensure the design will meet business requirements.

Factors that contribute to non-availability or outage time include hardware failure, software failure, power and environmental issues, link or carrier failure, network design, human error, or lack of process. You should closely evaluate each of these parameters when evaluating the overall availability budget for the network.
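As a rough model for combining these factors, when components sit in series along a network path (hardware, carrier link, power), the end-to-end availability is the product of the individual availabilities. This sketch assumes independent failures, which is a simplification:

```python
from math import prod

def end_to_end_availability(component_availabilities: list) -> float:
    """Theoretical availability of components in series, assuming
    independent failure modes (a simplifying assumption)."""
    return prod(component_availabilities)

# Example: hardware at 99.99%, carrier link at 99.9%, power at 99.95%
# combine to roughly 99.84% end to end.
overall = end_to_end_availability([0.9999, 0.999, 0.9995])
```

Note how the weakest component dominates: the 99.9% link alone already rules out a five-nines budget for the path as a whole.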

4. Application Profiles

Application profiles help the networking organization understand and define network service level requirements for individual applications. This helps to ensure that the network supports individual application requirements and network services overall.

Business applications may include e-mail, file transfer, Web browsing, medical imaging, or manufacturing. System applications may include software distribution, user authentication, network backup, and network management.

The goal of the application profile is to understand business requirements for the application, business criticality, and network requirements such as bandwidth, delay, and jitter. In addition, the networking organization should understand the impact of network downtime.

5. Availability and Performance Standards

Availability and performance standards set the service expectations for the organization. These may be defined for different areas of the network or for specific applications. Performance may also be defined in terms of round-trip delay, jitter, maximum throughput, bandwidth commitments, and overall scalability. In addition to setting service expectations, the organization should take care to define each service standard so that user and IT groups working with networking fully understand it and how it relates to their application or server administration requirements.

6. Metrics and Monitoring

Service level definitions by themselves are worthless unless the organization collects metrics and monitors success. Measuring the service level determines whether the organization is meeting objectives, and also identifies the root cause of availability or performance issues.

7. Customer Business Needs and Goals

Try to understand the cost of downtime for the customer’s service, estimating it in terms of lost productivity, revenue, and customer goodwill. The SLA developer should also understand the business goals and growth of the organization in order to accommodate network upgrades, workload, and budgeting.

8. Performance Indicator Metrics

Metrics are simply tools that allow network managers to manage service level consistency and to make improvements according to business requirements. Unfortunately, many organizations do not collect availability, performance, and other metrics, citing concerns about measurement accuracy, cost, network overhead, and available resources. These factors can affect the ability to measure service levels, but the organization should stay focused on the overall goal of managing and improving service levels.

In summary, service level management allows an organization to move from a reactive support model to a proactive support model where network availability and performance levels are determined by business requirements, not by the latest set of problems. The process helps create an environment of continuous service level improvement and increased business competitiveness.

Why Your Business Should Pay Attention to CASL

Many companies are worried about Canada’s Anti-Spam Legislation (CASL). A new rule goes into effect next July, and the penalties are harsh: if you email Canadians who haven’t opted in, you could be on the hook for a lawsuit for sending CEMs without permission, with penalties reaching up to $10 million.

So, what is CASL? What are CEMs? And how can you comply?

Understanding CASL

CASL first went into effect on July 1, 2014. Section 6 of the Act covers its core requirements and provisions. Several provisions were phased in over time, including the “private right of action” rule, which goes into effect July 1, 2017.

CASL applies to all electronic messages, such as emails and text messages, that are sent in relation to commercial activities. These messages are known as CEMs, or “commercial electronic messages”. Commercial electronic messages must be sent to an address, such as an email address or mobile phone number, in order to be subject to the terms of CASL. Thus, commercial blog posts or webpages are not considered CEMs.

CASL requires obtaining express consent, either in writing (electronic written consent is permitted) or orally, before sending CEMs. There are a few instances where implied consent is allowed, such as for existing business and non-business relationships or voluntary disclosure without indicating that the person does not want to receive messages.

If you send CEMs to people in Canada without prior consent, you could face serious consequences. Starting next July 1st, individuals and organizations can bring civil actions seeking redress in court from anyone in violation of CASL. The Canadian Radio-television and Telecommunications Commission can impose up to $10 million in penalties for the most serious violations.

US companies are not only concerned about complying with this particular section of CASL; their legal departments don’t want to take chances. Thus, marketing departments are being told not to email anyone on the chance that a handful of contacts might be located in Canada — and it only takes one.

What does this mean to marketing and sales departments? They’re legitimately concerned that new leads will be cut off and wonder how they’ll be able to make up for such a shortfall.

But there are some important exceptions to Section 6. Using email validation tools such as DOTS Email Validation can be your key to keeping email – and the pipeline of leads – flowing.

Avoiding Running Afoul of CASL

First, it’s important to understand what section 6 of CASL applies to and what it doesn’t apply to.

Section 6 of CASL deals with CEMs sent to electronic addresses. Under the Act:

  • Canadian enforcement against spammers operating in Canada is allowed.
  • The Canadian Government is allowed to share information with other state governments that have substantially similar legislation (like the United States’ CAN-SPAM act) if the information is relevant to an investigation or proceeding involving similar prohibited conduct.

Section 6 of the Act does not apply to CEMs under some circumstances:

  • The person sending, causing, or permitting the CEM to be sent (the sender) must reasonably believe that it will be accessed in a foreign state listed in Schedule 1.
  • The CEM must be sent in compliance with the foreign law, which addresses conduct that is substantially similar to the conduct prohibited in section 6 of CASL.

In other words, CASL excludes emails if you’re sending them to someone you are reasonably sure lives in a foreign country that has its own spam laws and you are in compliance with those.

How to Continue Marketing Your Business After July 1st

You can’t blame your legal department for wanting to avoid lawsuits; it’s in your company’s best interest to comply with all applicable laws. However, the answer isn’t to shut down email marketing completely; it’s to become reasonably sure where your recipients live before sending CEMs.

Service Objects’ DOTS Email Validation API can help you be reasonably sure where someone lives and which laws might apply. For example, the laws of the country where the person is located may be more liberal than Canada’s and would apply instead of CASL. The vast majority of nations (115 other countries, including all of Europe, Australia, Japan, South Korea, China, Brazil and Russia) do have their own laws, such as the United States’ CAN-SPAM Act or Canada’s CASL.

By using DOTS Email Validation software, you may be able to create an email marketing list that is safer to send to and will satisfy your legal department.

Sources:

  • “Does section 6 of CASL apply to messages sent outside of Canada?” – SCHEDULE (Paragraph 3(f)), LIST OF FOREIGN STATES
  • Canada’s Law on Spam and Other Electronic Threats (CASL) – full copy of the law passed in December 2010

How Lead Validation Works

As a marketer, one of your jobs is to ensure that your sales team has access to high-quality leads. So how do you go about screening your lead lists for accuracy? Believe it or not, most marketers are doing very little to measure the quality of their leads, leaning heavily on the lead scoring tools in their marketing automation platforms to help screen leads. But this is not enough, and is only one small part of the bigger lead validation picture. In fact, according to a study conducted by Straight North, about half of the leads generated by your marketing campaigns are not actual sales leads.

In our previous blog, ‘Custom Lead Scoring and How it Works’, we touch on the perils of lead scoring. So what can you do to ensure you’re handing off high-quality leads to your sales team? Service Objects’ DOTS Lead Validation solution is a good place to begin.

How does it work?

Now, you’re probably wondering: how does Service Objects “know” whether a lead is a legitimate person or a viable contact? Without getting too technical or giving away trade secrets, I can tell you that we use a combination of our best-of-breed data quality tools to look at key elements of a contact record. By examining a combination of the contact record’s data points (name, email, address, IP address, device, etc.), we are able to derive an overall lead certainty and quality score for each contact record. This is what DOTS Lead Validation does.

Once you have these lead quality scores, you can create specific rules and actions based on the scores. To give you some real-world scenarios, let’s run through a couple quick examples showcasing how you can use Lead Quality scores to develop rules-based actions that will streamline your marketing and sales processes:

Chip’s call center

After implementing DOTS Lead Validation API into their lead capture process, “Chip’s Call Center” sets up the following rules for their incoming, real-time leads, making it more efficient for their sales team to prioritize and respond to their potential customers.

If the lead’s quality score is between:

  • 75–100: the lead is systematically moved to the top of the call center’s call list and assigned to the best-performing sales reps.
  • 60–74: the lead is given secondary priority on the call list and sent to second-tier sales reps.
  • 50–59: the lead is sent an email asking for further confirmation/interaction.
  • 25–49: the lead is placed on an email drip campaign (cost effective).
  • 0–24: these leads are largely considered time-wasters and are ignored or placed on a monthly newsletter with low conversion expectations.
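Chip’s tiers boil down to a simple score-to-action lookup. Here’s a minimal sketch; the function name and the short action labels are illustrative shorthand for the routing rules above:

```python
# Illustrative routing of incoming leads by quality score, mirroring the
# threshold tiers described above (labels are shorthand for the actions).
def route_lead(score: int) -> str:
    if score >= 75:
        return "top of call list, best reps"
    if score >= 60:
        return "secondary priority, second-tier reps"
    if score >= 50:
        return "confirmation email"
    if score >= 25:
        return "email drip campaign"
    return "monthly newsletter"
```

Keeping the thresholds in one place like this makes it easy to tune the tiers later as conversion data comes in.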

Martha’s marketing agency

Martha is driving traffic from multiple channels and audiences within those channels, and needs to decide the best way to allocate her monthly budget. The sales cycle for her company is 9–12 months, so she has little concrete data on which to base her decisions.

Most of Martha’s leads are delivered from external sources like AdWords, LinkedIn and Facebook. With Lead Validation, Martha can easily determine the lead quality scores for these channels. Here’s how that score report might look:

  • AdWords – Campaign #1, average Lead Quality score of 78
  • AdWords – Campaign #2, average score of 36
  • LinkedIn, CEO audience – 90
  • LinkedIn, IT/Developer audience – 54
  • Organic, average score of 87
Based on these average scores, Martha decides to place the majority of her marketing budget in the top scoring channels.

Heather’s house of iPhone accessories

Heather wants to buy a highly-targeted email list to drive traffic to her online store. In the past, she has purchased lists that did not perform, and needs a reliable way to pre-determine the quality of these lists before buying them. As part of her negotiation with the list broker, she asks them: “Please bounce your email list against Service Objects’ Lead Validation service and let me know the average score. I only want to buy leads that score a 50 or higher.”

The point I’m highlighting here is that by investing in a lead validation service to determine lead quality scores, your company can create whatever strategies, rules, and actions best fit YOUR experiences and needs. Even more importantly, you can preset sales quotas and expectations around the different lead quality scores.

The power of lead validation

DOTS Lead Validation is the first step in recouping the inevitable 50% of incoming sales leads that are unusable. It is a great resource to increase the ROI of your marketing campaigns, save your company unnecessary costs, expand marketing and sales efficiencies, and elevate overall company morale. The benefits of incorporating a lead validation system will help build a successful, sustainable business, and we would love to help you get started!

Data Governance and You

Data governance has become another trendy buzz-phrase among information technology professionals. Twenty years ago, it was a rarely heard term. Nowadays, there are professional societies, best practices, and annual professional conferences built around it. But what does it mean for you and your business?

According to Wikipedia, data governance “encompasses the people, processes, and information technology required to create a consistent and proper handling of an organization’s data across the business enterprise.” Properly framed, it involves data quality monitoring strategies, protocols for corrective action, and responsible stakeholders.

Put another way, data governance represents a recent framework for codifying something that has been important for businesses for many years – the quality of the data that drives their operations. This includes marketing leads, orders, customer information, and much more. It recognizes that bad data is not only a cost and service quality issue but something that should be understood and managed at a corporate level.

According to the Center for Innovative Technology, best practices for data governance start with organizational structure, from which specific policies, procedures, and metrics emerge. They recommend a formal data governance committee, reporting to executive management and overseeing the activities of working groups and specific data contributors. The Data Governance Institute’s Data Governance Framework describes this in terms of having a specified Data Governance Office operating between data stakeholders and the actual stewards of this data.

This eventually leads to specific data management tasks such as removing duplicates, validating and improving existing data with data quality tools, performing regular data quality maintenance, and tracking ROI. It has frankly been in the growth and development of such tools that the historical need for data quality has evolved into the profession of data governance. This, in turn, has helped improve data quality for marketing, sales, customer and other data – with immediate, tangible benefits in reducing errors and fraud, along with intangibles such as a strong service brand and satisfied customers.

Finally, here is a closing thought about data governance from the Data Governance Institute, on what they consider to be its most overlooked aspect: “Communication skills of those staff who sit at ground zero for data-related concerns and decisions. They need to be able to articulate many stakeholders’ needs and concerns and to describe them in many vehicles and mediums.” We agree. Policies are important, and tools are important. But at the end of the day, good communication among the stakeholders who work with your actual data is the glue that holds your data quality together.

Custom Lead Scoring and How it Works

Lead scoring is a powerful sales feature offered in just about every marketing automation platform. Companies with a solid lead scoring methodology in place are better equipped to not only evaluate VOLUMES of customer prospects, but more importantly, they can zero-in on “sales-ready” leads, and turn them into opportunities!

So what exactly is lead scoring? Marketo defines lead scoring as “a shared sales and marketing methodology for ranking leads in order to determine their sales-readiness”. Essentially, leads receive a score (i.e., a points system or ranking: A/B/C/D, or perhaps “hot”, “warm”, or “cold”), based on the interest customers express in a company’s products or services, etc.

Working together, marketing and sales teams rely on these scores to determine where potential customers fit into the buying cycle.

Remember, a customer’s lead score is dynamic, requiring re-evaluation from time to time, based on a predetermined set of criteria. When marketing and sales teams operate from the same set of lead scoring rubrics, they position themselves to drive more efficient and effective marketing campaigns.

Setting up for success

But what if you’re new to marketing automation? How will you go about setting up a sustainable lead scoring strategy? How will you be integrating the information from your data into a viable scoring system? How will you measure lead successes or failures? There are many factors to consider, but before you jump right in, you’ll need to do your homework and get clear about your company’s unique marketing goals.

You can start by answering this question: Who is your target audience and what rules will you put in place to define your ideal buyer profile?

To help create your lead scoring model, we recommend that you start with a lead-scoring rules checklist that aligns with your needs, such as the one created by Marketo. When it comes time to evaluate your data, it’s very important that you consider a combination of explicit and implicit information: explicit scoring comes from what your prospects tell you directly about themselves, while implicit scoring is derived from your observations of, or inferences from, their behaviors.

With over 250 lead scoring rules, Marketo’s checklist does an excellent job of helping to identify:

  • Key demographics that pertain to your company’s individual goals (company info)
  • Behavior-based scores to consider (online behavior, email & social engagement)
  • Bad behaviors that warrant low or negative scores (unsubscribes, incorrect or erroneous information, spam detection)

Okay, so how do these rules work to determine lead scores? The rules can be as sophisticated and/or as complicated as you’d like, providing you the ability to create your own custom rules as well. Let’s walk through a couple examples.

Perhaps a strong lead score is assigned to someone who has: visited a specific product page, downloaded a white paper, engaged in a live chat session, AND then visited the pricing page. Conversely, you may give a low lead score to a lead who only visited a non-product page, like the Careers page. Leads with low scores might be sent back to nurturing or you might decide that they are not worth your time at all.
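Rules like these are often implemented as simple additive scoring: each observed action adds (or subtracts) points, and the total drives the hot/warm/cold decision. Here’s a toy sketch; the action names and point values are illustrative examples, not Marketo’s actual rules:

```python
# Toy additive lead scoring. Each observed action contributes points;
# negative values capture bad signals. All values here are illustrative.
RULES = {
    "visited_product_page": 10,
    "downloaded_white_paper": 15,
    "live_chat_session": 20,
    "visited_pricing_page": 25,
    "visited_careers_page": -10,  # non-buyer signal
    "unsubscribed": -50,
}

def score_lead(actions: list) -> int:
    """Sum the point values for a lead's observed actions."""
    return sum(RULES.get(action, 0) for action in actions)

# The "strong lead" from the example: product page + white paper
# + live chat + pricing page scores 10 + 15 + 20 + 25 = 70.
hot_lead = score_lead(["visited_product_page", "downloaded_white_paper",
                       "live_chat_session", "visited_pricing_page"])
```

A careers-page-only visitor would land at −10 under these example values, routing them back to nurturing or out of the pipeline entirely.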

Once you’re up and running, measuring successes and identifying opportunities should come from a combination of sources. Marketers can make well-rounded decisions going forward based on feedback from their sales teams, their customers, and company analytics. The goal is to pinpoint the content and actions that are successful in converting leads into customers.

Identify, validate, and clean it up!

So what’s the common denominator in this lead scoring equation? Without a doubt, it’s that you cannot do any of this without contact record data. Above all else, the contact data entering your lead scoring system must be genuine, accurate, and up to date.

As the number of data entry points increases, so does the propensity for erroneous data to corrupt your system. In other words, you need to know, in all stages of the buying cycle, that you are working with “real” people! Following the “garbage in, garbage out” principle, if mistakes are made and not caught at the point of entry, then your pipeline will be peppered with dead-end leads, wasting time and resources.

Unfortunately, marketing automation platforms do a poor job of ensuring the contact records captured are genuine, accurate and up-to-date. This is where Service Objects comes in. Our Lead Validation tools perform real-time checks to verify that the contact data entered is real. We also offer a free Data Quality Scan, which gives you a detailed report on the quality of your existing contact data and areas where it can be improved.

Five Elements of a Customer Success Program

Why focus on customer success? Retaining customers, maintaining customer loyalty, and getting new customers requires a holistic approach that goes beyond the basics of providing service, ensuring satisfaction, and resolving problems.

According to the Customer Success Association, “…it’s about customer relationship retention and optimization. And the most effective way to keep your customers is to make them as successful as possible in using your technology product.”

Customers who feel engaged and heard and who have experienced a real value in doing business with you are your true success stories. Their interactions at each point have been positive, both in terms of personal interactions with your team and in using your products or services. These are the customers who will remember your brand, who will tell their friends how wonderful your company is, and who will absolutely return because the relationship and the value they receive are both so strong.

Having a Customer Success Program clearly communicates to customers and prospects what they can expect if they buy your company’s product or service. Below are five key elements to consider when developing your Customer Success Program:

1. Commitment to customer service — Demonstrate that your company is deeply committed to customer service. While your marketing materials may tout your commitment, this is one area where actions speak louder than words. Include specific metrics – e.g. we will respond to support requests within 30 minutes, we have 24/7 support – and make sure they are adhered to and backed by a Service Level Agreement (SLA).

2. Company-wide buy-in — Customer service is no longer the realm of front office staff or the support center. Get buy-in from all departments in the company so that the customer has a positive experience whether they are talking to operations, engineering, IT, sales, accounting, etc. Having a centralized CRM database is important so each department can clearly see what is being communicated to the customer.

3. Assign a single point of contact — Establish a dedicated account manager to proactively communicate product updates, important features the customer may not be effectively utilizing, and serve as the main conduit of information for the customer regarding their account.

4. Proactive monitoring — Proactively monitor the customer’s account and alert them to any unusual activity to ward off potential complaints or unexpected account usage surprises. This is a great way to cultivate a “we are looking out for you” feeling.

5. Solicit feedback — Soliciting feedback — and responding to it — shows your customers that you value their insights and are listening. Make soliciting customer feedback a regular task, and respond promptly. It’s crucial to thank them for their feedback and address concerns. Have a regularly scheduled check-in call to address any issues or concerns that may have recently come up.

At Service Objects, our philosophy is customer service above all, and our Customer Success Program reflects this core value. Below are just some of the features our program includes:

  • 24/7 critical emergency support
  • Direct access to Product Engineers to discuss Best Practices
  • Guaranteed response times and server uptimes backed by a money back guarantee
  • Dedicated Account Manager
  • Priority customer support across our customer contact channels (phone, email, chat)

New Guidelines for SMS Authentication

With a seemingly ever-increasing amount of data breaches making headlines, companies are trying to be more vigilant than ever about making sure your accounts are secure. You’ve likely experienced some of these efforts firsthand. For example, one of the most common practices is 2-factor authentication. This is where a company will use two completely separate means of verifying that you are actually you and not someone attempting to impersonate you.

While there are various options as far as 2-factor authentication goes, one of the most common involves sending a text message to your phone that includes a one-time code that will expire within a few minutes. The theory here is that only you have the phone in your possession, so only you will receive that code. Thus, if the code is entered via a form on a website, you must be exactly who you say you are and not a criminal. If a criminal doesn’t have your phone, it’s impossible for them to pretend to be you, because the text message will never land in their hands!

This method of verifying your identity was considered bulletproof until recently. Now, some of the more sophisticated criminals out there have a workaround for certain types of phone numbers. These criminals can make it look as though they have your phone if — and it is a big IF — your number is a VoIP-type number such as a Skype, Google Voice, or Vonage phone number. If this is the case, an SMS text containing a 2-factor authentication code could be intercepted by the criminal and your account compromised.

The National Institute of Standards and Technology, also known as NIST, which is part of the Department of Commerce, is finalizing its Digital Authentication Guideline. One of the biggest changes NIST is suggesting attempts to patch this potential vulnerability whereby criminals could make it look like they have your phone. Its recommendation is to use 2-factor authentication with SMS notifications only when you know that the delivery phone number has been verified to be associated with an actual mobile network like Verizon, AT&T, or Sprint and not with a VoIP service such as Google Voice, Skype, RingCentral, Vonage, or Ooma.

This proposed change is a huge deal as many big companies such as Amazon, PayPal and Google have 2-factor authentication with SMS built into all sorts of routine interactions ranging from trial signups to password resets.

So, how can you verify a mobile phone number’s network status? Is it a plain old cellular phone number, or is it VoIP-based using a service such as Google Voice? Service Objects’ Phone Exchange 2 real-time API is the answer. Our phone verification API can tell you exactly whether a mobile phone number is a VoIP-type line or a genuine cellular number that can be trusted for 2-factor authentication.

Simply run the number through our service before sending an SMS authentication code. Our real-time phone verification API allows you to meet NIST’s Digital Authentication guidelines without having to sacrifice your investment in SMS authentication. If a phone number is validated as a regular cellular number, you have the green light to send verification codes to it via SMS. If it appears to be VoIP related, then you have the information you need to choose an alternate authentication factor.
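The gating logic itself is simple once you know the line type. The sketch below assumes a hypothetical lookup that has already returned a line-type label such as "WIRELESS" or "VOIP"; both the function and these labels are illustrative assumptions, not the service’s documented response fields:

```python
# Illustrative gate for SMS 2FA: only send SMS codes to true mobile lines,
# per the NIST guidance described above. The "WIRELESS"/"VOIP" labels are
# hypothetical stand-ins for whatever your phone-verification lookup returns.
def choose_second_factor(line_type: str) -> str:
    if line_type.upper() == "WIRELESS":
        return "sms"             # true mobile network: SMS code acceptable
    return "authenticator_app"   # VoIP or landline: use an alternate factor
```

In a sign-up or password-reset flow, you would run the verification lookup first and branch on its result before ever dispatching the SMS.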

The NIST Digital Authentication Guideline isn’t final yet, but it’s not too early to implement measures, such as mobile phone verification, to make sure your accounts are secure.

The Challenge of Storing International Addresses

Working with international address data can be difficult and confusing. Even when you have an application available to validate an address, and it tells you that it’s deliverable, you still have to deal with the chore of storing the resulting data. So when someone asks, “what’s the best way to store international addresses?”, what they are really asking is, “what’s the easiest way to store international addresses?”

The short answer to the “what’s the best?” question, as is so often the case, is that you’re asking the wrong question. Many of you who have worked with varying data sets before already know that you first need to ask yourself, “what do I intend to do with the data once it is stored?” What the data is used for should have the largest impact on how it is stored. Depending on your specific requirements, the way you store address data can vary greatly. For some, how you store your data may not be entirely up to you: you may have no control over the storage design, and are instead forced to work with the fields that are made available to you. Many users work with US-centric Customer Relationship Management (CRM) solutions designed with US address fields in mind, which can make storing international addresses all the more confusing and can potentially lead to data loss.

For those looking to simply print an address label for mail delivery, a single text field containing the complete formatted address will suffice. After all, why bother with breaking an address down to a mess of individual fragments if you’re not going to use them? Worse yet, what do you do when it comes time to put the pieces back together and you find that you don’t know how?

For some, correctly putting an address back together from its individual fragments might not be of great concern. The primary use of the data may be for some form of query analysis and/or organization. In which case you might be more concerned about which specific data type your individual fields should be or how to properly map these fragments. If you are implementing your own design then keep in mind that not all international addresses are necessarily parsed the same way, and you will need to consider if your design should be flexible enough to handle all international addresses or if you would prefer to go a country-specific route.

Mapping address fields

Consider this example of an address in England:

9 Gorse View
School Road
Knodishall, Saxmundham
IP17 1TS

If we include the country name, then the above address has five address lines; six if we split the third line. Now, let’s go ahead and attempt to store this address in our CRM. Most CRMs will contain the following address fields for a contact:

  • Address1
  • Address2
  • Address3
  • City
  • State
  • ZIP
  • Country

Depending on the CRM, we may have somewhere between five and seven address-related fields to work with. In the above example we have seven, so that should make things easy, right? We have more than enough fields, so there should not be any loss of data, but right away we see State and ZIP fields. These should be red flags that the storage was not designed for international addresses, but unfortunately, it is what we have to work with. Let’s go ahead and look at the parsed fields that we are likely to get back from an address validation solution:

Premise Number: 9
Dependent Street Name: Gorse View
Street Name: School Road
Dependent Locality: Knodishall
Locality: Saxmundham
Postal Code: IP17 1TS

In most cases, users will find that they can typically match Locality to City, Administrative Area to State, and Postal Code to ZIP. If you are unfamiliar with the address terms “Locality” and “Administrative Area” then please check out our previous blog, Five Commonly Used Terms and Definitions in International Address Validation Systems.

In the above example, you’ll notice that an Administrative Area equivalent was never provided. You’ll quickly find that this is quite common for many countries and that the locality is usually preferred. You’ll also notice that we have a dependent locality, which is a sub-region of the locality, and a dependent street name. It is important not to omit or lose these pieces of data if they are provided, as they offer additional detail/instruction on the whereabouts of an otherwise ambiguous address. So where to map them?

Luckily, our database design offers enough fields to accommodate these values, but keep in mind that this may not always be the case. In our example, we can map the premise number and dependent street name to Address1, the street name to Address2, the dependent locality to Address3, the locality to City, the postal code to ZIP, the country to Country, and leave State empty. However, even though we were able to successfully map every value to our CRM, it is still tedious and risky to try to handle all of the various address formats. And what course of action do we take when an address also includes a double-dependent locality or a sub-region?
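As a rough illustration, here is how that mapping might look in code. The parsed field names follow the example above; a real validation service’s output fields may differ:

```python
# Illustrative mapping of parsed international address fragments into
# US-centric CRM columns. State is left empty when no administrative
# area equivalent is returned, which is common outside the US.

def map_to_crm(parsed: dict) -> dict:
    # Premise number and dependent street name combine into Address1.
    address1 = " ".join(
        p for p in (parsed.get("PremiseNumber"),
                    parsed.get("DependentStreetName")) if p
    )
    return {
        "Address1": address1,
        "Address2": parsed.get("StreetName", ""),
        "Address3": parsed.get("DependentLocality", ""),
        "City": parsed.get("Locality", ""),
        "State": parsed.get("AdministrativeArea", ""),  # often absent
        "ZIP": parsed.get("PostalCode", ""),
        "Country": parsed.get("Country", ""),
    }

# The UK example from above:
uk = {
    "PremiseNumber": "9",
    "DependentStreetName": "Gorse View",
    "StreetName": "School Road",
    "DependentLocality": "Knodishall",
    "Locality": "Saxmundham",
    "PostalCode": "IP17 1TS",
    "Country": "England",
}
crm = map_to_crm(uk)
```

Note that this sketch silently drops any fragment it has no column for, which is exactly the data-loss risk discussed above.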

Missing state or administrative area equivalent

Let’s look at two more example addresses:

3-10-13 Ryoke
Urawa-Ku
Saitama-Shi 330-0072


5 Rue Sainte-Catherine
12000 Rodez

The first example is a Japanese address. Looking at it with American eyes, one might think that the first line is a premise number and street name, the second line the city, and the third line the state and postal code, or their equivalents. However, things work very differently in Japan. Streets are not commonly named or used in addresses; instead, Japanese addresses primarily use regions that can be thought of as districts. In the above example, Ryoke is a second-level sub-locality, Urawa is a first-level sub-locality, and Saitama is the locality. No administrative area equivalent is given; administrative areas are omitted from Japanese addresses as often as they are included.

In the second example, we have a premise number and street name in the first address line, and a postal code and locality in the second. Once again, no administrative area value is given. The address is in France, but many European addresses will follow this general format, and it is common for them to omit a first level administrative area. Therefore, it is highly recommended that you do not make an administrative area a required field. Doing so would mean rejecting valid addresses for entire countries.

Facing the challenge

As I mentioned earlier, when breaking an address apart we also run the risk of putting it back together incorrectly. So while no individual address fragment might be lost, we still risk losing the correct address order and format. Addresses and their various fragments and formats can vary greatly not just from country to country, but also within the same country. So what’s the point of it all? Is there no hope when it comes to international addresses?

If you are forced to use a set storage design and are unable to alter it then your best course of action may be to simply store the complete formatted address in a single field, if it can fit. If the complete address cannot fit in a single field, then split it into multiple fields when necessary. In general, storing the complete address should be your primary objective as it should contain all of the necessary information that you need. The complete address can always be parsed out later as needed. Storing the country and postal code should be next on your priority list, although not all countries use postal codes. Postal codes are very important and useful, so be sure to store them when they are available. Finally, look towards storing the locality and admin area if they are available.
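A minimal sketch of that fallback strategy, assuming a fixed per-field length limit (the limit and the line-based splitting are assumptions for the example):

```python
def store_formatted_address(address: str, field_len: int = 100) -> list:
    """Store a complete formatted address in one field when it fits;
    otherwise split it across fields, breaking on line boundaries."""
    if len(address) <= field_len:
        return [address]
    fields, current = [], ""
    for line in address.splitlines():
        candidate = f"{current}\n{line}" if current else line
        if len(candidate) <= field_len:
            current = candidate
        else:
            if current:
                fields.append(current)
            current = line
    if current:
        fields.append(current)
    return fields
```

Joining the resulting fields back with newlines reconstructs the original formatted address, so nothing is lost in the split.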

For those who will be implementing their own design, look to the output specifications of your validation solution. Most validation solutions will have a large list of address fields that cover the majority of the most widely used international addresses out there. You may consider it cumbersome, but if you include all of the output fields from your validation solution in your own design then you minimize the risk of losing data during the mapping process. You might not consider it the best way to handle storing international addresses, but unless you want to become an expert on the subject, it is definitely easier to use an existing design.

Clean Up Your Sales Pipeline: Lead Scoring vs. Lead Validation

The lead score simplified

As it stands, you are the one who decides how your lead scores are determined, most likely through rules you set up in your marketing automation platform. Your main objective is to send qualified leads to your sales team, and of course you want those leads to be “hot”! This is why lead scoring has become such a vital resource to marketing and sales teams: it prioritizes leads and weeds out dead-end and “junk” leads. Accurate lead scoring helps you make informed decisions about what actions need to be taken, whether manual or automatic, for each lead in the sales pipeline. From a conversion standpoint, lead scoring allows you to measure and qualify your leads based on:

1. Explicit Scoring—Looking at your leads to find out if they are the “right person” for your products or services, i.e., role/job title, company type/industry, company size/number of employees, company revenues, etc.

2. Implicit Scoring—Finding out if these leads are showing the “right level” of interest in your products or services, i.e., online behaviors and level of engagement of anonymous and known visitors, such as frequent visits to websites, form activity, email click-throughs, content downloads, etc.
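A toy sketch of how explicit and implicit signals might combine into a single score. The fields, weights, and point values are invented for illustration, not any real marketing automation platform’s schema:

```python
# Hypothetical lead-scoring rules combining explicit (who they are)
# and implicit (what they do) signals. All weights are made up.

def score_lead(lead: dict) -> int:
    score = 0
    # Explicit: is this the "right person"?
    if lead.get("job_title") in {"VP Marketing", "CMO"}:
        score += 20
    if lead.get("company_size", 0) >= 100:
        score += 10
    # Implicit: are they showing the "right level" of interest?
    score += 5 * lead.get("site_visits", 0)
    if lead.get("downloaded_whitepaper"):
        score += 15
    return score

hot = {"job_title": "CMO", "company_size": 500,
       "site_visits": 3, "downloaded_whitepaper": True}
```

A real system would maintain rules like these in the automation platform itself, but the principle is the same: each explicit or implicit signal contributes points toward a threshold that routes the lead to sales.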

According to Aberdeen Research, “companies that get lead scoring right have a 192% higher average lead qualification rate than those that do not”. That’s huge! Of course, you need support from your execs and your sales team to make it happen, and you all need to agree on the definition of a lead and on how marketing will hand off those leads to sales. More importantly, your success in lead scoring will depend on the technology that you’re using to capture information, orchestrate the lead handoff, and ultimately track and process feedback.

High lead scores need validation too

Even with the best lead-scoring systems in place, we are human, and humans make mistakes. What if a high-scoring, “hot” prospect enters an incorrect email address or phone number? How does your sales rep contact this user to close the deal? The truth is, they can’t. Even though this lead completed the form and scored highly, there wasn’t a solution in place to catch their mistake at the point of entry. Sadly, this lead now becomes “cold” or “dead”.

The key point here: just because a lead gets a high lead score DOES NOT mean it’s also a valid lead. These are two separate things altogether. To give you some perspective, let’s take a look at two hypothetical situations, one without lead validation and the other WITH a lead validation solution in place:

WITHOUT a lead validation solution

An iPhone 7 user visits a company’s home page, continues on to a service/product page, and then downloads their service whitepaper. During their six minutes on the site, the user also signs up to receive blog posts. Based on the rules set up in the marketing automation platform, this visitor is placed in a high-value lead drip campaign, where they’ll receive a set of emails specific to the service they are interested in, along with the benefits it brings. Marketing automation considers this a highly valuable, or “hot”, lead!

WITH a lead validation solution

That same iPhone 7 user completes the form from the above example as “Homer Simpson”, with a 555-555-1234 phone number and an AOL email address, with an IP address showing they are in China while their mailing address says Santa Barbara. Even though the user did all of the right things above, and at first received a high lead score, with the lead validation solution in place they will this time receive a low-quality lead validation score. At this point, marketing decides to remove this lead from the sales funnel so that it will have zero impact on future resources, reporting, and decision making. It is also important to note that the validation process is not created by the marketing team, meaning that it is not subjective.

Instead, it is a data-driven process, such as Service Objects’ Lead Validation solution, in which problems are not only identified but legitimate leads are also corrected in real time, at the point of entry. Validation cross-checks data points such as name, email, phone, address, IP address, and device to verify that the visitor provided accurate and genuine information. Ultimately, the goal is to determine whether this lead is in fact a real person.
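To make the idea concrete, here is a hypothetical composite check over those data points. The individual tests and the pass/fail logic are simplified stand-ins for a real validation service, which would score each component rather than apply crude rules:

```python
# Hypothetical composite lead-validation check. Each rule is a toy
# stand-in for a real validation component (name, phone, email,
# IP-vs-address geolocation).
import re

def validate_lead(lead: dict) -> bool:
    checks = [
        # Name: reject obviously bogus entries.
        lead.get("name", "").lower() not in {"homer simpson", "test test"},
        # Phone: 555-555 is a fictional exchange.
        not lead.get("phone", "").startswith("555-555"),
        # Email: minimal syntactic plausibility check.
        re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", lead.get("email", "")) is not None,
        # Geolocation: flag a mismatch between IP and mailing address.
        lead.get("ip_country") == lead.get("address_country"),
    ]
    return all(checks)

bogus = {"name": "Homer Simpson", "phone": "555-555-1234",
         "email": "homer@aol.com", "ip_country": "CN", "address_country": "US"}
```

The “Homer Simpson” lead above fails on three of the four checks, which is exactly why it earns a low validation score despite its high behavioral lead score.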

The takeaway for you: yes, lead scoring is a HUGE contributor to building a successful lead and sales pipeline, BUT it can only go as far as the accuracy of the data provided. Incorporating lead validation into your lead scoring system is a powerful marketing tool and will help you work toward that 192% higher qualification rate. Happy validating!

Use the Net Promoter Score to Ensure You Get a Good Customer Experience

Before you buy a product or service from a company – particularly one you may need customer support from – be sure to do some research and find out their Net Promoter Score (NPS). NPS is a metric that captures a company’s customer feedback and provides a numeric value of its brand loyalty. A high score is not easy to achieve, so if a company has a great score, they will be letting the world know about it!

NPS is calculated by soliciting feedback from customers about their experience with the company. Specifically, customers are asked the question “how likely is it that you would recommend a brand/product/service to a friend or colleague?” The answers are based on a scale from 0 – 10. Those that give a company a score of 0-6 are considered Detractors, those who give a score of 7-8 are considered Passives, and those that rate a company with a high score of 9-10 are considered Promoters. The final NPS score is calculated by deducting the percentage of customers who are Detractors from the percentage of customers who are Promoters (passives are not included in the calculation). Good Net Promoter scores vary by industry, but a score of 50 to 80 is typically considered “good”.
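The arithmetic behind the score is straightforward:

```python
def nps(scores: list) -> int:
    """Net Promoter Score: % Promoters (9-10) minus % Detractors (0-6).
    Passives (7-8) count toward the total but toward neither group."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 6 promoters, 2 passives, 2 detractors out of 10 responses: 60% - 20% = 40
assert nps([10, 10, 9, 9, 9, 10, 8, 7, 5, 3]) == 40
```

Because the score is a difference of percentages, it ranges from -100 (every respondent a Detractor) to +100 (every respondent a Promoter).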

Customers are also given the opportunity to respond to the question “what is the most important reason for your score?” Companies can use this specific feedback to reinforce best customer service practices, stop or improve procedures that are causing customer dissatisfaction, and give employees feedback directly from the customer.

Promoters are the people companies want to approach for testimonials, while Detractors can be the ones who post negative feedback or complaints on social media channels. Proactively following up with Detractors can potentially stop a negative experience from going viral.

Capturing NPS can also serve as an eye-opener for companies who assume that, because they have not received a large number of complaints, customers are satisfied. They are often surprised by what customers are truly thinking when given a chance to have their voice heard.

Relatively easy to use and deploy, NPS lets companies send existing and potential customers a signal that they are strongly committed to customer satisfaction and are proactively acting on feedback. Customers who see a company with a high NPS score know that the company is focused on customer loyalty and retention, and makes customer service a priority.

At Service Objects, one of our Core Values is “customer service above all”. We strive to achieve the highest level of customer satisfaction and regularly survey our customers for their feedback. The result of this commitment is that for the past three years, we have maintained a score 8 times higher than the technology services industry average! Our score of 64 is well above other well-known technology companies such as Oracle (21), MailChimp (20) and Constant Contact (20) and even above other customer service leaders such as (57), Netflix (52) and American Express (45), as rated by NPS Benchmarks.

Find out more information about Service Objects’ Customer Success Program here.

Bots Need Address Validation Too

Remember watching Star Trek as a kid and dreaming of talking to a computer throughout the day? Then PCs arrived, and while you couldn’t control them with your voice, information was at your fingertips. And then along came Siri and Cortana as well as other artificial intelligence technologies like chat bots. The future has arrived!

Though initially clunky and limited in their capabilities, chat bots are getting smarter and more human-like. Earlier this year, students taking a course online at the Georgia Institute of Technology found out that their friendly teaching assistant, Jill Watson, was, in fact, a chat bot and not a real person as they had believed all semester.

Siri, Cortana, and various transactional bots that appear when you order flowers and other services online are likely to play a more prominent role as you interact with businesses online. For example, you can already use Cortana on Windows 10, and macOS Sierra, which is now in public beta and expected to arrive in the fall, will bring Siri to the Mac. Not only will you be able to interact with Siri on your computer, but she’ll also have a direct link to Apple Pay. Developers, at long last, have been given access to Siri, which means you’ll soon be able to order and pay for products with a simple Siri command.

Amazon’s Echo audio device is another example of how technology is changing how we interact with computers. This device is always listening and ready to play music, look up information, give you a weather report, order pizza, read you a story, control smart home devices, and more — all with a voice command.

Star Trek had it right when it envisioned how we interact with computers. In one iconic scene, Scotty traveled back in time to contemporary 1986. He tried to talk to the computer, but given 1980s technology, he got no response. After trying to address it, someone handed him the mouse. He tried talking into the mouse. Again, no response. Finally, he was told to use the keyboard. How primitive can you get?


Scotty would be happy to know that we are finally approaching what he thought was the obvious way to interact with computers. Computer bots are finally not only understanding what we say but also taking lots of complex actions based on what we tell them to do!

Technology has progressed to the point where Facebook wants companies to forgo email and talk to Gen-Zers via chat bots. According to Facebook’s Developer News Post, How to Build Bots for Messenger, “…bots make it possible for you to be more personal, more proactive, and more streamlined in the way that you interact with people.”

Clearly, bots have a big job to do, and that job is getting bigger and more complex. Are they up to the task?

Unfortunately, there have been problems with bots not validating information correctly. Shortly after Facebook’s online demonstration of 1-800 Flowers’ chat bot integration with Messenger, users began posting their own awkward interactions with the bot. One user entered a delivery address multiple times, yet the chat bot continuously ignored the given address, prompting the user to enter an address again or choose from a list of buildings located halfway around the world.



So, while chat bots are getting better, smarter, and more prominent, Facebook’s Messenger and other companies using bots need to do a better job of fuzzy matching. Service Objects handles these delivery issues easily: the address would have been validated correctly, frustration-free.

Remember, as friendly and helpful as bots appear, they are not humans; they are computer programs. While a human may quickly recognize an address or apartment number even if it’s in a non-standard format, computers rely on the algorithms and databases they’ve been instructed to use — and it’s your classic case of garbage in, garbage out.

If a chat bot has been integrated with a quality Address Validation API such as Service Objects’, it will be able to instantly understand and recognize an entry as an address.

Now here’s where a chat bot has an advantage over a human: linked to an address verification API like our real-time address parsing software, a chat bot could instantly verify and correct address data as well as retrieve geocodes that pinpoint the exact address location on a map.

As smart and intuitive as bots are, they still need our help. The best way you can help your company’s chat bots is to link them to our address validation API. It’s easy and affordable, and it will deliver a superior customer experience, not to mention delivering those tulips to the correct address.

Your Own ‘Big Data’ is Silently Being Data Mined to Connect the Dots

With apps like Facebook and Waze, and the release of iOS 9, your cell phone is now quietly mining data behind the scenes, and you probably didn’t even realize it. Don’t be afraid, though. This isn’t Big Brother trying to watch your every move; it’s data scientists trying to help you get the most out of your phone and its applications, ultimately trying to make your life easier.

Here are a few things your phone is doing:

Data mining your email for contacts

Since it was released late last year, Apple’s newest iPhone operating system (iOS 9) searches through your email in order to connect the dots. For example, let’s say that you get an email from Bob Smith, and the signature line in the email gives his phone number. iOS 9 records this so that if his number calls you, and Bob isn’t in your contacts, Apple shows the number with text underneath that says “Maybe: Bob Smith”.

Apple was quick to point out that this automatic search is anonymous – not associated with your Apple ID, not shared with third parties, nor linked to your other Apple services, and you can turn it off at any time in your Settings.

Mining your data via Facebook’s facial recognition

Upload a photo with friends into Facebook and it will automatically recognize them and make suggestions for tagging them based on other photos of your friends.

When facial recognition first launched on Facebook in 2010, it automatically matched photos you uploaded and tagged your friends accordingly. This spooked so many users that Facebook removed the feature. They later brought it back, this time asking users to confirm the tag suggestions first. They also included the ability to turn it off altogether for those who thought it was still too “Big Brother”. You can turn it off via Facebook Settings -> Timeline and Tagging -> Who sees tag suggestions when photos that look like you are uploaded?

Waze crowd-sourced data mining for traffic

Google purchased Waze in 2013 for $1.3 billion, and people wondered, “why so much?” Quite simply: because of the data. Accepting the app’s terms when you install it means that, even when running in the background, the app sends Waze data on where you are and how fast you are driving. Waze has amassed a large enough user base that it has a constant stream of real-time traffic data: users are both the source of how fast traffic is moving on any given road at any given time and the beneficiaries of knowing how fast everyone else is going on all other roads, with no need for cameras or special sensors on the freeway. This meant Google could use the real-time data to make better maps and traffic projections, and re-route you based on traffic and incidents others had reported to Waze.

Here is a case where, if you had read the fine print of the app user agreement, you might have second-guessed your download. But like nearly everyone else, you probably didn’t read it, and you are now avoiding traffic for the better.

Un-connecting the dots

Sometimes Big Data will have connected the dots, but you’d like to undo the connection. A recent article in the New York Times gave examples of how people managed breakups on social media:

‘I knew that if we reset our status to “single” or “divorced,” it would send a message to all our friends, and I was really hoping to avoid a mass notification. So we decided to delete the relationship status category on our walls altogether. This way, it would disconnect our pages quietly. In addition, I told him I planned to unfriend him in order to avoid hurt feelings through seeing happy pictures on the news feed.’

As ‘Big Data’ connections become more prevalent, luckily, so too are the tools that help undo the connections they make. Facebook’s “On This Day” feature, for example, lets you filter out specific friends so that you aren’t shown memories of exes that you’d rather not see.

Here at Service Objects, we are constantly looking at connecting the disparate dots of information to make data as accurate and as up-to-date as possible. Whether scoring a lead the second a user fills out their first online form on your website or preventing fraudulent packages from being delivered to a scam artist’s temporary address, having the freshest, most accurate data allows companies to make the best decisions and avoid costly mistakes.

In Customer Service Chat, You Have to do More Than Answer

Customer service chat is popular with companies and customers alike. It’s easy, it’s quick, and it works well on mobile devices. But easy and popular doesn’t always equal good. Read this chat with customer service agent “Jack” at Vizio. It is a set of customer service blunders, large and small.

Here’s the chat transcript

Visitor: Hi I just bought a 50″ M501d-A2R tv. i am trying to set it up. I can’t put in the password to my wifi because my password is longer than the number of characters allowed. I don’t want to reset my password on my Cisco cable router. Can you help?

Jack: Here at VIZIO we pride ourselves in providing best in class U.S. based support. I’m happy to assist you today. How many digits is your wireless password?

Visitor: 26 digits

Jack: The TV will support up to 22 digits. Unfortunately the password would need to be shortened to work with our TVs.

Visitor: Hmm i am not glad to hear that

Jack: I apologize for the inconvenience.

Visitor: Ok. Please email me a transcript of this chat. Thank you.

Jack: You’re welcome! You will receive a copy of this chat transcript as soon as the chat window has closed. Thank you for chatting with VIZIO today. If you have any questions feel free to contact our support team at 1-877-878-4946, online at, or email us at! We would also like you to join VIZIO Fandemonium today to earn points and win prizes only at Thanks again, and have a great day.

Here’s how this chat needs to be improved:

Stop the chest-thumping about being US-based. This should NOT be the first thing Jack says to the customer. In fact, Jack shouldn’t say this at all. It doesn’t matter whether Vizio’s support is based in the US. The customer wants a high-quality chat. He wants a quick, correct, complete answer. Jack’s first statement really causes problems because the support he provides isn’t worth the company’s pride and it isn’t best-in-class. The cultural elitism of this statement is really unattractive, especially given the poor quality of the chat.

Use the customer’s name. The impersonal use of “Visitor” rather than the customer’s name clashes with the parts of the chat that are quite good. Some of Jack’s replies are specific and personal. For example, when he asks, “How many digits is your wireless password?”, it is clear he’s read what Visitor has written. The chat system should be configured to use the customer’s name. Why would any customer service organization want to refer to a customer by an anonymous term?

Be sincere. I was really sad when Jack laid down the classic customer service trope: “I apologize for the inconvenience.” In this case, this statement is insincere and unnecessary. There’s no need for an apology because neither Jack nor Vizio has done anything wrong. And it’s a true service failure to simply apologize when the customer needs help solving the problem.

Help the customer. Don’t merely answer the customer’s question. Visitor got an answer to his question about the length of his password: his is four digits too long. But Jack never helped him. Even if Jack can’t actually help Visitor reset the password on the Cisco router, he should have written something like, “Refer to the user guide that came with your Cisco router to find instructions on how to reset and shorten your password…”

Omit the marketing. Vizio clearly thinks, “We’ve got Visitor’s attention, so let’s pitch him Fandemonium.” But this pitch doesn’t belong in this chat, especially given the poor service Vizio has provided. And it’s not good marketing copy, either. Points? For what? Prizes? What kind? What’s In It For Me?

Reprinted from LinkedIn with permission from the author. View original post here.


Editor’s Note: Service Objects prides itself on customer service and tech support for effective resolutions to all questions, issues and inquiries. We’re always striving to improve our customer support, and have found chat to be an integral part of our everyday communication with those who visit our site seeking answers to their data validation problems.

Author’s Bio: Leslie O’Flahavan, Principal, E-WRITE

As E-WRITE owner since 1996, Leslie has been writing content and teaching customized writing courses for Fortune 500 companies, government agencies, and non-profit organizations. Leslie is a frequent and sought-after conference presenter, a faculty member at DigitalGov University, and the co-author of Clear, Correct, Concise E-Mail: A Writing Workbook for Customer Service Agents. Leslie can help the most stubborn, inexperienced, or word-phobic employees at your organization improve their writing skills.


What is a Data Quality Scan and How Can it Help You?

Marketing Automation and Your Contact Records: A Five Part Series That Every Marketer Can’t Miss (Part 5)

Putting Your Contact Records to the Test!

Now that you’ve read our series on how the quality of contact record data impacts the performance of your marketing automation efforts, we are hopeful that you better understand the importance of correcting and cleaning up these records. The next step is learning how your contact records measure up. To help, Service Objects offers a complimentary Data Quality Scan, providing a snapshot of where your data stands in real time and where improvements can be made.

How does the Data Quality Scan work? It’s simple: we use a small sample of your data (up to 500 records) to find out not only where your data is “good”, but more to the point, what percentage of your data is bad and/or correctable. You’ll receive a data quality score for each of the five critical lead quality components:

  • Email address
  • Phone number
  • Name
  • Mailing address
  • Device

Check out a sample Data Quality Scan to see how each of these components are scored and the detailed information provided.
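As a rough illustration of how component scores might roll up into one figure, here is a sketch. The component names mirror the list above, but the equal weighting and the 0-100 scale are assumptions for the example, not how the actual scan is scored:

```python
# Illustrative roll-up of per-component data quality scores into a
# single overall figure. Weights and scale are invented for the sketch.

def overall_quality(component_scores: dict) -> float:
    """Average the five component scores into one 0-100 figure."""
    components = ["email", "phone", "name", "mailing_address", "device"]
    return sum(component_scores.get(c, 0) for c in components) / len(components)

scan = {"email": 90, "phone": 70, "name": 95,
        "mailing_address": 60, "device": 85}
```

A real report would also break out, per component, which records are bad versus merely correctable, which is where the actionable detail lives.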

Once you get your results, you will have an opportunity to work with a Data Quality expert (yes, a real, live person) who will help you decide how best to correct and improve your contact records. From start to finish, think of them as your personal guide to improving your contact record data quality.

Leave Your Worries Behind

All in all, the Data Quality Scan is here to help marketers expose data quality problems within their companies, and we know that the percentage of bad records is likely to be around 25%. It certainly wouldn’t hurt to give it a try and see for yourself, right? At the very least, ongoing data validation and correction services should become a priority within your marketing automation best practices.

And finally, before we sign off on this series, I wanted to leave you with this thought: When it comes to marketing, you have plenty to worry about. Constantly verifying and updating your marketing automation contact records should NOT be one of those worries. Implementing a data governance strategy around your contact records will ensure your records are always accurate, genuine, and up to date!

If you’re new to this series, start from the beginning! Read Part 1: Make Marketing Automation Work for You


Contact Data Governance in Action: It’s All About Validation & Verification

Marketing automation and your contact records: A five part series that every marketer can’t miss (part 4)

“Well-integrated and accurate customer data is one of the best assets marketers have at their disposal to effectively personalize and engage customers, drive conversion rates, boost loyalty and trust, and ultimately maximize sales.” (Vera Loftis, UK managing director of Bluewolf, Salesforce experts)

Bottom line: you want the very best long-term outcome for your marketing campaigns, yes? The goal is to build a sustainable marketing automation platform, and in the world of data integrity, validation and verification are the keys to unlocking its success. This needs to happen at the beginning: at the point of data capture and when migrating to a new marketing automation or CRM platform. Here’s the opportunity you’ve been waiting for!

Migrating data between platforms allows for a fresh, clean start. This is an ideal time for your Data Quality Gatekeeper to launch a preemptive strike against bad data by validating and verifying it. Essentially, they need to make sure that the fields being imported are accounted for BEFORE importing the data: mailing addresses, phone numbers, and email addresses can all be validated and brought up to date, and demographic and firmographic information can be appended to the corresponding contact records. And let’s not forget about data formatting and purging duplicate records. This is the time for remediation.

Implementing the right tools

“To handle source data issues upfront, organizations need to employ a powerful, automated discovery and analysis tool that provides a detailed description of the data’s true content, structure, relationships, and quality. This kind of tool can reduce the time required for data analysis by as much as 90 percent over manual methods.” (Jeffrey Canter, Executive VP of Operations at Innovative Systems, Inc.)

Now that you’ve followed our recommendations and have your gatekeeping system, or data governance strategy, up and running, perhaps you’re thinking, “What’s the best way to do regular data quality check-ups and maintenance from here on out?” Honestly, you need a systematic solution in place that you can trust and rely upon: one that will continually provide THE most accurate, genuine, and up-to-date data for your company, AND one that, once it’s set up and running, will let you get on with other important jobs. Seriously, it should be as simple as “set it and forget it”.

Look no further! Service Objects provides a comprehensive set of solutions that will validate and verify your data as it’s being migrated or captured in real-time. Let’s take a look at just some of our Data On-Time Solutions (DOTS) you can choose from:

Lead Validation: A real-time lead validation service that verifies contact name, business name, email, address, phone, and device while providing an actionable lead score.

Email Validation: Verifies that the email addresses in your contact and lead database are deliverable at the mailbox level. It corrects and standardizes addresses while assessing whether they are fraudulent, spam traps, disposable, or catch-alls.

Address Validation: Uses USPS® certified data to instantly validate, parse, correct, and append contact address data, including locational meta-data and the option for Delivery Point Validation (DPV).

NCOA LIVE: Keeps your contact mailing list accurate and up-to-date with data from the USPS® National Change-of-Address database (NCOALink).

Email Insight: Appends demographic data to emails from a proprietary consumer database with over 400 million contacts, supplying valuable information such as the city, geolocation, age range, gender, and median income of the address owner.

GeoPhone: Gives you accurate reverse telephone number lookup of US and Canadian residential and business names and addresses.

Phone Append: Supplies you with the most accurate lookup of landline telephone numbers available.

This is quite a lot of information to digest for sure. So, how do you know “what” solutions will work best for YOUR platform? Depending on your individual needs, our free Data Quality Scan will determine where your contact data might be falling short. Then you can take steps to clean up your contact records for good.

Up next: Part 5: What is a Data Quality Scan and How Can it Help You?

Service as a Differentiator

Service quality is one of the most misused concepts in business. You can’t see it or smell it. It is hard to quantify except in hindsight, even though there are real live academic journals about it. If you look at company websites, they will all tell you that theirs is great, of course – often replete with pictures of smiling attractive people with headsets. But in real life, it is often one of the greatest differentiators between companies.

Here is a personal example. Once I purchased a laptop computer, in part because its manufacturer touted its service and replacement policies. This was important to me, given my frequent travel. But in reality, any service issues I had were met by indifference, bad answers, and “Sorry, the part’s out of stock. Dunno when it will be in.”

So in one of the great ironies of my career, I later visited a consulting client on the West Coast – an organization with an excellent service reputation – and discovered that they shared a parking lot with this laptop maker. And one morning I made it a point to come in early and watch everyone come in to work. My client’s employees were engaged and chipper, while the other company’s employees trudged in with their heads down like they were marching off to jail.

Now, back to my original point about service quality. This laptop maker had the same support automation tools as most people. They clearly had CRM systems and interactive voice response queues. And their support policies, at least on paper, were a cut above their competitors. But they couldn’t deliver what they promised. Clearly, at this company, support was a cost to be reduced as much as possible. And soon after they reduced their costs to zero, because they lost market share and exited the market.

So what really creates good service quality? A marriage of the right policies AND the right systems. When I managed a 24/7 tech support center, here were some of the factors that led us to have both near-perfect customer satisfaction scores and near-zero turnover:

  • Our team constantly educated itself. We devoted an average of over three weeks per year to product and skills training, versus an industry average of less than one week.
  • We constantly benchmarked customer experience. From the way people were greeted to the oversight we gave to inbound cases, we were constantly aware and constantly improving.
  • We measured quality first and productivity second. Did you know that service metrics often kill service quality? When an agent is measured on how quickly they resolve a call, they will be quick, by golly – even if you get sent packing with bad answers. Our agents were rewarded for keeping customers happy and working as a team, and only coached for performance when it varied far from the norm.
  • We had service standards that met the needs of our customer base. From personal assigned support representatives to 24/7 access, we delivered what a high-end audience in a mission-critical environment needed.
  • We realized that service was delivered by human beings. Which meant that we went out of our way to keep employees happy, whether it was plenty of individual recognition and professional development, or a team hiring strategy that let people have a say in who joined “the club,” or annual best practices workshops where team input led to real policy change.

All of these mechanics – most of which never show up on a company’s website – are why service leaders like Disney, Southwest Airlines or my former employer deliver a very different service experience from their competitors. Making it happen requires planning, execution, and a mindset that steers people away from whatever is cheapest or most expedient in the moment. Above all, it is one of the most powerful and cost-effective business strategies an organization can adopt.


Editor’s Note: Service Objects was founded around many of this author’s service principles. From the expertise of our staff, to our fanatical commitment to 99.995% uptime, to our 24/7 customer service, we invest in strategies that lead to a tangible difference in customer experience.

Author’s Bio: Rich Gallagher is a former customer support executive and practicing psychotherapist who heads Point of Contact Group, a training and development firm based in Ithaca, NY. He is the author of nine books including two #1 customer service bestsellers, What to Say to a Porcupine and The Customer Service Survival Kit. Visit him at

Data Breakdown Happens. Know the Reasons “Why” and Protect it from the Start

Marketing automation and your contact records: A five part series that every marketer can’t miss (part 3)

Welcome back! The fact that you’re interested in improving the data quality within your marketing automation platform is apparent, so let’s look at how the data breakdown happens so we can build it back up. The goal is to find sustainable and trustworthy ways to correct and effectively manage data from here on out.

So who’s responsible for ensuring good data quality? Your IT department? If only it were that simple. The fact is, your TEAM is responsible, and it starts with a company that fosters a culture of data quality and best practices. Senior managers must do their part to integrate and maintain solid data quality policies that are easy to follow and implement. Next, marketing managers must screen and clean up new lists before importing them. Finally, your sales team must practice diligence when entering customer data. And even if ALL of these groups commit to following these guidelines in the most stringent and cautious manner, you can bet that mistakes will still happen! We are, after all, human.

Now that we’ve identified the players, we can pinpoint what makes our data corrupt:

  • Inaccurate data: either info is entered incorrectly, or it becomes outdated
  • Duplicate data: multiple accounts are mistakenly set up for the same lead, i.e., some companies do not centralize their data.
  • Missing data: some fields are simply left empty
  • Non-conforming data: inconsistent field values, i.e., using various abbreviations for the same thing
  • Inappropriate data: data entered in wrong fields
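
As a rough illustration, here is a small Python sketch that flags a few of these problem types in a contact list. The field names and rules are hypothetical, chosen only to show the idea:

```python
# Illustrative sketch: flag common data problems in a contact list.
# Field names and rules here are hypothetical, for demonstration only.

def audit(records, required=("name", "email"), state_field="state"):
    issues = {"duplicate": 0, "missing": 0, "non_conforming": 0}
    seen_emails = set()
    valid_states = {"CA", "NY", "TX"}  # toy whitelist of conforming values
    for r in records:
        email = r.get("email", "").strip().lower()
        if email and email in seen_emails:
            issues["duplicate"] += 1       # same lead entered twice
        seen_emails.add(email)
        if any(not r.get(f, "").strip() for f in required):
            issues["missing"] += 1         # required field left empty
        state = r.get(state_field, "")
        if state and state.upper() not in valid_states:
            issues["non_conforming"] += 1  # inconsistent abbreviation
    return issues

records = [
    {"name": "Ann", "email": "ann@example.com", "state": "CA"},
    {"name": "Ann", "email": "Ann@Example.com", "state": "Calif."},  # dup + bad state
    {"name": "",    "email": "bo@example.com",  "state": "NY"},      # missing name
]
print(audit(records))  # {'duplicate': 1, 'missing': 1, 'non_conforming': 1}
```

Even this toy audit shows why case-folding matters: “Ann@Example.com” and “ann@example.com” are the same lead.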

Would you agree that the four most common data fields in your lead forms are: name, email, address, and phone? According to Salesforce:

(Infographic: Salesforce statistics on changing contact data)

These statistics highlight just how difficult it is to keep current with changing customer demographics. It’s overwhelming, especially when you consider the rate at which data is being captured. Fortunately, there IS a better way!

Turning strategy into action

We’ve discussed how even the best and most efficient internal processes can be in place and yet cannot completely wipe your data clean of corruption. Oh, if only… BUT there are additional steps you can take to continue moving in the right direction. You could implement a Data Governance strategy, which “refers to the overall management of the availability, usability, integrity, and security of the data employed in an enterprise. A sound data governance approach includes a governing body or council, a defined set of procedures, and a plan to execute those procedures”. Here’s what you can do:

  • Conduct regularly scheduled data quality check-ups & maintenance scans: By working with an external source to automatically scan, identify, and update your data, you will save money and headaches in the long run. Send in the cavalry!
  • Bump up personal accountability: Consider training your sales reps and then monitoring their data input via their login info. Like Big Brother, but less invasive.
  • Hire data quality managers: They have oversight of all data input, monitoring, and management. Think of them as Data Quality Gatekeepers – like a mini task force.

Congratulations! You are well on your way to stopping the bad-data-cycle. And the most important step in doing so is: running regular data check-ups and continued maintenance. So how do you set your business up for success? Come back for the 4th part in our series where we show you how to ensure your records are genuine, accurate and up-to-date. It’s as easy as one…two…THREE!

Be sure to start from the beginning! Check out Part 1: Make Marketing Automation Work for You

If you are interested, Service Objects provides a free Data Quality Scan that will give you insight into the quality and accuracy of your contact data. Click here to get your free scan.

Up next: Part 4: Contact Data Governance in Action: It’s All About Validation & Verification


5 Commonly Used Terms and Definitions in International Address Validation Systems

When dividing the countries of the world into regions and sub-regions for the purpose of Address Validation, it is important to find common ground and to use a set of widely adopted terms and definitions.

In the United States of America, (US), we commonly use the terms city, state and zip code when referring to addresses. While that may mostly work for a country like Mexico (MX), it is not appropriate for other countries like Japan (JP) where the country is divided into prefectures instead of states. Not all countries call their sub-region divisions the same thing and many countries have several levels of sub-divisions. To further complicate the matter, not all sub-division levels are necessarily interchangeable from one country to another. For example, a first level sub-region in the US is a state, such as California (US-CA), but a first level sub-region for the United Kingdom of Great Britain and Northern Ireland (GB) is a country, such as England (GB-ENG).

Every country can have its own particular set of terms and definitions; to try to go over them all would be too complicated and inefficient. Instead, let’s go over some commonly used terms that are helpful when talking about international addresses.

Country Code

An alphabetic or numeric code used to represent a country. Various types of country codes exist for different particular uses, but the most commonly used codes come from the ISO 3166 standard. Part one of this standard, ISO 3166-1, consists of the following code formats:

  • ISO 3166-1 alpha-2 – a two-letter country code.
  • ISO 3166-1 alpha-3 – a three-letter country code.
  • ISO 3166-1 numeric – a three-digit country code.
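
To illustrate, here are the three formats side by side for a few of the countries mentioned in this post, with a small lookup helper:

```python
# A few ISO 3166-1 entries for the countries discussed above, shown in
# all three code formats: (alpha-2, alpha-3, numeric).
ISO_3166_1 = {
    "United States":  ("US", "USA", "840"),
    "Mexico":         ("MX", "MEX", "484"),
    "Japan":          ("JP", "JPN", "392"),
    "United Kingdom": ("GB", "GBR", "826"),
}

def alpha2_to_alpha3(alpha2):
    """Translate a two-letter code to its three-letter equivalent."""
    for a2, a3, _num in ISO_3166_1.values():
        if a2 == alpha2:
            return a3
    return None

print(alpha2_to_alpha3("JP"))  # JPN
```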

Postal Code

An alphabetic, numeric, or alphanumeric code, sometimes including spaces or punctuation, that is commonly used for sorting mail. Commonly referred to as the postcode. Country-specific terms include ZIP code (US), PLZ (DE, AT, and CH), PIN code (IN), and CAP (IT).
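
As a sketch, deliberately simplified regular expressions for a few of these formats might look like this. Real postal formats, especially GB postcodes, have many more edge cases than these patterns capture:

```python
import re

# Deliberately simplified postal-code patterns -- real formats
# (especially GB postcodes) have many more edge cases.
POSTAL_PATTERNS = {
    "US": r"^\d{5}(-\d{4})?$",                   # ZIP or ZIP+4
    "DE": r"^\d{5}$",                            # PLZ
    "IN": r"^\d{6}$",                            # PIN code
    "IT": r"^\d{5}$",                            # CAP
    "GB": r"^[A-Z]{1,2}\d[A-Z\d]? \d[A-Z]{2}$",  # rough postcode shape
}

def looks_like_postal_code(country, value):
    pattern = POSTAL_PATTERNS.get(country)
    return bool(pattern and re.match(pattern, value.strip().upper()))

print(looks_like_postal_code("US", "93101-1234"))  # True
print(looks_like_postal_code("GB", "SW1A 1AA"))    # True
```

Shape checks like these only tell you a value could be a postal code; confirming that it is a real, deliverable one requires reference data.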

Administrative Areas

The regions into which a country is divided. Each region typically has a defined boundary with an administration that performs some level of government functions. These areas are commonly expected to manage themselves with a certain level of autonomy. Various administrative levels exist, ranging from “first-level” to “fifth-level”. The higher the level number, the lower its rank in the administrative hierarchy. For example, the US is made up of states (first-level), which are divided into counties (second-level) that consist of municipalities (third-level). For comparison, the United Kingdom (GB) comprises the four countries England, Scotland, Wales, and Northern Ireland (first-level). These countries are made up of counties, districts, and shires (second-level), which in turn are made up of cities and towns (third-level) and small villages and parishes (fourth-level). Other common terms for an administrative area are administrative division, administrative region, administrative unit, administrative entity, and subdivision.
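
One practical consequence is that software handling international addresses is better off modeling administrative areas as an ordered hierarchy rather than fixed “state/county” fields. A minimal Python sketch (the example data is illustrative):

```python
# Illustrative sketch: an address's administrative areas modeled as an
# ordered hierarchy of (level, local term, name) tuples, so the same
# structure fits different countries even when their level names differ.
US_EXAMPLE = [
    (1, "state",        "California"),
    (2, "county",       "Santa Barbara County"),
    (3, "municipality", "Santa Barbara"),
]
GB_EXAMPLE = [
    (1, "country", "England"),
    (2, "county",  "Greater London"),
    (3, "city",    "London"),
]

def first_level(areas):
    """Return the highest-ranked (level 1) administrative area name."""
    return min(areas, key=lambda a: a[0])[2]

print(first_level(US_EXAMPLE))  # California
print(first_level(GB_EXAMPLE))  # England
```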


Locality

In general, a locality is a particular place or location. More specifically, a locality should be defined as a distinct population cluster. Localities are commonly recognized as cities, towns, and villages, but they may also include other areas such as fishing hamlets, mining camps, ranches, farms, and market towns. Localities are often lower-level administrative areas, and they may consist of sub-localities, which are segments of a single locality. Sub-localities should not be mistaken for the lowest-level administrative area of a country, nor should they be confused with separate localities.


Thoroughfare

In general, a thoroughfare is a transportation route from one location to another. On land, it most commonly refers to a type of road or route used by motorized vehicles, such as a street, avenue, or highway.

Is Email Dying?

A Quora discussion recently got us thinking about the status of email in our day and age. As a company, we know how important email is, since we verify hundreds of thousands of emails a day. It’s clearly evident to us that people are still using it. But conversations like this made us wonder if email is actually on the decline after all. New task management tools, messaging apps, and other email alternatives seem to emerge every day, each promising to rid us of the need for email, but will these tools eventually replace it altogether?

It turns out that not only is email not dead, it’s actually growing. It’s the way we’re using it that’s changing. Experts predict that by 2019, the number of email accounts will increase 26% to 5.59 billion. Consider even more statistics below:

  • 88% of B2B marketers say email is the most effective lead generation tactic
  • Marketers consistently ranked email as the single-most-effective tactic for awareness, acquisition, conversion, and retention
  • 42% of businesses say email is one of their most effective lead generation channels
  • 122 billion emails are sent every hour

Given these numbers, email is clearly not dying. It is more important than ever before. Anecdotal evidence supports this, too. For example, think about how you use email in your own life. You subscribe to interesting newsletters, you get your receipts emailed to you, your bills arrive in your inbox, you get alerts from your bank, you correspond with friends and family members, customers contact you via your website and their messages arrive in your inbox. The list goes on and on…

Several new task management tools that intend to “replace” email still rely on you to sign up with your email address, and use your email for updates and notifications. If anything, these new tools and services are just a way to leverage and build off of your email, but certainly not replace it. Likewise, whenever you sign up for a new service, that service requires your email address. Your email address will serve as a communications channel between you and the service provider as well as, potentially, your username.

While social media, instant messengers, and online collaboration tools offer an alternative to email, they don’t come close to its ubiquity. Just about every Internet user has an email address, but just a fraction have SnapChat, Slack, or Asana accounts.

Email is alive and well, and it’s effective. However, email marketing is suffering from a common ailment you need to be aware of: bad data. For example, 88% of users admit to entering incomplete or incorrect data on registration forms. This is troubling for many reasons, especially because a recent study found that 74% of users become frustrated when websites display irrelevant content. How can you personalize your marketing and create a better experience when the data users give you is junk?

Junk data, indeed. Whether users type the wrong address by mistake, check the wrong boxes in your web forms, or fail to notice that auto-correct has changed their entries, this bad data means your marketing and customer outreach efforts will fall flat. It’s hard to make a good impression when you’re addressing them by the wrong name or sending mail to a non-existent address.

In contrast, when you capture correct data in lead forms and on eCommerce sites, not only will your marketing automation and CRM platforms have correct data in them, you’ll also be able to personalize their experience with your company and brand.

In a world where personalization can make a customer feel welcome and appreciated, you need to get good data — even if that customer actually provides you with incorrect data.

Service Objects can catch bad data and clean it in real time with our data quality tools. These data validation tools instantly compare entered data against a massive database containing millions of verified phone and address records, automatically validating, correcting, and appending the data to ensure that you have current, accurate information. Check out how cool cleaning data is with the email validation slider below. Here we show an example of a bad email address in, and it’s corrected version out:

(Slide back and forth to view Before/After)

Email isn’t going anywhere, but if you want to ensure deliverability of your messages, you need good data. Our data quality validation tools are a must for any business that communicates with leads, prospects, and customers using email.

Get a free trial key today and see just how easy it is to clean bad data.

Building Intelligent Applications with Applied Machine Learning

Intelligent systems are ubiquitous in our daily lives, from facial recognition in cameras to product recommendations in e-commerce. These modern conveniences and others are enabled through machine learning: the process by which an application’s behavior is learned from prior experience rather than dictated by hand-written rules. Recent polls estimate that up to 50% of businesses are now using machine learning in some capacity, with an additional 22% roadmapping projects that incorporate it. Fueling this growth is the massive amount of data produced by today’s connected systems, which provides valuable insights for decision making. In addition, companies such as Amazon, Google, and Microsoft provide APIs that make machine learning algorithms more accessible.

Learning style

Machine learning algorithms can be categorized by learning style: supervised and unsupervised. Supervised learners use training data paired with labels to train the machine to yield the desired output. A supervised learning algorithm is used when the output classes are already known, such as whether an email is spam or not. An unsupervised learner, on the other hand, is given no labels and must determine classes by finding structures or patterns in the data.
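
The difference can be made concrete with a toy example in plain Python (no ML library): a supervised nearest-centroid classifier trained on labeled points, followed by an unsupervised pass that groups the same points with no labels at all:

```python
# Toy illustration of the two learning styles (plain Python, no ML library).

# Supervised: labeled training data -> learn one centroid per class,
# then classify a new point by its nearest centroid.
def train_centroids(points, labels):
    centroids = {}
    for label in set(labels):
        members = [p for p, l in zip(points, labels) if l == label]
        centroids[label] = sum(members) / len(members)
    return centroids

def classify(centroids, x):
    return min(centroids, key=lambda label: abs(centroids[label] - x))

# e.g. a 1-D "spam score" feature: low values were labeled ham, high ones spam
points = [0.1, 0.2, 0.9, 1.0]
labels = ["ham", "ham", "spam", "spam"]
centroids = train_centroids(points, labels)
print(classify(centroids, 0.85))  # spam

# Unsupervised: no labels provided -- split the same points into two
# groups by a threshold midway between the extremes (crude 1-D clustering).
threshold = (min(points) + max(points)) / 2
clusters = [int(p > threshold) for p in points]
print(clusters)  # [0, 0, 1, 1]
```

Real learners work in many dimensions with far richer features, but the division of labor is the same: labels in, classifier out versus structure discovered from the data alone.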

Machine learning in practice

Choosing the right algorithm is the task of a data scientist, whose responsibility is to find insights in the data and decide which learner is appropriate. Service Objects takes the guesswork out of finding the right algorithm with DOTS Name Validation. Its latest operation, NameInfoV2, classifies a provided name as a person, a business, or unknown. Sophisticated learners coupled with a proprietary database of millions of names yield a highly accurate system for classifying and scoring a name. This helps clients improve efficiency in sales and marketing operations by knowing whether a provided name is in fact a person or some other entity.
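
For intuition only, here is a crude heuristic sketch of the person/business/unknown decision. This is emphatically not the NameInfoV2 algorithm, which uses trained learners and a large proprietary database rather than hand-written rules like these:

```python
# Crude heuristic sketch of person/business/unknown classification.
# NOT the NameInfoV2 algorithm -- just an illustration of the kind of
# decision the service automates with real learners and real data.
BUSINESS_TOKENS = {"llc", "inc", "corp", "ltd", "co", "company", "services"}
KNOWN_FIRST_NAMES = {"john", "maria", "wei", "aisha", "carlos"}  # toy sample

def classify_name(name):
    tokens = [t.strip(".,").lower() for t in name.split()]
    if any(t in BUSINESS_TOKENS for t in tokens):
        return "business"
    if tokens and tokens[0] in KNOWN_FIRST_NAMES and len(tokens) >= 2:
        return "person"
    return "unknown"

print(classify_name("Acme Widgets LLC"))  # business
print(classify_name("Maria Gonzalez"))    # person
print(classify_name("Xyzzy Q."))          # unknown
```

Rules like these break down quickly on real-world names, which is exactly why a trained classifier over millions of examples outperforms them.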

Adapting to future needs

Building intelligent applications that can adapt to the needs of our users is paramount to ensuring our clients are receiving the highest quality data. With more intelligent applications on the roadmap, Service Objects edges closer to realizing its mission of ensuring every contact data point is as accurate as possible.

Why We Geek Out Over Name Validation

What’s in a name? Everything — especially if you’re trying to connect with customers and prospects. If you’re emailing, mailing, or calling someone and you have her name wrong, you’ve already lost her.

The importance of name validation APIs

Name validation is becoming increasingly important in the modern world, where social media and the Internet allow for faster-than-ever propagation of bad data. For example, as people opt into various offers, it’s not unusual for auto-correct to change their entries, for a typo to occur, or for a person to enter a bogus name. On other occasions, a name that looks fraudulent, and is labeled as such, could really be a legal name. This is the case for this man who legally changed his name to Fire Penguin Disco Panda:


Companies wanting to avoid potentially embarrassing situations, like putting a bad name on a piece of mail or removing a perfectly good contact whose name they think is fraudulent, should consider using a service like DOTS Name Validation, an essential ingredient in marketing automation, business databases, CRMs, and the like. Not only does name validation perform helpful changes such as parsing names into individual fields, fixing the order of names, and returning the gender of the individual, but our name validation API also runs a variety of checks to ensure the name isn’t a bogus, celebrity, or vulgar name.
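
Two of those fixes, parsing a name into fields and normalizing “Last, First” ordering, can be sketched in a few lines of Python (the real service handles vastly more cases than this toy does):

```python
# Illustrative sketch of two of the fixes described above: parsing a
# name into fields and normalizing "Last, First" ordering. The real
# DOTS Name Validation service handles far more cases than this.
def parse_name(raw):
    raw = raw.strip()
    if "," in raw:                      # "Smith, Jane" -> reorder
        last, first = [p.strip() for p in raw.split(",", 1)]
    else:                               # "Jane Smith"
        parts = raw.split()
        first, last = parts[0], " ".join(parts[1:])
    return {"first": first, "last": last}

print(parse_name("Smith, Jane"))  # {'first': 'Jane', 'last': 'Smith'}
print(parse_name("Jane Smith"))   # {'first': 'Jane', 'last': 'Smith'}
```

Multi-part surnames, honorifics, suffixes, and non-Western name orders are exactly where naive parsers like this fall over.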

Updated name validation scoring algorithms

We recently pushed a major update to our name validation service, including many international names as well as massive improvements to our scoring algorithms. Our name validation database now has almost 5 million first and last names in it.

Our scoring algorithms are where the service truly shines. Even when we get an obscure name that we are not sure about, we look to our algorithms to separate the unknown from the bad. This is where our team likes to geek out. We enjoy thinking of new ways to combine results to identify complex names.

Here’s where we get geeky

We love to get creative with our name validation service. We spend time poring over lists of celebrities, vulgar names, and any crazy, goofy thing we can think of.

What are some of the things we are interested in? We love unusual names. For example, should we consider the names Anakin and Khaleesi as valid now that people are actually naming their babies after these characters? And you can imagine the fun we’ve had talking about Anita Bath and Warren Peace.

We track a lot of vulgar and goofy-type potential names, but what about alterations to those? For example, we might nail the name Hugh Jass, but what about similar names like Hue Jass, Hugh Jazz, Hou Gass, or Hue G. Azz? What if someone submits the name Bob Ba$$? Could we figure out that the intended name should be Bob Bass?
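
One building block for catching such alterations is folding common symbol substitutions back to letters before any further checks. A minimal sketch (the names and blocklist here are toy examples, not our real lists):

```python
# Sketch: fold common symbol substitutions back to letters before
# checking a name against a list of known prank names. The real service
# combines many such signals; this shows only the substitution-folding idea.
SUBSTITUTIONS = str.maketrans({"$": "s", "0": "o", "3": "e", "1": "i", "@": "a"})

def intended_name(name):
    # "Bob Ba$$" -> "Bob Bass"
    return name.translate(SUBSTITUTIONS)

def is_known_prank(name, blocklist={"hugh jass"}):
    return intended_name(name).lower() in blocklist

print(intended_name("Bob Ba$$"))    # Bob Bass
print(is_known_prank("Hugh Ja$$"))  # True
print(is_known_prank("Hue Jass"))   # False -- needs phonetic matching too
```

Variants like “Hue Jass” slip past pure substitution folding, which is where phonetic and fuzzy matching have to take over.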

What if a name is submitted that should not be a name like “House on the corner” or perhaps the name of a business instead of a person? These sorts of things can be tricky to identify in an automated system, but our team lives for solving these kinds of problems.

We let our inner geeks out so that we can anticipate and flag bogus, prank, and unusually challenging names. Though our name validation software uses algorithms to score and validate data, they’re powered by both artificial and human intelligence.

Increase Your Company’s Worth With Friendly APIs

Customer relationship management (CRM) software has become a powerful business tool. With a CRM tool such as Salesforce, Microsoft Dynamics CRM, Infusionsoft, or Oracle CRM On Demand, all customer data — including interactions and insights — is centralized for easy access and management.

As powerful as CRM applications may be, they are built for a single purpose: customer relationship management. Integrations, such as a data validation plug-in, can extend the functionality of CRMs, but only if APIs are created and, more importantly, well documented, so that third-party developers can actually understand them.

CRM Basics

CRMs provide a centralized location for storing customer data and interactions. Early CRM applications, such as ACT! and Telemagic, were basically databases that stored customer phone and address data along with notes about the customer, recent orders, and so forth. Typically shared over a network, these programs allowed other employees to review contact notes as needed. They were also commonly used to create mail merge documents.

Today, modern CRMs are hosted on the cloud and loaded with robust contact relationship management, marketing automation, and social media features. These sophisticated applications are tightly focused on managing the customer’s journey. That’s what they’re designed for, and that’s what they do best. They are purposefully built to this end.

It’s a huge undertaking to create a piece of software to manage both the customer data (names, addresses, and contact information) and every single interaction across a multitude of channels. Small or large, CRM developers maintain a laser focus on their core product and its purpose. They’re concerned about making sure their software lives up to its core promise. They’re not necessarily concerned about extending their software to accommodate various users’ wish lists.

How User-Friendly APIs Ultimately Improve CRMs

While a CRM may have a plethora of tools built into it, the possibilities become endless when the CRM has an API that can be used by third party developers. For example, when Service Objects is able to integrate with a CRM’s API, we are able to create a data validation plug-in to clean up, standardize, and validate the data contained within the CRM.

This is valuable to everyone involved including:

  • The CRM developer — They don’t necessarily have the time or desire to add functions like data validation because their priorities are focused on the core product. With an API, valuable functions can be added without the developer having to expend resources on them.
  • The third party developer — Third party developers benefit by being exposed to the CRM developer’s customer base.
  • The end user — End users are happy to have external tools available through their company’s CRM platform where they can easily add the unique functions they want.

Creating an API for developers opens the door to new possibilities. A company like Service Objects can use the API to access the client’s data within the CRM, validate it, and then push it back in. With data validation plug-ins, the process is seamless for end users, the data quality improves, the business can operate more efficiently with less waste, and operating costs go down.
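
The read-validate-write-back loop just described can be sketched as follows. Note that `fetch_contacts`, `validate_email`, and `update_contact` are hypothetical stand-ins, not a real CRM or Service Objects API:

```python
# Hypothetical sketch of the read-validate-write-back loop described
# above. fetch_contacts, validate_email, and update_contact are
# stand-ins for a real CRM API client and validation service.
def clean_crm_contacts(fetch_contacts, validate_email, update_contact):
    updated = 0
    for contact in fetch_contacts():
        corrected = validate_email(contact["email"])
        if corrected != contact["email"]:     # the service corrected it
            update_contact(contact["id"], corrected)
            updated += 1
    return updated

# Toy stand-ins demonstrating the flow:
store = {1: "pat@gmial.com", 2: "lee@example.com"}
fixes = {"pat@gmial.com": "pat@gmail.com"}

def fetch_contacts():
    return [{"id": cid, "email": email} for cid, email in store.items()]

def update_contact(cid, email):
    store[cid] = email

count = clean_crm_contacts(fetch_contacts, lambda e: fixes.get(e, e), update_contact)
print(count, store[1])  # 1 pat@gmail.com
```

Because the loop only touches records the validator actually changed, it stays cheap to run on a schedule.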

But there’s a catch: an API has to be available for a developer to use — complete with meaningful and current documentation. Integrations, such as a data validation plug-in for CRM, are magnitudes easier if the API documentation is up to date and organized.

We implore API creators to work hard to make a good API and supporting documentation. Doing so helps us all, and, most importantly, it helps all of our clients.

The Inevitable Switch to IPv6 or: How I Learned to Procrastinate, Because if the World Ends Tomorrow I Won’t Have to Do It

Despite being created as a replacement for IPv4 back in December of 1995, an official world launch of IPv6 did not come about until June 6th, 2012. Maybe somewhere in the back of our minds we really did think that 2012 would be the year the world ended, so everyone just decided to procrastinate? Whatever the reason for the delay, IPv6 appears to be slowly but surely picking up steam.

Roughly 10% of all users who access Google are doing so over IPv6

Check out Google’s adoption statistics page to see how IPv6 adoption has grown over time. There is also a map of IPv6 adoption per country. While overall IPv6 adoption may only be at 10%, countries such as the United States, Portugal, and Greece are ahead of the curve with a little over 20% adoption. Belgium, however, is leading the way with approximately 40% IPv6 adoption. According to an article by Iljitsch van Beijnum at Ars Technica, if the current adoption trend continues then we should see 100% worldwide adoption by the summer of 2020, which at the time of this writing is only 4 years away. If you are interested in learning more about IPv6 and IPv4, I highly recommend reading the article.

Slow business adoption and security concerns

If we take a closer look at Google’s IPv6 adoption graph we see a distinct trend where IPv6 usage spikes on the weekends. This would suggest that more people are using IPv6 at home than they are at work. Many of the world’s major Internet Service Providers (ISPs) pledged to start switching to IPv6 back in 2012, and so far it appears that they have for the most part stayed committed to their promise. Most businesses, on the other hand, made no such promises, and for good reason.

IPv6 is not backwards compatible, so you can only communicate with other IPv6 adopters on a 100% IPv6 network connection. If any part of the connection between the source and the destination does not support IPv6 then it will fail, in which case a failover connection via IPv4 should be made. So immediately we see two reasons for why businesses may not be jumping on to IPv6:

1) The IPv6 infrastructure and user base is still in its infancy.
2) IPv6 adopters will also support IPv4, so why bother setting up IPv6 on your end if you can still use IPv4?
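The fallback behavior described above can be sketched in a few lines. Real clients use the Happy Eyeballs approach (RFC 6555); this simplified version just orders getaddrinfo()-style results so that IPv6 endpoints are attempted before IPv4 ones, with IPv4 left as the failover.

```python
import socket

# Prefer IPv6, fall back to IPv4: sort addrinfo tuples so AF_INET6
# entries are tried first. A real client would attempt connections in
# this order and use the first one that succeeds.

def order_addresses(infos):
    """Sort (family, type, proto, canonname, sockaddr) tuples,
    AF_INET6 entries first."""
    return sorted(infos, key=lambda info: info[0] != socket.AF_INET6)

# getaddrinfo-style tuples using documentation-reserved addresses:
infos = [
    (socket.AF_INET, socket.SOCK_STREAM, 6, "", ("192.0.2.10", 80)),
    (socket.AF_INET6, socket.SOCK_STREAM, 6, "", ("2001:db8::10", 80, 0, 0)),
]
ordered = order_addresses(infos)
print(ordered[0][0] == socket.AF_INET6)   # True: IPv6 endpoint tried first
```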

How we currently combat spam and malicious activity

There is also a myriad of concerns associated with switching to IPv6, but let’s look past the initial concerns of migration cost and complexity. Let’s say that we have already made the migration and opened the doors to IPv6 traffic.

We are now in the growing pains stage. The internet can be a scary place, filled with malicious bots and users. Have you ever seen a Distributed Denial of Service (DDoS) attack? The visualizations can be quite mesmerizing, but the reality can be very damaging. How do you feel about spam, of the email variety and not the canned food? If you are like most people, you probably hate spam, and if you are responsible for managing a mail server or firewall, you probably REALLY hate it.

To admins and hackers alike, IPv6 is just another vulnerability waiting to be exploited. So why take the chance? Not everyone is so worried, though.

Currently, the popular methods for fighting spam and other malicious activity are statistical and reputation-based techniques along with blacklists. These methods are IP-version agnostic and can be used by businesses that have adopted IPv6. However, new and existing businesses that try to switch to IPv6 may find themselves locked out of standard, crucial features they depend on, such as SMTP, FTP, and/or UDP. IPv6 was built from the ground up to be inherently more secure than IPv4, but some ISPs block critical features for everyone rather than risk letting a single malicious user run amok.

Switching entirely to IPv6 is not worth the extra work

Even with IPv6’s almost limitless number of IP addresses, ISPs will group many users together under the same small address space instead of segregating them into their own pools. Some ISPs have learned the hard way that this is not a good idea, because an entire address block can get blacklisted over one bad actor. ISPs now know that packing users into the same small blocks is dangerous, but instead of changing their deployment model, many have opted to simply lock things down until more businesses complain or a better solution arises. Since IPv6 is still relatively new, ISPs and businesses haven’t quite figured out all of the best practices yet. The overall community consensus, for now, appears to be that IPv6 is just not worth the extra effort.

Eventually, the IPv4 address market space will saturate

There is little to no incentive for businesses to switch to IPv6 until the IPv4 address space reaches near-detrimental saturation. This is not to say that IPv6 adoption will not continue to grow, because it will. As more mobile devices hit the consumer market and the ‘Internet of Things’ continues to expand, IPv6 adoption will not only grow but become necessary. However, protocols such as SMTP for sending mail will likely remain on IPv4 because of the community’s reluctance to support them on IPv6. Many ISPs are already recommending that their IPv6 clients use third-party mail provider services instead of configuring their own mail servers as they normally would.

IPv6 adoption will likely grow first and foremost for device support and domain hosting, but services built on protocols other than HTTP will likely hold onto IPv4 for as long as they can, most likely until better support and security become available or a better solution presents itself.

Opinionated Software – Choosing Your Vision

In software development, there are two common approaches to architectural design. One school of thought, the un-opinionated approach, makes software as agnostic and flexible as possible, leaving the developer free to decide how to design and solve problems. The other holds that the best software, opinionated software, should realize one true vision: the “right way” to do things. This design paradigm suggests software should pick a side and stick to it. Following the opinionated approach, design decisions have already been made, limiting the options available to the developer.

Opinionated Software

Some examples of highly opinionated software include Ruby on Rails, AngularJS, and Ember. These packages share the common characteristic of making certain tasks simple for the developer who follows the predesigned path, sometimes referred to as the “Golden Path”. This approach is advantageous when a problem or task maps directly onto one of these predesigned paths. It can, however, present challenges when functionality outside of a package’s design is desired, requiring additional effort from the developer.

Un-opinionated Software

In contrast, un-opinionated software such as the .NET technology stack offers the developer freedom in design choices. These include which language to work in, be it C#, F#, VB.NET, or virtually any other .NET-compatible language. Flexibility allows the developer to choose the right tool for the task. The potential downside is that with so much flexibility, it may be difficult to develop a solution the framework does not assist with, leaving less experienced developers with sub-par solutions.

Our Vision

Our vision is to offer developers flexibility in how they choose to integrate our services. Naturally, since our services are exposed over HTTP, they can be integrated with any language or platform that supports HTTP connections. The services our clients consume follow best practices in architecture and input/output structure, ensuring flexibility in request/response formats as well as simplicity in obtaining results. We are also bound by the WSDL contracts we deliver, which ensures consistency in response format.
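Because the services are plain HTTP, integration can be as simple as building a request URL. The endpoint and parameter names below are placeholders for illustration, not the documented interface.

```python
from urllib.parse import urlencode

# Sketch of an HTTP integration: build a GET request URL for a
# validation service. BASE_URL and the parameter names are hypothetical.

BASE_URL = "https://api.example.com/AddressValidate"   # placeholder endpoint

def build_request_url(street: str, city: str, state: str, license_key: str) -> str:
    params = {"Address": street, "City": city, "State": state,
              "LicenseKey": license_key}
    return BASE_URL + "?" + urlencode(params)

url = build_request_url("123 Main St", "Anytown", "CA", "YOUR-KEY")
print(url)
```

Any stack that can issue this request and parse the JSON or XML response can integrate, which is the point of staying un-opinionated at the transport layer.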

Leveraging SSD (Solid-State-Drive) Technology

Our company recently invested in SSD (solid-state-drive) arrays for our database servers, which allowed us to improve the speed of our services. As you likely know, it’s challenging to balance cost, reliability, speed and storage requirements for a business. While SSDs remain much more expensive than a performance hard disk drive of the same size (up to 8 times more expensive according to a recent EMC study), in our case, the performance throughput far outweighed the costs.

Considerations before investing in SSD


As we researched our database server upgrade options, we wanted to make sure that our investment would yield both speed and reliability. Below are a couple of considerations when moving from traditional HDDs to SSDs:

  • Reliability: SSDs have proven to be a reliable business storage solution, but transistors, capacitors, and other physical components can still fail. Firmware can also fail, and wayward electrons can cause real problems. As a whole, HDDs tend to fail more gracefully in that there may be more warning than a suddenly failed SSD. Fortunately, Enterprise SSDs are typically rated at twice the MTBF (mean-time-between-failures) compared to consumer SSDs, a reliability improvement that comes at an additional cost.
  • Application: SSDs may be overkill for many workloads. For example, file and print servers would certainly benefit from the superior I/O of an SSD storage array, but is it worth the cost? Would it make enough of a difference to justify the investment? On the other hand, utilizing that I/O performance for a customer-facing application or service would be most advantageous and likely yield a higher ROI. In our case, using SSDs for data validation databases is a suitable application that can make a real difference to our customers.

How SSDs have improved our services

Our data validation services rely on database queries to generate validation output. These database queries are purely read-only and benefit from the fastest possible access time and latency — both of which have been realized since moving our data validation databases to SSD.

SSDs eliminate the disk I/O bottleneck, resulting in significantly faster data validation results. A modern SSD boasts random data access times of 0.1 milliseconds or less, whereas a mechanical HDD takes approximately 10-12 milliseconds or more. That is the time it takes to locate the data that needs to be validated, making SSDs over 100 times faster than HDDs. By eliminating the disk I/O bottleneck, our data validation services can take full advantage of the fast QPI/HT interconnects used by modern CPU and memory architectures.
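The claimed speedup is easy to sanity-check from the access times above:

```python
# Back-of-envelope check: ~0.1 ms per SSD random read versus ~10 ms
# per mechanical HDD seek, and what that means for a multi-lookup query.

SSD_ACCESS_MS = 0.1
HDD_ACCESS_MS = 10.0

speedup = HDD_ACCESS_MS / SSD_ACCESS_MS
print(f"SSD random access is ~{speedup:.0f}x faster")

# A validation request that needs 5 random lookups:
lookups = 5
print(f"HDD: {lookups * HDD_ACCESS_MS} ms vs SSD: {lookups * SSD_ACCESS_MS} ms")
```

At 10-12 ms per seek, the HDD figure is the floor; queuing under load makes the real-world gap even wider.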

10 Data Analytic Tools To Help You Better Understand Your Marketing ROI

A global review of data-driven marketing by GlobalDMA and the Winterberry Group found in its survey of 3,000 marketing professionals that nearly all recognize the importance of data in advertising and customer experience efforts, with over 77% saying they’re confident in the practice and its prospects for future growth. That study also noted that spending on data-driven marketing grew for 63% of respondents in 2013, and 74% said that growth would continue this year.

Reliable data analysis is essential if marketing dollars are to be spent on initiatives delivering the highest ROI. If you’re a business owner questioning your own marketing ROI, there are a number of new analytics tools now at your disposal. To better understand your company’s marketing spend, check out the following roundup of ten analytics tools:

YouTube Channelytics

If YouTube is part of your audience outreach plans, check out YouTube Channelytics. Their helpful data analysis interface lets you do everything from check your subscriber rates to understand your top traffic sources and audience device preferences. Know if most of your views are coming via mobile devices, analyze your view counts, and even monitor view counts for specific time periods. What an awesome tool to monitor your video marketing ROI, right?


TinyMetrics
If Instagram is an essential component of your content marketing strategy, consider adding TinyMetrics to data analysis plans. TinyMetrics offers a number of powerful tools including hashtag, audience, and time analytics. Know the best times to post, track follower counts, and maximize the potential of your audience engagement opportunities. TinyMetrics offers handy export capabilities, making it easy to share your visual marketing analytics with your team.


Whatagraph
If you’re overwhelmed by the volume of Google Analytics data at your disposal, consider incorporating Whatagraph into your data-diving quest. Whatagraph lets you create infographic reports from your Google Analytics data. Easily understand numerous data points including site visits, bounce rates, and top-performing site pages.

Esper
If you analyze how you are spending your time across multiple marketing platforms, you might develop a better understanding of how to improve your efficiency. Esper offers calendar analytics for both Google calendars and Microsoft Exchange calendars. Used by the likes of Uber and Dropbox, Esper lets you analyze your calendar by event categories and garner insightful data as to where your time might be better optimized. Create detailed data visualizations for a clear picture of your current time expenditures.

Gahint
Gahint will help you understand fluctuations in visits to your website. Available from KPI Watchdog, gahint uses your Google Analytics data to offer detailed visitor information based upon trigger events. Understand increases/decreases in organic traffic as well as social referral fluctuations.


ChangeAgain
For those who frequently A/B test their marketing efforts, ChangeAgain is an interesting option to explore. Fully integrated with Google Analytics, ChangeAgain lets you analyze everything from bounce rates to page views and session length. If you’re going to be A/B testing your marketing anyways, doesn’t it make sense to use the power of Google Analytics to better understand your test results?


Saydoc
If your company is utilizing email marketing for customer outreach, investigate Saydoc for document analytics. Saydoc offers data analytics on everything from time-on-page to real-time open rate information. You can use Saydoc to create time-limited, permission-based documents and incorporate electronic signatures into your email outreach efforts.


If you are making a concerted effort to connect with mobile-enabled consumers, consider integrating a QR code platform into your mobile marketing efforts. These tools let you create QR codes for mobile-enabled consumers and help you understand interactions with your codes thanks to built-in analytics. Create a QR code from a URL, text input, or even from your favorite social media interface. Some platforms even let you create a mobile landing page from a QR code. Who knew QR code marketing was so easy?


Ziplr
If you want to optimize everything from your email marketing to your social media outreach, consider adding Ziplr to your growth marketing plans. Ziplr lets you create custom URLs for every link you share and analyze interactions with your links thanks to detailed analytics. Monitor interactions across multiple social networks and email marketing campaigns, understand which geographic locations are most fruitful for your brand, and which types of devices your audience is using. If you’re sharing links anyways, doesn’t it make sense to understand audience interactions with those URLs?


Service Objects

Bad data has far-reaching negative impacts on the costs and performance of your marketing and sales campaigns. With Service Objects data quality tools, we can identify and correct bad data in your marketing platform, resulting in significantly improved performance of your marketing and sales efforts. Service Objects’ data correction and enrichment services ensure that your marketing lists and contact records are accurate, genuine and up-to-date, allowing you to focus on converting and selling.


Integrating a number of data analysis tools into your growth plans not only helps you understand your audience, it helps you understand where your marketing efforts could be better optimized. If data analysis reveals a higher percentage of audience engagement is coming from email marketing versus social media marketing, you can tweak your outreach strategy to focus more resources on email campaigns.

Understanding which methods are producing the highest value for your business allows you to adjust your marketing efforts on an ongoing basis. With powerful data analytics tools at your disposal, you never have to worry about throwing good money after bad on outreach efforts that aren’t producing significant ROI for your business. Do you think you’ll be investigating any of the aforementioned analytics resources for your company?


The True Cost Of Bad Data In Your Marketing Automation

Inbound marketing and marketing automation platforms promise to make your marketing more effective, and they have the potential to live up to that promise. However, reality often tells a different story — especially when bad data plays a starring role.

Marketing automation platforms like Eloqua, HubSpot, Marketo, and iContact are great tools that can help you connect with your leads and customers. But they are just that: tools. The idea of marketing automation is promising, but poor execution and bad data will limit your success.

The cost of bad data

You pay for every contact residing in your marketing database. If your data quality is bad, you are wasting time and money. Data quality suffers for several reasons. Some data starts out clean before going bad due to address or phone number changes. Meanwhile, it’s not uncommon for users to enter bogus information into lead and contact forms. For example, 88% of online users have admitted to lying on online contact forms.

Bad email addresses mean your messages never arrive as intended. The same is true of bad postal addresses, with wasted postage and shipping costs added on top, and bad phone numbers waste your sales and marketing teams’ time on calls to bogus numbers. Improving the data accuracy within your marketing automation platform could save a ton of money.

How much money is at stake? It’s more than you may realize. Applying Deming’s 1-10-100 rule, it costs $1 to prevent bad data, $10 to correct it, and $100 if it is left uncorrected. So, if you had just 10 bad records, that would be $1,000 wasted. Chances are, you have far more than 10 bad records in your marketing automation software: approximately 10 to 25 percent of B2B contact records contain critical errors.
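The 1-10-100 arithmetic can be captured in a tiny cost model:

```python
# Deming's 1-10-100 rule as a quick cost model: $1 to prevent a bad
# record, $10 to correct it, $100 if it is left uncorrected.

COST_PER_RECORD = {"prevent": 1, "correct": 10, "ignore": 100}

def data_quality_cost(bad_records: int, action: str) -> int:
    """Total dollar cost of handling bad_records with the given action."""
    return bad_records * COST_PER_RECORD[action]

print(data_quality_cost(10, "ignore"))    # 10 uncorrected records: $1,000
print(data_quality_cost(10, "correct"))   # correcting them instead: $100
```

Run the same numbers against 10-25 percent of your own database size and the stakes become obvious.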

Moreover, using bad data has a cascading effect on the organization. Not only are you expending valuable resources to capture leads, each lead, whether good or bad, takes up a “seat” in your marketing automation plan — with each seat costing money.

The cost to contact bad leads is real. The more obvious costs include printing and postage for direct mail, and outbound calling, which averages about $1.25 per attempted call. Even email costs money, albeit not much (roughly $0.0025 per email), but it adds up over time if left uncorrected.

There’s more to data accuracy than cost savings alone

Looking beyond obvious costs, it is important to understand the cascading impact of bad data on other areas of your business. For example, even with the latest and greatest real-time CRM or marketing platform, if the data is bad, your CSRs will begin to doubt the effectiveness of the platform. This can lead to a lack of confidence in your data, poor morale, and poor performance.

Another example is the impact on your marketing intelligence reports and decision making. Marketing to bad leads will result in “false-negative” data. Since these leads do not respond (because the data quality is bad), your marketing campaigns’ performance will be dragged down.

If you don’t like throwing money away, causing undue stress for your team, or making decisions based on bad data, improving the data accuracy of your marketing automation software can go a long way toward solving these problems. If that’s not compelling enough, consider this: clean records improve contact conversion rates by 25 percent.

Service Objects can help ensure that the promise of marketing automation becomes a reality in your inbound marketing strategy. Our data quality tools correct and improve the data in marketing automation platforms, resulting in better performance. Benefits include reduced cost per lead and cost per sale, more reliable performance data, increased contact rates, increased response rate, reduced cost to contact, and more sales.

Isn’t it time you banish bad data from your Marketing Automation Platform?




Is Your Business Prepared to Handle New Tax Rates on Millions of Different Addresses?

With every new year, tax season quickly approaches. Rates change all the time across the United States and Canada. Every month brings new tax rates across cities, counties, states, provinces, and even smaller local districts. There are even more changes each quarter, with January bringing the most tax rate changes of them all.

In addition to new tax rates, boundaries and borders are constantly changing. New ZIP codes are formed, and boundaries of ZIP codes, cities, and counties change monthly. All of these changes can wreak havoc on retailers and businesses who must calculate, collect, and disperse sales and use taxes. An address that was in one jurisdiction on one day could be in another the next day — and it’s your responsibility to calculate and collect the correct taxes despite all of these changes.

With myriad local, state, and provincial sales and use tax jurisdictions, applying the correct tax rates is challenging for businesses of all sizes. It’s virtually impossible for the average business to keep on top of ever-changing tax rates affecting them and their customers without some sort of assistance. For companies that do business across the United States and Canada, sales tax software is an absolute must due to the sheer volume of tax jurisdictions involved and the constant changes. Even small businesses can benefit from a sales tax API as local sales tax rates can fluctuate from one city to the next — and even within a given city if special districts have been created.
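To illustrate why jurisdiction-level data matters, here is a toy lookup in which the combined rate is the sum of state, county, and city components. The rates below are made up for illustration; real rates come from a maintained tax database.

```python
# Toy sales tax lookup: combined rate = state + county + city components.
# The component rates here are fabricated example numbers.

RATE_COMPONENTS = {
    ("CA", "Ventura", "Oxnard"):    [0.0725, 0.0025, 0.0100],
    ("CA", "Ventura", "Camarillo"): [0.0725, 0.0025, 0.0000],
}

def combined_rate(state: str, county: str, city: str) -> float:
    """Sum the jurisdiction components for one validated address."""
    return round(sum(RATE_COMPONENTS[(state, county, city)]), 4)

# Two cities in the same county can owe different totals:
print(combined_rate("CA", "Ventura", "Oxnard"))
print(combined_rate("CA", "Ventura", "Camarillo"))
```

The lookup key is the rub: without a validated address you may not even know which city or district a customer is really in.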

It’s not just about getting the most current sales tax rates each year, quarter, or month. It’s also important to have a system that collects the most current address data at least once a month. Even if your customers do not move, their ZIP codes may have changed, or a new special district may have been formed in their area. With the most accurate address information for your customers, you will be better able to comply with sales and use tax requirements. Not only does our address validation service validate your address data, we are constantly updating our database to reflect the latest tax changes, including ZIP code, boundary, and rate changes.

The team at Service Objects anticipates these changes each month and quarter. We knew January would bring a host of changes, as the new year always does. We worked extremely hard incorporating the new changes for January 2016 into our system. 15 different states — Alabama, Arkansas, Arizona, California, Colorado, Georgia, Florida, Illinois, Montana, Minnesota, New Mexico, Ohio, Oklahoma, Nevada, and South Dakota — had new tax rates. In all, over 350 cities and over 60 counties across the United States had new rates. This could mean that hundreds of thousands, perhaps even millions, of addresses now have new tax rates.

If you don’t have sales tax software or current, validated addresses, how will you know whether you’re applying and collecting the correct rates? The short answer: you won’t.

Our address validation and sales tax software solutions solve this problem every month. Sign up for a free trial today.

Using Email Newsletters to Accelerate Growth – 7 Must-Know Tips for Business Owners

According to a 2015 report from eMarketer, research conducted by Econsultancy found that more global agency marketers (79%) rated email marketing effective than any other channel. Email marketing outranked SEO, social media, and content marketing for businesses wanting to grow their brand. If you’re a business owner hoping to capitalize on the revenue-driving success of email newsletters, there are a few crucial factors to consider. Before you integrate email outreach into your business-building efforts, make sure you’re well aware of the following email newsletter tips for maximum ROI:

Use a Real “From” Email Address

Don’t send your email newsletters from a ‘no reply’ address. The whole point of sending newsletters is to boost engagement rates. Opting for a ‘no reply’ address immediately tells subscribers you’re not interested in hearing their feedback.

Test, Test, and Then Test Some More

A/B test your newsletter content and subject lines. Avoid words and content styles that will trigger email spam filters. Using all caps, too many exclamation points, and words like prize and sweepstakes can instantly send your emails directly to recipients’ junk folder.

Make it Worth Their While

Think of your recipient first. You’re taking up their valuable time by sending a newsletter. They were kind enough to subscribe to your email content. Do everything in your power to ensure they don’t unsubscribe from your newsletters. Is the content of your newsletters valuable to you or to your subscribers?

Also, make sure your newsletters aren’t one big block of text. Create content that is easy to digest while delivering maximum value to your subscribers. If your readers end up wondering ‘why on earth did they waste my time sending this?’ you’ve completely missed the mark. You want readers to be glad they subscribed and feel like they received helpful information in your newsletters.

Don’t Forget Mobile

Before you hit send, view your newsletter on multiple mobile devices. An ever-increasing number of emails will be opened on a smartphone or tablet. Double-check everything from load time to image resolution.

Keep it Simple

Avoid adding extra fluff to your emails. While a company logo and business branding are essential, forgo background images that add unnecessary distractions to your content.

Send it to the Right People

Make sure you’re not wasting your money sending your newsletter to email addresses that don’t exist anymore, or worse, emails that never existed in the first place. Email Validation will prevent bad email addresses from going into your CRM as well as continue to ‘clean’ your lists as you use them.
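A minimal syntactic pre-check can stop the most obviously bogus entries at the form. It is only a sketch and no substitute for a full email validation service, which also verifies the domain and mailbox.

```python
# Cheap syntactic pre-check before an address enters the CRM.
# Deliberately simple: one "@", a non-empty local part, and a dotted
# domain. Real validation goes much further (DNS, mailbox checks, etc.).

def looks_like_email(address: str) -> bool:
    address = address.strip()
    if address.count("@") != 1:
        return False
    local, domain = address.split("@")
    return bool(local) and "." in domain and not domain.startswith(".")

print(looks_like_email("jane@example.com"))   # True
print(looks_like_email("not-an-email"))       # False
```

Even a check this crude keeps the worst junk out of your list between full validation passes.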

Give Them an Easy Out

Although it might seem counter-intuitive, always give your subscribers an easy way to unsubscribe from your newsletters. While your communications might be valuable, some subscribers don’t want an overwhelming number of newsletters in their email inbox. In some countries, there are anti-spam laws that require you to include an easy unsubscribe option.

Focusing on the quality of your email newsletters can help keep your subscribers interested. Your content needs to be focused on solving the needs of your loyal followers, but the way in which you deliver that content can have a huge impact on interaction rates. Combining helpful newsletter tools with a top-notch email strategy will ensure your business communications are warmly received. Are you ready to make a concerted effort this year to grow your business via email newsletter marketing?


How IP Validation Can Help Prevent Fraud

Have you ever been in a business with a sign that says, “We reserve the right to refuse service”? When doing business in person, merchants may be able to detect warning signs of potential fraud. Perhaps the name on a credit card is not the same as the name on the customer’s ID card. Maybe the customer appears overly nervous. Maybe the customer’s one hundred dollar bills seem too new or out of place. In order to protect themselves from fraud, these merchants may invoke that right and refuse to proceed with the transaction.

Though the warning signs of fraud are different when doing business online, you can protect your business by using IP validation.

What is IP Validation?

The DOTS IP Validation service is one of many tools to help prevent fraud. It does so by validating the IP address of online customers. IP addresses can reveal the general location of users. For example, if you use an Internet service provider (ISP) in Los Angeles, California, your IP address will indicate that you are in the Los Angeles area. This information is transmitted to websites as you use the Internet.

Most people have no reason to hide their IP addresses. In fact, most are unaware that they even have one.

How IP Validation Helps Prevent Fraud

Now, suppose you have an online customer who says that he is located in Los Angeles but is actually located in New Delhi, India — wouldn’t you want to know about this deceit?

With IP validation, you can compare the customer’s IP address with the address claimed. In the example above, you’d immediately discover a mismatch between New Delhi and Los Angeles — a sign of potential fraud. Since IP validation takes place in real time, you can immediately invoke your right to refuse service. In other words, the transaction can be halted before fraud can take place.
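The comparison just described can be sketched as follows. The lookup table is illustrative stand-in data (documentation-reserved IPs) for what a real IP geolocation service would return.

```python
# Sketch of an IP-vs-claimed-location mismatch check. The table stands
# in for a real geolocation lookup; the IPs are documentation addresses.

IP_LOCATIONS = {
    "203.0.113.7": "Los Angeles",
    "198.51.100.9": "New Delhi",
}

def location_mismatch(ip: str, claimed_city: str) -> bool:
    """True when the IP's apparent city disagrees with the claimed city."""
    apparent = IP_LOCATIONS.get(ip)
    return apparent is not None and apparent != claimed_city

print(location_mismatch("198.51.100.9", "Los Angeles"))   # True: flag for review
print(location_mismatch("203.0.113.7", "Los Angeles"))    # False: consistent
```

Because the check is a single lookup and comparison, it is fast enough to run in real time before the transaction completes.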

Ah, but fraudsters and malicious users know about IP validation, too — and they’re tricky. To escape detection, they often attempt to hide their true location from merchants by using network proxies.

A proxy is an entity that represents something else: a substitute, surrogate, or stand-in. With this definition in mind, a network proxy serves as a substitute for a user’s actual network IP address. It’s a fake.

Network proxy services are readily available around the world. While there are many legitimate reasons to use network proxies including corporate networking, access control, and security and privacy concerns, bad guys often use network proxies to obscure their locations.

Let’s revisit the user in New Delhi who claims to be in Los Angeles. He’s gotten smarter and is now hiding behind a proxy. His IP address no longer provides you with the crucial clue you need to detect the user’s actual location. Thus, IP validation won’t work — or will it?

The DOTS IP Validation service can detect when an IP address is part of a proxy network. Though the IP address and the claimed location may match, the fact that the customer is using a proxy is a red flag. It’s telling you that the user may be a fraudster or a malicious user and that caution is warranted.

While the user may or may not have a valid reason to use a proxy, wouldn’t you want to be alerted that something is awry before you do business?

The warning signs of online fraud are out there, but you need a means of discovering them. Help protect your online business from potential fraud by using IP validation. Try it out now with a free trial key.

Revealed: 4 Fresh Lead-Generation Tools for Closers

Are you looking for fresh lead-generation tools to help boost sales? Do you have a rock star sales team already, yet know your sales force could close even more contracts if they had access to better leads? There are excellent new resources being developed for business owners and marketers thanks to entrepreneurs within the lead-generation sector. Following are four you might want to investigate if skyrocketing sales and happy customers are on your must-accomplish list:

OptinMonster
OptinMonster offers a WordPress plugin for lead-gathering. You can create custom exit forms for your site to capture email subscribers for your business. Features include A/B testing capabilities, conversion tracking, and page-specific target messaging. If your business is using WordPress as its content management system, OptinMonster is definitely worth investigating.

Right Hello

Right Hello generates leads for businesses using a proprietary algorithm. Once qualified leads are generated, Right Hello helps connect you with potential customers via personalized emails and social media interactions. Your sales team doesn’t have to worry about discovering high-quality leads and can instead focus on closing sales with satisfied customers.

ProspectWise
If you want to close more deals with businesses in your local area, ProspectWise is worth investigating. ProspectWise offers data on local businesses, obtained by having paid prospectors visit them in person. In addition to collecting business cards from each establishment, these lead-generation sleuths interact with owners and staff to understand each company’s needs. The data is then passed on to ProspectWise customers capable of fulfilling those needs.


Sparta

Sparta offers a handy platform to help turn your sales staff into supercharged lead-generation dynamos. You can create custom competition challenges for your staff to gamify the lead-generation and sales-closure process. Run multiple challenges at the same time for different departments and have team members compete against rival departments.

These are just four of numerous tools being developed by entrepreneurs within the lead-generation sector. Discovering fresh resources for your sales team can help increase sales while igniting team camaraderie at the same time. Will you be considering any of the above-listed tools for your sales force?

While generating leads is important for your business, making sure they’re accurate and up-to-date is even more so. All the leads in the world won’t help you if their contact information is incorrect. Be sure to clean your newly sourced leads with Lead Validation before importing them directly into your CRM to effectively reach your customers.


Improving Canada Address Deliverability

Each year, over 140 million pieces of mail are marked as undeliverable. This happens for several reasons, including an incorrect address format, an address that no longer exists, and mail refused by the addressee, to name a few. This article seeks to clarify some of the most common questions regarding Canadian address types and proper formatting techniques to ensure the highest degree of deliverability.

Unit number formatting

Canada Post accepts three unit-number formats. The first places the unit number ahead of the civic number, separated by a hyphen, in the delivery line. The second places the unit number at the end of the delivery line, preceded by the unit type. The third, and least common, places the unit number on its own line between the addressee and the delivery line. See the examples below:
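The three placements can be sketched in code. This is an illustrative sketch only; the sample addressee, street, and postal code below are made up for demonstration:

```python
# Illustrative sketch of the three Canada Post unit-number placements
# described above. The sample address is fictional.

def format_unit_address(style: str) -> list[str]:
    """Return the address lines for unit 4 at 123 Main St, Ottawa ON."""
    addressee = "JOHN SMITH"
    last_line = "OTTAWA ON  K1A 0B1"
    if style == "hyphen":      # unit ahead of the civic number, hyphenated
        return [addressee, "4-123 MAIN ST", last_line]
    if style == "suffix":      # unit at the end, preceded by the unit type
        return [addressee, "123 MAIN ST APT 4", last_line]
    if style == "own_line":    # unit between addressee and delivery line
        return [addressee, "APT 4", "123 MAIN ST", last_line]
    raise ValueError(style)

for style in ("hyphen", "suffix", "own_line"):
    print("\n".join(format_unit_address(style)), end="\n\n")
```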

Address casing

In general, most mail-processing machines are designed to sort addresses printed in upper case. This differs from how data is typically stored in enterprise CRM systems, which usually prefer proper-cased address information for readability. The preferred format depends on the application, but for direct mail it is recommended to use all upper case.

Municipality name changes

There are instances where communities amalgamate but customers continue to use the old name. This can prevent proper delivery of mail that references an older municipality name, one which may no longer exist or may now designate an area outside the current one. Make sure the municipality name provided is the most current one for the address.

Address types

Address types that are serviced by Canada Post fall into two categories: civic addresses and postal installation addresses. A civic address consists of a street address number, street name, municipality name, and postal code. A postal installation address consists of a description of the delivery type, which may be general delivery, a lock box number, or a route service number, plus the municipality, province, and postal code. The table below provides a reference for how these address types are addressed:

Moving To A New CRM? Clean Your Data FIRST!

Are you moving to a new customer relationship management (CRM) solution? While you’ve likely included various IT costs and end-user training in your migration budget, have you considered cleaning your data before you move it into your new CRM? Cleaning your data before a CRM migration is a best practice, one that will solve data quality problems, reduce costs, and position your new CRM for a successful implementation.

The pitfalls of migrating dirty data

Bad data, whether it’s a wrong address, misspelled name, duplicate or redundant, or flat-out fraudulent, is costly — and chances are, you have a lot of it. In fact, an analysis by DataBluePrint, The Cost of Bad Data by the Numbers, estimates that anywhere from 10 to 25 percent of data is inaccurate. This can seriously impact your bottom line time and time again.

For example, let’s say you have a mailing list of 100,000 and that 20 percent of your addresses are bad. That’s 20,000 mailers that disappear into the void, costing you print and material costs, postage, and manpower. Not only that, this waste happens every time you run a direct mail campaign. Meanwhile, you may have inaccurate customer data, resulting in lost or delayed orders, unhappy customers, and bad PR.

Why fix the data problem during the data migration?

First, it’s much cheaper to fix data quality issues as part of the migration than it is to let them fester. Rather than continually sending mail to bad addresses, cleaning the data will immediately solve the problem (and many others). Data hygiene quickly pays for itself anytime you do it.

When migrating data, it makes sense to fix the data problem as PART of the migration project. This is the perfect opportunity to focus on data hygiene. Just as most people clean out their junk before moving to a new home, cleaning bad data before you move creates a fresh start.

Cleaning data during migration is much easier than doing it after your new CRM goes online. After all, your old system is familiar, making it much easier to find and resolve data quality issues. If you wait until the data is in your new CRM, you’ll probably have a much harder time because everything is new and different.

It’s generally easier to approve a data hygiene project as part of the CRM upgrade than as an after-the-fact add-on. When included as part of the upgrade, the data hygiene element is understood to be a contributing — and necessary — success factor. If you tack it on after the migration is complete, you’ll probably encounter pushback and repercussions for lacking the foresight to clean the data before the move.

Fixing bad data during data migration is easier than you may think with the right tools

As part of the data migration, you will need to export your data from your legacy system and then import it into your new CRM. However, before you import it, you have the opportunity to clean it using an external data validation tool such as Service Objects’ data verification API. Service Objects’ CASS-certified data hygiene tools quickly identify and clean bad data by comparing records against a series of massive databases containing millions of accurate, standardized contact records.

This intermediary step is well worth doing as it instantly weeds out the bad data, leaving only accurate, verified data to import into your new CRM. If you’re planning a move, don’t forget to do a deep cleaning — of your data.

Sources: The Cost of Bad Data by the Numbers

Use Name Validation to Get your Customer’s Name Right

Name Validation

A customer’s name is important in many ways: it’s one of the easiest pieces of information for someone to fake on a web form, yet it can be very tricky to detect properly. Take this example of a real piece of mail:


More and more of the processes that accept this sort of data are run by computers, and human eyes review it less and less often as the entire process, from start to finish, becomes fully automated.

Name validation can easily be overlooked as an unnecessary addition, but the ramifications of mistakes can be far-reaching. Small mistakes can be embarrassing; larger ones can give a company a serious PR black eye if they make their way onto the internet.

What’s going on behind the scenes

At Service Objects, we are always looking for ways to improve all data inputs at the point of entry, and name validation is no exception. We have millions of known first and last names from around the world and algorithms honed over years of work to weed out oddities in names. We are looking for celebrity names, vulgar words, words from a dictionary and things that just plain look like garbage or bogus. We constantly strive to improve our algorithms and take pride in identifying fake names.

In the example above it seems obvious that the name is bad, but is it safe for an automated process to say so? What about a valid name such as Martita Boobier, which contains questionable words? What about something like Letit Boobra, which doesn’t appear vulgar but doesn’t appear to be a valid name either? The goal of DOTS Name Validation is to place these names into the appropriate category, taking away the worry of an automated process placing them improperly.

Avoid adding bad names to your CRM in real-time

Consider bad names such as “Trucker Bob”, “Doctor Nick”, “Homer Simpson”, and “Felix the Cat”; entries that don’t appear to be names at all, such as “The Big Bang” or “Service Objects”; or names that appear to be complete garbage, such as “Asdf Blah”. DOTS Name Validation can properly identify many cases that might otherwise slip through the cracks without proper review.
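The real DOTS Name Validation algorithms are proprietary; the toy sketch below only illustrates the kinds of checks described above, using tiny made-up word lists as stand-ins for the service’s millions of known names:

```python
# Toy sketch of name-classification checks. The word lists are tiny
# stand-ins; a real service uses far larger lists and tuned algorithms.

KNOWN_FIRST_NAMES = {"bob", "homer", "martita", "john"}
CELEBRITY_NAMES = {("homer", "simpson")}
NON_NAME_WORDS = {"trucker", "doctor", "the", "service"}

def classify_name(first: str, last: str) -> str:
    f, l = first.lower(), last.lower()
    if (f, l) in CELEBRITY_NAMES:
        return "celebrity"
    if f in NON_NAME_WORDS or l in NON_NAME_WORDS:
        return "not-a-name"
    vowels = set("aeiouy")
    if not (vowels & set(f)) or not (vowels & set(l)):
        return "garbage"          # vowel-free keyboard mash like "Xzqw"
    if f in KNOWN_FIRST_NAMES:
        return "likely-valid"
    return "unknown"

print(classify_name("Homer", "Simpson"))   # celebrity
print(classify_name("Trucker", "Bob"))     # not-a-name
print(classify_name("Martita", "Boobier")) # likely-valid
```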

Service Objects integrations can help improve your contact data quality, help with data validation, and enhance your business operations.

Service Objects Integration Patterns

Request and reply

Usage: The request and reply pattern utilizes the HTTP protocol to transfer data to Service Objects Web Services, where the data is processed, validated, and/or appended, and then returned to the client.

Example: A custom web form which captures a user’s contact information to be inserted into a CRM or database. In this example, a developer would integrate a POST call to Service Objects on form submission to validate the user’s contact fields prior to inserting data into a CRM or database.
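A minimal sketch of this pattern follows. The endpoint URL and field names are hypothetical, and `send` is a stand-in for the actual HTTPS POST to the validation service:

```python
# Sketch of the request-and-reply pattern: validate a submitted contact
# before inserting it into the CRM. The URL and fields are illustrative.
import json
import urllib.parse

VALIDATE_URL = "https://ws.example.com/validate-contact"  # placeholder

def handle_form_submission(form: dict, send) -> bool:
    """POST the contact fields for validation; insert only if valid."""
    body = urllib.parse.urlencode(form).encode()
    reply = json.loads(send(VALIDATE_URL, body))   # request ... reply
    if reply["valid"]:
        insert_into_crm(form)                      # proceed with clean data
        return True
    return False

crm = []
def insert_into_crm(record):
    crm.append(record)

# Stub transport standing in for the real HTTPS call:
fake_send = lambda url, body: json.dumps({"valid": b"email=" in body})
handle_form_submission({"email": "a@example.com"}, fake_send)
print(len(crm))  # 1
```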

Best Practices:

Timeouts: Whether integrating Service Objects Web Services in an application or web page, implementing the correct client timeout setting is vital to the functioning of the service. If the timeout is too long, application performance or user experience can degrade. Conversely, setting the timeout too short will not give the service enough time to perform its function and return a result over the network. A general rule of thumb for client-side timeout settings is around 3 seconds for most Service Objects Web Services. However, some services may require a longer timeout due to the nature of the work being performed. We always recommend consulting with Service Objects Support for the recommended settings for the service being integrated.

Failover: Having redundancy available to our clients is vital for mission critical systems which cannot afford downtime or lack the ability to process incoming information. Service Objects offers multiple redundant datacenters to meet the demands of these systems. We encourage our clients to make use of these locations in their integration logic in the event of a slowdown or issue at one of our datacenters. Coupling failover integration with proper timeout settings provides a reliable system which ensures zero downtime in critical systems.
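Combining these two practices might look like the sketch below. The endpoint hostnames are placeholders, and `opener` is a stand-in for the actual HTTP call so the failover logic can be shown on its own:

```python
# Sketch of timeout-plus-failover logic: try the primary datacenter with
# a short timeout, then fail over to a backup endpoint. Hostnames are
# placeholders, and opener stands in for the real HTTP request.

ENDPOINTS = [
    "https://ws1.example.com/validate",   # primary datacenter
    "https://ws2.example.com/validate",   # backup datacenter
]

def call_with_failover(query: str, opener, timeout: float = 3.0) -> bytes:
    last_error = None
    for url in ENDPOINTS:
        try:
            return opener(f"{url}?{query}", timeout)
        except OSError as exc:            # timeout or connection failure
            last_error = exc              # fall through to the next endpoint
    raise last_error

# Simulate the primary timing out; the call still succeeds via the backup:
def flaky_opener(url, timeout):
    if url.startswith("https://ws1"):
        raise TimeoutError("primary timed out")
    return b"OK"

print(call_with_failover("address=123+Main+St", flaky_opener))  # b'OK'
```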

Security is an equally important consideration when transmitting user information over the internet. Service Objects strongly recommends that all calls be made using HTTPS. This ensures all client information, including the license key, is encrypted in transport.

Remote call in

Usage: The remote call in pattern utilizes the HTTP protocol for Service Objects to make requests to remote applications for available records to be processed. If available, these records are exported for processing and upserted into the remote application.

Example: A Salesforce application which flags incoming leads to be processed by Service Objects. Service Objects would act on the client’s behalf and periodically check for contacts that are available to be processed. If contacts are marked as available to process, Service Objects would bulk export contacts to process and reinsert into the system as a post processing step.

Best Practices:

Credentials: When acting on the client’s behalf to process records, we always recommend creating a user account with API level credentials which is easily identifiable. This prevents access from being inadvertently revoked by system administrators which can cause issues in application workflow.

FTP batch upload

Usage: Service Objects provides an SFTP user account to the client for batch uploads to be processed and returned. With each batch file that is processed, a report is included in the package returned to the client with metrics on the data that was processed.

Example: Client X joins into agreement with Service Objects to batch process records. Service Objects provides whitelisting to the client’s IP for access to the SFTP account. On demand, a batch file is securely transmitted via SFTP protocol to Service Objects for processing. Each data file is preprocessed to ensure field mappings are adhered to and records are ready to be processed. Once completed, the processed files are packaged along with a metrics report for Client X to retrieve.

Best Practices:

File Format: Service Objects regularly accepts multiple file formats; however, we recommend that all files delivered for processing be encoded in standard UTF-8 character encoding with Windows (CRLF) line endings.

Header Format: In the setup process, Service Objects receives a sample file for processing from an FTP client. We recommend providing a header line along with the data to be processed, for simpler identification of the fields to be processed. It is not uncommon to receive input files with more than 50 fields, which makes identifying input fields difficult without a header line. We also recommend keeping the input fields easy to locate: if the input file contains fields other than the input fields, group the input fields together toward the left side of the file if possible.

Column Format: Service Objects recommends that columns be quote-enclosed to ensure that all files are valid CSV. In some cases, address information can break CSV files, which halts processing until the affected columns are fixed.
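Using Python’s standard csv module, quote-enclosing every column (with the recommended Windows line endings) looks like this:

```python
# Quote-enclosing every column keeps address fields containing commas
# from breaking the CSV file, as recommended above.
import csv
import io

rows = [
    ["name", "address", "city"],
    ["John Doe", "123 Main St, Unit 21", "New York City"],
]

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_ALL, lineterminator="\r\n")
writer.writerows(rows)       # Windows (CRLF) line endings, as recommended
print(buf.getvalue())
# "name","address","city"
# "John Doe","123 Main St, Unit 21","New York City"
```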

How FCC’s Ruling Affects Your Call-Centric Business

Did you know that text messages are now considered phone calls, at least according to the FCC’s ruling on the Telephone Consumer Protection Act (TCPA)? Earlier this year, the FCC issued a ruling clarifying several points of contention surrounding automated calls. The new rules expand the definition of “calls” to include text messages and restrict the use of autodialers. They also provide greater protections for consumers and allow telecommunications providers to use robocall blocking technologies. In all, 21 petitions were addressed by the FCC, making this ruling one of the biggest changes to the TCPA since the creation of the do-not-call list.

A Brief History of the TCPA

The TCPA dates back to 1991 when it was first passed to amend the Communications Act of 1934. The TCPA restricts telemarketing along with the use of autodialers, prerecorded voice or artificial messages, and fax machines. The Do Not Call Registry Act of 2003 and the FTC’s Telemarketing Sales Rule are closely related.

In 2013, the FCC updated the TCPA to restrict SMS text messages, require an automatic opt-out mechanism for robocalls, require prior written consent for wireless calls, and eliminate the “established business relationship” exemption among other changes.

According to the FCC’s press release on the new ruling, “complaints related to unwanted calls are the largest category of complaints received by the Commission, numbering more than 215,000 in 2014.”

The FCC’s Recent Updates to the TCPA

The new updates attempt to address these complaints, and they’ll likely prove challenging to call-centric businesses. The nearly two dozen changes include updates that:

  • Allow telecommunications service providers to use robocall blocking technologies to help stop unwanted robocalls
  • Allow consumers to revoke their consent at any time, in any reasonable manner
  • Clarify that text messages are considered calls and subject to the same rules
  • Prevent a consumer from “inheriting” a previous subscriber’s consent to receive calls
  • Clarify what an automatic telephone dialing system is, and that human intervention (like clicking a button) is not sufficient to overcome a system’s status as an autodialer
  • Clarify that Internet-to-phone text messaging technologies are considered automatic telephone dialing systems
  • Address reassigned numbers, requiring companies to stop calling a reassigned number after one call
  • Address third party consent, clarifying that people in another person’s contact list have not given consent to receive robocalls from applications downloaded by that person

How the FCC’s New Ruling Affects Your Call-Centric Business

If your business uses automated dialing systems to call or text consumers, you must comply with the TCPA and this new ruling or face hefty penalties. At a minimum, you will need to be prepared to:

  • Promptly remove consumers who’ve revoked their consent from your list.
  • Treat text messages as if they were phone calls and ensure that your texting practices comply with the TCPA.
  • Avoid soliciting your customer’s acquaintances using autodialers. You can still ask for referrals and make personal phone calls, but don’t use an autodialer and definitely do not continue calling if the referral asks you to stop.
  • Identify recently reassigned numbers and remove them from your list promptly. If your customer has moved and changed his or her phone number, using phone validation software can help prevent calls to the new subscriber.

These recent changes are widely considered good for consumers but challenging for businesses who use autodialers. Learn more about the new FCC ruling at the FCC’s website.

Why ‘Address Line 2’ Should Never Be Offered In Address Forms

You see address line 2 all the time. Your own web forms probably even have a field for it. However, did you know that address line 2 doesn’t really exist — at least in the U.S. Postal Service’s eyes? Not only does the USPS not require an address line 2, it doesn’t even acknowledge its existence.

USPS addressing standards

According to the USPS’s postal addressing standards, a complete address consists of just three lines:

Recipient Line
Delivery Address Line
Last Line

An example of a complete address using the three-line standard is:

John Doe
123 Main Street, Unit 21
New York City, NY 10001

Note that placing “Unit 21” on its own line, commonly referred to as “address line 2,” would result in a non-standardized address. While a human should be able to figure out that John Doe lives or works in unit 21, automated processing systems could have trouble.


Though address line 2 does not technically exist, the USPS does allow for additional information in a secondary address line (such as “deliver to dock 23”). However, that information should be considered more like a comment area; it should not contain any deliverable address information. Our address validation software does scan address line 2 for this type of information, but there’s no guarantee the software will know what to do with it.

Suites and apartment numbers should be placed at the end of address line 1 while recipient details like name and company should go above the address.

What’s wrong with including an address line 2 field on your online forms?

Businesses commonly include an address line 2 field on their online forms, inviting end users to split address information as they see fit. When presented with two address lines, it’s only natural for users to separate floor, suite, and unit numbers into two separate lines. Some users will use address line 2 to add additional information such as “ATTN: John” or “Cross street: 2nd Avenue.”

In short, too much information can end up mixed into address line 2, making it difficult and inconsistent to parse out what matters. For example, if the recipient’s name is mixed into address line 2 along with an apartment number or letter, it may not be clear to the address validation system what the address is intended to be: the name belongs on the first line (above the address) and the apartment number belongs in the delivery address line itself. Situations like this can often be fixed with address validation software, but the likelihood of a perfect address match is reduced because there are so many ways address line 2 can be filled in.

Another issue with presenting an address line 2 for end users to complete is it invites them to mistakenly enter an alternative address line 1 (for example, their home and work addresses if both are in the same city). If both address lines 1 and 2 contain complete, proper addresses, the address validation system cannot determine the originally intended destination.

As the saying goes, garbage in, garbage out. The closer to USPS standards you can get initially, the more likely it is for an address to be cleanly validated, and the more likely it is for your mail to arrive at its proper destination. Even though our software is constantly updated and improved to handle and fix improperly structured addresses, it’s always best to strive for clean input data when possible.

Should you eliminate address line 2 from your online forms?

If you want to invite garbage in, by all means keep asking for an address line 2. If you’d rather cut the confusion and get cleaner data from the start, stop using address line 2. USPS doesn’t require it — and doesn’t necessarily know what to do with it.

Some end users don’t know that they need to enter apartment or suite numbers to the main address line 1. You can help make address input more obvious to end users by adding an optional field to the web form labeled “unit number.” You could then append the unit number to address line 1. End result: less confusion, more consistent address validation, and better deliveries.

Why Credit Card Fraud May Double In The Next Few Years, And How To Protect Your Business

The United States is finally catching up to its European neighbors in adopting the more secure EMV chip credit and debit cards. EMV stands for Europay, MasterCard, and Visa; these cards are considered less vulnerable to credit card fraud than magnetic stripe cards because their embedded, encrypted chips cannot be copied by counterfeiters. Despite the ongoing shift to EMV chip cards, projections point to credit card fraud doubling from about $3.1 billion to over $6.4 billion in 2018.

How is this possible? Fraudsters will have fewer opportunities to clone cards with illicitly placed magnetic readers, so counterfeit cards will likely become less prevalent. But as bankers, merchants, and consumers shift to EMV chip cards, fraudsters will shift to other devious means of committing credit card fraud. One area ripe for abuse is “card not present” (CNP) fraud.

CNP fraud takes place when someone uses another person’s credit card to pay for a transaction without physically presenting the card to the merchant, for example when placing an order over the phone or online. It doesn’t matter whether the card has a secure chip if the card is never scanned and verified; all the fraudster needs is the credit card number, billing ZIP code, and CVV (Card Verification Value).

CNP fraud is expected to offset the gains made by EMV chip cards, nearly dollar for dollar. This happened when Canada adopted EMV cards a few years ago. According to the Canadian Bankers Association, counterfeit card fraud went down from $245.4 million (CAD) in 2008 to $111.5 million in 2013. At the same time, CNP fraud increased from $128.4 million in 2008 to $299.4 million in 2013.

How to prevent CNP credit card fraud

An article on the subject offered a three-pronged approach to preventing CNP credit card fraud:

  • Tokenization, which makes stolen data useless 
  • Behavioral analytics, which helps in making authorization decisions 
  • “3-D secure,” which adds another layer of security such as a PIN or password 

Take CNP fraud prevention one step further by using order validation tools to verify customer information and detect signs of potential fraud. For example, DOTS Order Validation will produce an overall quality score based on inputs such as customer and credit card details.

Order Validation cross-validates data against seven distinct contact validation services such as BIN Validation, Address Validation, and Phone Exchange. Mismatches and risk factors (such as if the cardholder’s name is bogus, the shipping and billing addresses are different, or the card is a gift card or from a country of high risk) are immediately flagged before the transaction reaches the payment gateway. Using the quality score and warnings from our Order Validation API, you’ll be better able to detect potential CNP fraud before it happens.

Tips To Consider When Calling A Web Service For Large Batch Jobs

Web services are great tools for completing one or more tasks that you would otherwise not have the resources to complete on your own. Web services can be relatively quick as well, returning a response in tenths of a second. While a tenth of a second may seem fast, web services in general are regarded as slow processes and are not commonly considered when performing large batch jobs. This is because a batch may consist of millions of records, and if each web request took a tenth of a second to complete then it would take approximately 28 hours to complete one million requests. Every millisecond counts when performing large batches, so it is not uncommon for web services to be regarded as bottlenecks. However, with the right preparation and integration, making a call to a web service is almost no different than making a call to a local database. You may actually be surprised to discover that potential bottlenecks exist in areas that you may not have previously thought of.

Network connection

Web services rely on an internet connection and they normally communicate via HTTP or HTTPS on port 80 and port 443, respectively. Large batch processes typically run on designated servers with elevated security, and it is not uncommon for a network admin to lock down a server so that it cannot access the internet. Be careful not to let your admin block internet communication on the machine that will be calling the web service and ensure that the network is in good condition.

DNS lookup

Depending on your platform and how your environment is configured, your application may be performing a DNS lookup for every web request, regardless of what the DNS time-to-live (TTL) is set to. Depending on how your local DNS resolver is configured, the average DNS lookup should take between 10 and 50 milliseconds. There is no need to perform a DNS lookup for every single request. Instead, take advantage of the DNS cache and TTL so that a lookup is only performed after the TTL has expired.
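A toy TTL-honoring cache illustrating this idea is sketched below; `resolve` is a stand-in for the actual resolver call (for example, `socket.getaddrinfo`):

```python
# Toy TTL-honoring DNS cache: the (expensive) lookup runs only after
# the cached entry expires. resolve() stands in for the real resolver.
import time

class DnsCache:
    def __init__(self, resolve, ttl_seconds: float):
        self._resolve = resolve
        self._ttl = ttl_seconds
        self._cache = {}          # host -> (address, expiry time)

    def lookup(self, host: str) -> str:
        entry = self._cache.get(host)
        if entry and entry[1] > time.monotonic():
            return entry[0]                   # still fresh: no real lookup
        addr = self._resolve(host)            # TTL expired (or first call)
        self._cache[host] = (addr, time.monotonic() + self._ttl)
        return addr

lookups = []
def fake_resolve(host):
    lookups.append(host)
    return "203.0.113.10"

cache = DnsCache(fake_resolve, ttl_seconds=60)
for _ in range(1000):
    cache.lookup("ws.example.com")
print(len(lookups))   # 1 -- only one real lookup for 1000 requests
```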

Connection leaks

Most platforms have connection pools available to quickly perform requests. Keep in mind, though, that the number of connections in a pool can be exhausted, so always remember to properly close and dispose of your connections when you are finished using them. If a connection is not closed, it will remain open and unusable until it returns to the connection pool. This can take up to 30 seconds per connection depending on your framework, and your application will be forced to wait until a connection becomes available, or worse yet, it may crash. The problem with connection leaks is that they do not occur right away. Instead, they commonly pop up unexpectedly after your batch job has been running for a long time. This can mean that all of the work performed up to the point of the error is lost, leaving you empty-handed and possibly behind schedule. Connection leaks are not limited to web service calls, either. They can occur in all types of connections, such as database calls. So be sure to check your entire application for potential connection leaks, not just the web service calls.
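In Python, for example, a context manager guarantees the connection is closed even if a record partway through the batch raises an exception. Here sqlite3 stands in for any pooled connection:

```python
# Context managers guarantee the connection is closed (returned) even
# when processing raises partway through, avoiding the slow leak
# described above. sqlite3 stands in for any pooled resource.
import contextlib
import sqlite3

def process_batch(records):
    # closing() ensures conn.close() runs even if processing raises.
    with contextlib.closing(sqlite3.connect(":memory:")) as conn:
        conn.execute("CREATE TABLE results (value TEXT)")
        for rec in records:
            conn.execute("INSERT INTO results VALUES (?)", (rec,))
        conn.commit()
        return conn.execute("SELECT COUNT(*) FROM results").fetchone()[0]

print(process_batch(["a", "b", "c"]))  # 3
```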

Database design

Many batches are performed against large datasets. Make sure your tables are designed and indexed to support fast read and insert times. In general, inserts are faster than updates with where clauses, so try to avoid update commands if possible. You can’t really blame a web service for being slow if your local database queries are slower than the web service call. When processing a large record set it is sometimes faster to load parts of the data into memory, call the web service as you iterate through the in-memory record set, store the results in memory and then when done iterating, insert the results in bulk to your destination table.
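The in-memory pattern described above might be sketched like this, with sqlite3 standing in for the local database and `call_service` stubbed in place of the real web service:

```python
# Sketch of the pattern above: read a chunk into memory, call the
# (stubbed) web service per record, then insert the results in one
# bulk executemany instead of row-by-row updates.
import sqlite3

def call_service(value: str) -> str:
    return value.upper()          # stand-in for the real web service call

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source (value TEXT)")
conn.execute("CREATE TABLE results (value TEXT, validated TEXT)")
conn.executemany("INSERT INTO source VALUES (?)",
                 [("alpha",), ("beta",), ("gamma",)])

chunk = [row[0] for row in conn.execute("SELECT value FROM source")]
results = [(v, call_service(v)) for v in chunk]   # accumulate in memory
conn.executemany("INSERT INTO results VALUES (?, ?)", results)  # bulk insert
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM results").fetchone()[0])  # 3
```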

Simultaneous requests

Web services accept simultaneous requests, so instead of processing one record at a time, you can instead process multiple simultaneous connections to complete a batch in a fraction of the time. Most platforms will allow you to perform asynchronous requests, which can be used to process multiple web requests at the same time. Asynchronous requests will use a thread other than the main program thread to perform the web request, and therefore will consume additional resources. In general, most applications run on a single worker process. Depending on the platform, a single worker process may have between 12 to 24 threads at its disposal, with the number of threads being configurable or dynamically managed. The number of threads available to a worker process is also determined by the number of cores available on the CPU. It is important to not spawn too many asynchronous requests as doing so can lead to performance degradation in your application, local machine and other areas such as your network connection. It is best to test your application with a small number of simultaneous requests first and then work your way up. Take the time to evaluate the performance of your application and the health of your machine when performing each test. In general, most batch jobs can be quickly processed by 10, 20 or even 40 simultaneous requests. Larger batches may require 100+ simultaneous requests, but be aware that doing so could expose/introduce deficiencies in other areas of your program, local machine and/or database.
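For example, a thread pool capped at a small number of workers can process a batch concurrently. `call_service` below simulates a web request with a short sleep:

```python
# Sketch of bounded simultaneous requests: a thread pool capped at 10
# workers processes the batch concurrently. call_service() stands in
# for the real web request.
from concurrent.futures import ThreadPoolExecutor
import time

def call_service(record: str) -> str:
    time.sleep(0.01)              # simulate network latency
    return record.upper()

records = [f"record-{i}" for i in range(100)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(call_service, records))  # order preserved
elapsed = time.perf_counter() - start

# With 10 workers, the ~1 second of total latency overlaps into roughly
# a tenth of the sequential wall time.
print(f"{len(results)} records in {elapsed:.2f}s")
```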

Good application design can require careful consideration in many areas that are sometimes not commonly apparent. The contents of the short list above were aimed towards web services, but the basic list can easily apply to any scenario where a call to a method that resides outside of memory is necessary.

5 Tips To Save Time And Money On Shipping

If you’re like most business owners, finding ways to save time and money is one of your top priorities. After all, a penny saved is a penny earned. One area you might not have paid too much attention to is shipping — and it’s prime for cutting costs and hassles. Use the tips below to save both time and money on shipping:

1. Avoid Guesstimating Your Shipping Costs

Chances are, you’ll overpay as you err on the side of caution. For example, your package may weigh less than a pound, yet you might estimate its weight at two or three pounds “just to be safe.” The same is true of a package’s dimensions. Overestimating a package’s weight or size could put you in a higher price range than necessary. Meanwhile, underestimating the package’s size or weight could result in time delays as the shipper returns your package for insufficient postage. This is a risky choice, as shipping delays translate into unhappy customers (which is why most businesses overestimate their package sizes and shipping costs). Avoid overpaying by using a postage scale and a ruler to accurately weigh and measure your packages.

2. Do Your Research And Choose The Right Service

While one shipping company may have lower rates on smaller parcels, it may be the costlier choice for larger ones. It pays to compare prices. While you’re at it, choose the right type of service for the package. Is overnight service essential or would ground service be acceptable? Would an alternative shipping method, such as Greyhound Package Express or DHL Express, cost less?

3. Increase Sales By Offering Convenient Returns

Though offering to pay for return shipping may seem counterintuitive when you’re trying to cut your shipping costs, you may want to consider adopting such a policy. According to a Forrester Consulting study conducted for UPS, retailers that offer convenient, inexpensive returns are likely to see an increase in sales, customer loyalty, and incremental revenue. 

4. Use Address Validation Software

Are you shipping your packages to the correct address? Address validation can alert you to a potential shipping problem, allowing you to correct the issue before you ship the package and find out the hard, expensive, and time-consuming way. By validating addresses before you ship your packages, you’ll have fewer packages returned to you as undeliverable, fewer upset customers wondering where their packages are, and lower shipping costs as a result.

5. Shop Around For Shipping Supplies And Buy In Bulk

Are you still buying your mailroom supplies at the local stationery or office supply store? Though convenient, you’re probably paying too much. Again, this comes down to doing your research and shopping around. Buying in bulk also reduces the overall cost of your shipping supplies. The savings could be substantial.

The Problem With Bad Address Validation

Street, avenue, boulevard, and court are but a few of the many suffixes used in addresses. Add in Spanish or French variations like corte or rue and the list gets even longer. If you use the wrong suffix, such as Elm Street instead of Elm Avenue, your package may not arrive. While businesses use address verification services to avoid this problem, sometimes bad address validation backfires and changes a correct address to an incorrect one, costing businesses time and money.

Recently, we received an email from a client needing advice about fixing a problem with her address. She said that when ordering packages on several occasions, the USPS had changed her address from what should have been a ‘Heights’ suffix to a ‘Road’ suffix. As a result, the Post Office deemed the address undeliverable because the address with the ‘Road’ suffix didn’t exist — and it returned all of her mail to the sender. 

It didn’t matter that she had entered the address correctly when ordering items online; the address would be changed to “Road” time and time again. She asked us how to fix the problem so she could properly receive packages in the future, wondering if she should call USPS or every company she orders from.

We ran her address through our address verification service and found that it would return the correct “Heights” suffix on the address. Therefore, the USPS has her correct address and is not the root of the problem. It turns out that other address verification services were changing “Heights” to “Road” when validating her address upon checkout. 

This caused the customer a great deal of inconvenience. Incorrect data in the address validation database also resulted in lost shipping costs on the business side and an erosion of trust. Businesses that repeatedly ship a package to the wrong address, despite repeated corrections, aren’t likely to earn that customer’s referrals or ongoing business.

Our data and expert algorithms found the correct address, which is exactly what helped in this case. Not only is address validation necessary, but having the correct validation service, one that will both validate the address and confirm that it truly exists, will save both businesses and customers time and money.

DOTS Order Validation Is Fine Tuned

We listened to our clients’ feedback and fine-tuned our DOTS Order Validation service to better meet their needs.

Our DOTS Order Validation service performs multi-function verifications including address validation, BIN validation, reverse phone lookup, email validation, and IP validation. Our proprietary normalizing algorithm inspects the various customer components and returns a 0-100 quality score on the overall validity and authenticity of the customer. 
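To illustrate the idea of a composite 0-100 quality score, here is a toy Python sketch; the component weights and pass/fail logic are invented for illustration and are not Service Objects’ proprietary algorithm.

```python
# Invented weights for illustration; each component check contributes
# to a composite 0-100 score when it passes.
WEIGHTS = {"address": 30, "bin": 20, "phone": 15, "email": 15, "ip": 20}

def order_quality_score(checks):
    """checks maps component name -> True (passed) / False (failed)."""
    return sum(weight for name, weight in WEIGHTS.items() if checks.get(name))

score = order_quality_score(
    {"address": True, "bin": True, "phone": False, "email": True, "ip": True}
)
print(score)  # → 85
```

A real scoring algorithm would weigh partial matches and missing data rather than simple pass/fail, but the principle of normalizing many signals into one reviewable number is the same.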

With this fine-tuning, quality scores in many of the test categories were rebalanced so that they are no longer as strictly penalized. Changes were also made so that the service can handle a wider variety of cases where data may or may not be available.

While the service still errs on the side of caution and will remain strict, it is no longer as heavy-handed with its penalties. Clients will still encounter orders that require review, as it is part of the service’s duty to flag suspicious-looking orders, but they will also see many more valid orders pass. This means fewer orders will need to be manually reviewed, consuming less time.

Click here to learn more about DOTS Order Validation.

Stop the Hate: How Customers Can Learn to Love Telemarketers

It’s a joke everyone knows: people hate telemarketers. But when telemarketing is part of your business plan, you need to do it right. The last thing your business needs is annoyed customers. Use the tips below to help your customers appreciate your calls.

1. Set expectations

Obviously, warm calling is easier than cold calling because you already have an established relationship, but your existing customers may be caught off guard by your calls. One way to overcome this is to let them know what to expect when you ask for their phone numbers. For example, if you regularly call customers a few days before you have technicians visiting their neighborhoods, let them know you’d like to call them when you’ll be in the area as a courtesy. If you can frame it as a benefit, even better. Likewise, when collecting information from prospects, set expectations such as “a representative will call within one business day with a quote.”

2. Verify phone numbers

Use a phone validation service to ensure that you have the correct phone number for your customers and prospects. Even if you’ve only recently collected the phone number, validation is important to weed out bogus entries, inadvertent typos, and out-of-service numbers. If you’re working off a list for which you have not obtained written permission to call the consumer (a must when calling mobile phones), phone verification can identify whether a phone number is wireless or not. Verifying phone numbers helps you avoid calling the wrong people, wasting time, and giving consumers yet another reason to hate telemarketers.

3. Provide value when you call

You know what’s in it for your business when you launch a telemarketing campaign, but what’s in it for your customers? Are you just playing a numbers game or is there real value in your outreach? What can your customers get from being on your telemarketing list that they can’t get elsewhere? How can you make them look forward to your next phone call? This goes back to point number one, setting expectations. Remind them why you’re calling (for example, “I’m calling because you asked us to call to offer early access and exclusive discounts to our clearance sale”).

Setting expectations, verifying phone numbers and providing value when you call are three essential steps you can take to stop the hate.

Amex Express Checkout: Why ECommerce Companies Should Still Check For Fraud

Amex Express Checkout is a new checkout option that went live this month, and unlike other payment methods it is not an e-wallet in the conventional sense. Instead of creating and managing payment accounts for one or more payment methods, the Amex system leverages a user’s existing account information: when a user is ready to check out, all they need to do is enter their Amex username and password, the same credentials they would use to log directly into their online Amex account. That’s because they actually are logging into it.

Like the other payment methods, the Amex system is focused on security and ease of use. The idea being that the card holder never has to enter their credit card information, instead the data is securely sent directly to the online store from American Express and so both the online merchant and the card holder can rest assured that the provided information is current. With other e-wallet systems, it is up to the card holder to update their card information if a card expires or a new card number is issued and so on.

There is Still a Potential for Fraud, Even With Fast Checkout Options

One would think that with so many secure payment options available today, an order validation service in your checkout system would be unnecessary, but that’s not the case. These payment systems may be designed to be secure and easy to use, but there is still a potential for fraud. These checkout systems primarily fill out the billing details and have nothing to do with the shipping details or other aspects of the order. A service like DOTS Order Validation can look at the individual details as well as the big picture to help prevent fraud. Also, not all customers are going to use these sorts of checkout systems, so an online store will almost always have an option available for a user to enter their own checkout information. Otherwise, the store risks losing a potentially large customer base.

If a fraudster has an opportunity to circumvent additional security, they are going to take it, and they will most likely not choose these checkout systems. That means a service like OV will play an even larger part in helping fight fraud. Overall, e-commerce shopping cart programmers would do well to make use of these e-wallet systems along with a service like DOTS Order Validation. A service like OV is not in competition with these types of systems, and it will still have a place in helping prevent fraud.

The Definition of ‘Garbage’ in Email, and How to Get Rid of It

We all know garbage when we see it — or do we? In the case of Service Objects’ email validation, we have our own definition of garbage and we actively seek it out. We want to find garbage in email and warn you about it so that you can make the most informed decision possible about accepting or rejecting it.

How do we define garbage?

The dictionary defines garbage as follows:

  • Wasted or spoiled food and other refuse, as from a kitchen or household.
  • A thing that is considered worthless or meaningless: a store full of overpriced garbage.
  • (Computing) Unwanted data in a computer’s memory.

Okay, we’re off to a good start. It’s safe to say that garbage is essentially: trash, worthless or meaningless items, or unwanted data. When it comes to email garbage, our email validation service flags email addresses that contain what we consider garbage. Our developer guide currently explains that our garbage flag:

“Indicates if the email address is believed to contain ‘garbage-like’ keyboard strokes and/or ‘garbage-like’ characters.” 

Garbage-like keyboard strokes might include pure gibberish or typos gone to the extreme. Garbage-like characters could include symbols and characters not typically included in email addresses. For example, hyphens are commonly used in URLs, so an email address with a hyphenated domain wouldn’t necessarily qualify as garbage. On the other hand, punctuation marks like exclamation points and commas are not used in URLs, so an email address such as joe@example, or joe@example! would probably be considered garbage.

It’s not just about special characters


There’s a lot going on behind the scenes, though. Many bots pair valid, deliverable domains with randomly generated email addresses. Though theoretically these addresses could be real, most often they look like complete gibberish or garbage.

Our DOTS Email Validation service checks for known names and dictionary words, various keystroke patterns, vowel and consonant patterns, and special characters that are syntactically valid but often rejected by most mail servers.
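As a rough illustration of these kinds of checks, the following Python sketch flags a few garbage signals; the specific patterns and thresholds are invented and are far simpler than the service’s actual algorithm.

```python
import re

KEYBOARD_WALKS = ("qwerty", "asdf", "zxcv")

def looks_like_garbage(email):
    """Toy garbage heuristic: invalid characters, long consonant runs,
    and keyboard-walk fragments. Illustrative only."""
    local, _, domain = email.partition("@")
    if not domain:
        return True
    # Punctuation like '!' or ',' and whitespace almost never appear
    # in real email addresses.
    if re.search(r"[!,\s]", email):
        return True
    # A long run of consonants suggests random keyboard mashing.
    if re.search(r"[bcdfghjklmnpqrstvwxz]{6,}", local.lower()):
        return True
    # Common keyboard-walk fragments like "qwerty" or "asdf".
    if any(walk in local.lower() for walk in KEYBOARD_WALKS):
        return True
    return False

print(looks_like_garbage("joe@example!"))            # → True
print(looks_like_garbage("jane.doe@my-company.com")) # → False
```

Simple rules like these catch the obvious cases; the harder part, as described above, is distinguishing plausible-looking randomness from real names, which is where dictionaries of known names and statistical letter patterns come in.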

Garbage detection is important. Not only is sending messages to bogus email accounts a waste of time, but it could also get you flagged by ISPs. If you have a lot of garbage emails in your database, you’ll also have a high bounce rate. ISPs may think you’re a spammer when you’re really a victim of bots that randomly generate email addresses on web forms.

Just because an email may be considered deliverable does not mean it is good. That’s why we run multiple integrity checks, including garbage email checks, as part of our email validation service. If an email is flagged as garbage, you probably shouldn’t accept it.

How to Recover from an Incorrect Address

It happens all the time: customers accidentally provide incorrect addresses. Sometimes autocorrect on their phones or computers is to blame. Other times, it’s an honest mistake. Whatever the reason, most customers who enter incorrect addresses won’t realize their mistake. When their order doesn’t arrive as expected, they’re not going to be happy about it. Right or wrong, you’ll bear the brunt of the blame for delivery delays.

The best way to recover from an incorrect address is to catch and correct it before shipping the item or delivering a service. The two main ways to do this are:

  • Calling the customer to confirm the address
  • Using address validation software

Calling the customer

Having your shipping clerk or delivery driver call the customer to confirm the address is good old-fashioned service. However, it takes time, requires the customer to actually answer the phone or call you back, and is sometimes not done for various reasons.

Using address validation software

Address validation software is a more efficient, completely automatic way to verify and correct address information. Depending on the address validation API service you use, you can even get exact latitude and longitude coordinates to plug into your mapping software. You can also find out if an address is a business or residence, if it’s vacant, if it’s a general delivery address, and more.
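For a sense of what such an API’s output might look like, here is a hedged Python sketch; the response fields below (`dpv`, `residential`, and so on) are hypothetical and not any provider’s actual schema.

```python
import json

# Hypothetical response body from an address validation API; real field
# names and values vary by provider.
sample_response = json.dumps({
    "address": "123 Main St",
    "city": "Anytown", "state": "CA", "zip": "90210",
    "latitude": 34.0901, "longitude": -118.4065,
    "dpv": "deliverable", "residential": False, "vacant": False,
})

def parse_validation(raw):
    """Pull the useful facts out of a (hypothetical) validation response."""
    data = json.loads(raw)
    return {
        "deliverable": data["dpv"] == "deliverable",
        "coords": (data["latitude"], data["longitude"]),
        "is_business": not data["residential"],
        "vacant": data["vacant"],
    }

result = parse_validation(sample_response)
print(result["deliverable"], result["coords"])  # → True (34.0901, -118.4065)
```

In a real integration, the raw JSON would come back from an HTTP request to the provider’s endpoint rather than a hard-coded sample.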

Address validation software verifies and corrects addresses in real time, catching errors instantly. By using an address verification API, you can prevent delivery problems due to incorrect address information.

Getting back in your customer’s good graces after an incorrect address

What if you didn’t call the customer to confirm the address or use address validation software and the provided address was incorrect? You’re going to have an unhappy customer, and you likely won’t find out about an incorrect address until it’s too late. Use the tips below to get back into your customer’s good graces:

  • Immediately correct the address in your system so the error doesn’t happen again.
  • Accept responsibility, even if it’s not your fault. After all, address verification technology exists, and it’s readily available and affordable.
  • Make it up to your customer. Start with a genuine apology, and then think of a way to make it up to the customer. Can you waive the shipping costs, ship the replacement package overnight rather than ground, offer a discount on a future order, or include freebies in the replacement package you send? If you’re not sure what to do to make it up to the customer, ask. Do whatever it takes to turn the situation around. Remember, it’s much cheaper to keep an existing customer than it is to get a new one.
  • Finally, start using address verification software so you won’t be in this position again. Not only will it prevent problems, it will help to improve productivity and customer satisfaction while also reducing costs associated with deliveries to incorrect addresses.

Telemarketing and the Benefits of Telephone Data

As you likely know, telemarketing is subject to numerous laws and regulations, some of which can cost a company a lot of money if they don’t comply. For instance, the largest Do Not Call settlement to date resulted in a $7.5 million civil penalty payment. The Telephone Consumer Protection Act (TCPA) is the primary legislation impacting telemarketers. This act established the national Do Not Call registry in 2003 and has since been updated to address robocalls and mobile phones.

Since the updates, it’s not just the Do Not Call list that telemarketers need to be concerned with; telemarketers also need to steer clear of calling wireless telephones unless they have express written permission. Fortunately for telemarketers, data products exist that can help them comply with these regulations. Below are a few of the benefits of telephone data for telemarketers.

National Do Not Call Registry Compliance 

The FTC makes Do Not Call registry telephone data available to telemarketers at its website. You must register with the national registry in order to view this data and import it into your dialing software. You must update your Do Not Call database every 31 days. Third party data providers also offer telephone data verification services to ensure that your database is kept current with the National Do Not Call registry as well as state registries and even the Direct Marketing Association’s Mail Preference Service Listing.

The Importance of Mobile Telephone Data Verification  

While compliance with all applicable Do Not Call registries is an absolute must, these Do Not Call lists don’t necessarily address calling wireless phones, unless the wireless subscriber placed the wireless number on the appropriate lists.

With the Do Not Call list, people expressly opt out of being called by telemarketers. With wireless phones, they do not need to opt out. Rather, the telemarketer must have prior written consent before making robocalls or autodialed telemarketing calls to mobile phones. This FCC rule went into effect in late 2013. The tricky part for telemarketers is figuring out whether a phone number that’s not on the Do Not Call list is fair game or not. If it’s a mobile number, telemarketers need written consent.
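The decision logic described above can be sketched as follows; this is an illustrative simplification, not legal advice, and the function and parameter names are invented.

```python
def may_autodial(number, *, on_dnc_list, line_type, has_written_consent):
    """Simplified telemarketing compliance check (illustrative only).

    Rules modeled: numbers on a Do Not Call list require consent, and
    autodialed/robocall contact with wireless numbers requires prior
    written consent regardless of DNC status.
    """
    if on_dnc_list and not has_written_consent:
        return False
    if line_type == "wireless" and not has_written_consent:
        return False
    return True

print(may_autodial("805-555-0100", on_dnc_list=False,
                   line_type="wireless", has_written_consent=False))  # → False
```

Note that the `line_type` input is exactly what a phone lookup service supplies: without it, you cannot tell whether a number that is absent from the Do Not Call list is safe to dial.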

So, how do you figure out if a phone number in your database is a landline or a cell phone? With telephone data from Service Objects. Not only can our various telephone data products validate phone numbers, they can also help telemarketers see whether a phone is a cell phone or a landline. 

Use Service Objects’ phone validation APIs in conjunction with National and State Do Not Call lists to ensure that your database contains only those phone numbers that you can call without running afoul of telemarketing regulations.

Keeping Up with Ever-changing Sales Tax Rates

In most cases, selling physical products in the United States also means calculating, collecting, reporting, and paying sales tax. If you have multiple stores, operate a delivery service, or ship products to customers, this can quickly become complex due to sales tax requirements at the state, city, special district, and county level. Not only that, frequent tax rate changes at any of these levels can affect how much you must collect.

Basic sales tax collection

Let’s say you run a general store in a small community and that you do not ship items. In this simple case, you’d have to collect sales tax for the state, county, and local municipality. If your store is located in a special city or county tax district, you may have to collect tax for that district as well. You’d also need to keep track of the sales tax collected for each entity and report and pay those taxes on a monthly or quarterly basis. So, that’s three to four agencies, each with its own territory, tax rate, payment schedule, and reporting requirements. In addition to collecting, reporting, and paying sales tax to each applicable tax authority, you’d also need to pay attention to tax rate changes and adjust your collections accordingly.
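To make the arithmetic concrete, here is a small Python example with invented rates; the point is that the combined rate a customer pays is the sum of several component rates, each of which must still be tracked and reported to its own agency.

```python
# Hypothetical component rates for illustration only; real rates come
# from each taxing authority and change over time.
rates = {"state": 0.0600, "county": 0.0100, "city": 0.0050, "district": 0.0025}

def sales_tax(amount, components):
    """Combined tax on a sale: the sum of all component rates."""
    combined = sum(components.values())
    return round(amount * combined, 2)

print(sales_tax(100.00, rates))  # → 7.75, split across four separate agencies
```

A single rate change by any one of those agencies changes the combined figure, which is why rate maintenance becomes a recurring chore.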

This is sales tax collection at its most basic, and it’s confusing enough as it is.

How sales tax collection becomes complicated

As you broaden your service area, sales tax management becomes even more complicated. For example, let’s say that you start selling products on the farmer’s market circuit. Each farmer’s market is in a different city, each with its own local sales tax district. Though the county and state may be the same, you now have new local sales tax districts to interact with.

It gets even more complicated when you start shipping products to customers. If you have a presence out of state, such as a warehouse or even a temporary presence such as a trade show, you’ll have to deal with state and local taxes in that state, too. The more you expand, the more difficult it becomes to manage all of these sales tax authorities and their ever-changing sales tax rates.

To further complicate matters, depending on the location, the tax rate could be calculated based on street address, ZIP code, city, or county. It is not uniform.

Just when you think you have it all figured out, the sales tax rates will likely change. Some go up, others go down, others stay the same. For example, in April, Service Objects updated its FastTax product with 125 tax rate changes across 32 counties, 8 county districts, 62 cities, and 23 city districts. Trying to keep up with all of these changes is nearly impossible without software.

The cost of manual sales tax management

Can you manage sales tax collection, reporting, and payment without software? Absolutely, but there are costs involved, including:

  • Penalties — If you do not collect, report, and pay the correct amount of sales tax, you will likely be responsible for paying the balance as well as penalties and interest. Collecting too much sales tax is problematic, too. If you cannot locate the customer to refund the difference, that extra money is usually property of the state and must be turned over. The potential cost would be the erosion of your customers’ trust in you. 
  • Time — Does your accounting department have time to contact each tax authority on a monthly or quarterly basis to verify current tax rates? When we update our FastTax software, this becomes an all-consuming task carried out by a team of individuals. Using the wrong tax data will also require time spent on making corrections, reaching out to customers, and issuing refunds. 
  • Customer service — As a seller, you are placed in the position of being a tax collector. Even though you are not collecting the tax for your benefit, your customers won’t necessarily see it that way. If you collect the wrong sales tax amount and later have to go back and ask for more, your customers will not be happy. If you collect too much and need to issue a refund, they may appreciate the mini windfall but have doubts about buying from you in the future. Your reputation could take a serious hit, especially if a customer has found that you’ve overcharged for sales tax and did nothing to rectify the situation. 

Keeping up with ever-changing sales tax rates is essential. Avoid these potential pitfalls by using Service Objects’ FastTax software. This real-time API, which is updated monthly, calculates the correct sales tax rate across all levels (state, county, city, county district, and city district) based on street-address-level tax information. Learn how to save time, money, and your reputation with FastTax by contacting us today.

Are You Using Address Verification from Your Shipping Company?

Typos, misspellings, and carelessness take their toll on deliverability, making address validation essential for any business that relies on USPS, FedEx, UPS, and other shipping companies. We’ve all been there: maybe it’s a simple misspelled word such as “Mane Street” instead of “Main Street,” or maybe it’s a few transposed numbers in a street address. Now that mobile devices have taken over the world, autocorrect isn’t helping matters, either.

Fortunately, shipping carriers understand that mistakes happen — and it’s in their best interest to make sure that packages arrive as expected, even if the address is incorrect.

The shipping carriers’ solution to incorrect addresses

Think about it: would you rather the shipping carrier figure out where the shipment is intended to go and get it there on time, or would you prefer that the carrier return the package to you? If the package is returned, you’re going to look bad to the customer. The delays could be excessive, too, reminding your customer that you let her down with each passing day. Thus, when a shipping carrier uses some sort of address verification service to conduct its own investigation and get the package to its intended destination, it’s offering a valuable service.

Of course, you can expect to pay for that service. When shipping carriers use an address validation service to correct an incorrect address, they typically impose some sort of “address correction” or “charge-back” fee. While the shipping carrier may have acted like a hero, saving the day comes with a hefty price tag. For example, correcting an address with FedEx Express US freight services will cost you $60 per correction. This makes FedEx Ground’s $11 charge-back for an incorrect shipping address look like a bargain (and trust us, it’s not).

So, what do you get for all that money? The customer gets her package, so that’s a plus. However, it’s what you won’t get that makes using a shipping carrier’s address validation service a bad investment: You won’t get the customer’s corrected address!

Your accounts payable clerk will likely pay the bill, but it may never cross his mind to dig deeper to find out the customer’s correct address and update your database. The next time this particular customer places an order, the order will ship to the address on file — and it’s still incorrect. Again, your shipping carrier’s address verification service will kick in and save the day. And once again, you’ll be charged an outrageous address correction fee. This cycle could continue with each transaction, eating into your profits needlessly.

A better address validation service

Few businesses survive on bad investments. Instead of repeatedly paying shipping carriers’ address correction fees, a better investment involves cleaning your own data using an affordable address verification service such as Service Objects’ DOTS Address Validation.


Remember those $11 to $60 fees imposed by shipping carriers for using their address validation service? What if you could proactively verify — and correct — addresses automatically and in real time for mere pennies?
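A back-of-envelope comparison makes the point; the 2-cent lookup price and 5 percent bad-address rate below are illustrative assumptions, while the $11 fee matches the FedEx Ground example above.

```python
def correction_costs(shipments, bad_rate=0.05, carrier_fee=11.00, lookup_cost=0.02):
    """Compare paying the carrier to fix bad addresses vs. validating
    every address up front. All parameters are illustrative assumptions."""
    pay_carrier = shipments * bad_rate * carrier_fee  # fees on bad addresses only
    validate_all = shipments * lookup_cost            # validate every address
    return pay_carrier, validate_all

carrier, validate = correction_costs(10_000)
print(carrier, validate)  # → 5500.0 200.0
```

Even under generous assumptions the up-front validation comes out far cheaper, and unlike the carrier’s fix, it leaves your database corrected.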

Kathleen Rogers, Director of Engagement at Service Objects, recently discussed the shipping companies’ exorbitant penalty fees with Virtual Logistics Inc., along with the benefits of using Service Objects’ address verification service.

“For just pennies,” she explained, “we can correct, standardize, and give you back delivery point validation scores to help you ensure three things: your customers’ packages arrive quickly and efficiently; the shipping address is valid and occupied, thereby reducing loss of merchandise to fraud; and fewer of your packages come back as undeliverable, saving valuable operational resources and reshipping charges.”

She also explained that unlike addresses corrected by shipping carriers, the corrected address is immediately updated in your database. This ensures that the same mistake is not repeated.

Address mistakes happen, but you shouldn’t have to pay exorbitant fees over and over to correct them (and temporarily at that). Real-time address validation from Service Objects is a far superior investment.


Cutting the Cord… But is the Cord Really Cut?

The great migration from traditional landlines continues. After all, nearly everyone has a cell phone, and many families have multiple cell phones. In addition to competition from wireless phones, the traditional landline also faces competition from VoIP services such as Vonage and Skype. Meanwhile, cable companies now bundle phone service into their television and Internet packages. With numerous alternatives available, the cord-cutting trend appears to be stronger than ever… but is the cord really cut?

Not everyone is cutting the cord on their landline. Some have migrated their existing landline phone numbers to alternate platforms. The Centers for Disease Control and Prevention regularly tracks US phone use. While this data and other research documents the migration away from traditional landlines, other telephone validation insights suggest that the trend may have reached its peak.

For example, in the last 10 years we have found:


Not everyone migrated: 35 percent of traditional landlines have been disconnected. There are a number of reasons why subscribers might disconnect phone service, including moving to a new location, death, or even a preference to simply be left alone. According to the CDC, about 3 percent of homes do not have any phone service whatsoever.


Businesses and homes move to cable providers: 17.6 percent now use cable carriers such as Cox, Comcast, and Charter for their phone service. Cable providers regularly offer bundled packages that include high-speed Internet, cable television, and phone service, often with unlimited nationwide long distance and other advanced calling features.


Ported to VoIP: About 6 percent of numbers have been ported to VoIP services like Vonage, Skype, and Google Talk. VoIP services route calls over the Internet and are generally offered at lower prices than traditional landline services. VoIP plans often include either nationwide or international long distance, making them particularly attractive to those with geographically dispersed family members or businesses with large long-distance bills.


Cutting the cord but keeping the number: Nearly 5 percent of landline numbers now ring directly to cellphones. So, while these users cut the cord, so to speak, they have kept their original phone numbers. Local number portability rules went into effect in late 2003, helping to make this an option. The customer of record for a landline or mobile phone number has the ability to reassign the number to a different carrier. Thus, if you want to keep your landline number but cut the cord, you can port that number over to a cellphone.


No change here: About 37 percent of landline owners have kept their landlines with the original provider. That’s a hefty percentage, comparable to the number of disconnections. When looked at this way, you’ve got just over a third who have kept their landline, roughly a third who have disconnected (maybe they died or moved away?), and about a third who have officially chosen an alternative such as a cable carrier’s phone service, VoIP, or wireless. It makes you wonder, is the cord really cut?

From mobile phones, VoIP, and bundled cable phone services to plain old landline telephone service, today’s telephone subscribers have more ways than ever to connect, with or without the cord. Due to telephone number portability, identifying phone service types is critical for establishing trust and for complying with federal laws.

Service Objects has several reverse phone lookup products that can determine a phone number’s service type (landline, VoIP or wireless) with near perfect accuracy. Contact us for details.

Or start a free trial today!


Why Does a VoIP Line Show My Old Address?

The good old days: you’d move, get a new phone number, and your utility records would show your correct address. Though it wasn’t necessarily ideal, address accuracy pretty much took care of itself. Today, we have VoIP and telephone number portability. Modern and convenient as they may be, address changes no longer take care of themselves. For example, it’s not unusual to perform a reverse phone lookup and find an old address. Why is that?

The problem has a lot to do with how Voice over Internet Protocol (VoIP) phone numbers work compared to traditional landlines. Rather than being physically installed at an address, VoIP phone numbers are virtual and routed over the Internet. As such, you can make and receive VoIP phone calls at your home, your office, or even a hotel room when traveling. When you move, simply plug in your VoIP device at the new home and you’re good to go.

Say that’s all you do. Now perform a reverse phone lookup. Your VoIP line still shows your old address. Here’s the short answer: You forgot to tell your service provider that you have a new physical address.

Since your VoIP line is not anchored to a single address or physical location, the phone service provider doesn’t automatically have a means of knowing the correct address. Most VoIP providers rely on their users to update their location information. 

This information isn’t just important for the sake of accurate reverse phone lookup data. It’s vital in terms of VoIP 911 service. After all, if your reverse phone lookup information shows your old address, emergency responders will be routed to your old address. 

According to the FCC’s Consumer Guide: 

“…VoIP customers may need to provide location or other information to their VoIP providers, and update this information if they change locations, for their VoIP 911 service to function properly…”

The FCC requires each service provider to obtain the customer’s physical location before activating new service so that emergency responders have an accurate initial address for 911 calls. However, changes of addresses remain largely the customer’s responsibility. The FCC requires service providers to “provide one or more easy ways for their customers to update the physical location they have registered with the provider, if it changes.”

Changing your physical address with most VoIP providers is usually a simple matter. If you’ve noticed that a reverse phone lookup has the wrong address, take matters into your own hands by updating your current address with your VoIP service provider.

Because the onus is on customers to update their addresses when they move, VoIP lines are often considered unreliable when it comes to identifying where a caller is located. This leads to complications in Emergency Response Services and can sometimes be exploited by prisoners on parole, but those are topics for another time.

To learn more about Service Objects’ reverse phone lookup service for landline, VoIP, wireless, and toll-free telephone numbers, click here!

Or start a free trial today! 


Name Validation and the Colorado Rockies

Have you ever misspelled a customer’s name? If so, then you know that the backtracking you need to do in order to remedy the problem can take some time and effort. Misspelling customers’ names, especially in service-oriented businesses, can easily derail your customer service and marketing efforts. Think about how much time and money you’ve already spent in order to learn about your target customers, their preferences, and how best to reach out to them. Then you blemish what should have been a pleasant moment of contact by misspelling their names. 

Consider these two Colorado Rockies blunders of 2014. First, they honored the spectacular batting average of Troy Tulowitzki by giving away 15,000 baseball jerseys that spelled the celebrated shortstop’s last name as Tulowizki. The Rockies performed well-timed damage control by posting an apology on their Facebook page acknowledging the mistake. Then, barely two weeks later, the same Major League Baseball team introduced a batch of new souvenir cups that had Nolan Arenado’s last name printed as Arendo. Arenado, the Rockies’ third baseman, had won a Gold Glove and was thus dubbed “Golden Arendo” on the souvenir cups.

Examining these examples from a branding perspective, you might conclude that botching people’s names can make prospective customers lose confidence in what you have to offer. A lack of attention to something as crucial and downright personal as a name says a lot about your ability to deliver on whatever marketing claims you are making about your product or service. If the slip-up happens only once, people can still be forgiving. If you commit the same oversight over and over, however, it suggests a broader lapse in attention to detail and customer service. Imagine how alienating it can be for an eager shopper—one who is clearly interested in your offerings, having gone to the trouble of filling out your lead generation form—to see his name misspelled in follow-up communications. In some cases, mucking up your time-honored personalization routine by mangling a customer’s name can even make or break a sale.

This is the era of highly targeted and location-based marketing programs, social media, mobile wallets, smart wearables, advanced analytics, and drones that deliver merchandise at the doorsteps of shoppers. Consumers have developed higher expectations, and they formulate their purchase decisions based on those expectations. Marketers, on the other hand, can now wield a plethora of digital tools to streamline, as well as make more profitable, their day-to-day business operation. 

Getting your customers’ names spelled correctly shouldn’t be one of your problems. This tedious part of customer service can be handled automatically by a real-time name validation service that verifies, corrects, and flags names. Our name verification API not only validates your customers’ names but can also identify the gender associated with a first name, helping you address them properly.
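As a purely illustrative sketch of the kind of cleanup such a service performs, the snippet below normalizes a name and looks up a gender hint from a tiny sample table. The table, the `validate_name` helper, and the returned field names are hypothetical stand-ins, not the real name verification API.

```python
# A minimal sketch of a name validation step before personalizing
# outreach. The tiny gender table and field names are illustrative
# stand-ins for a real name validation API.

KNOWN_FIRST_NAMES = {
    "troy": "male",
    "nolan": "male",
    "maria": "female",
}

def validate_name(first: str, last: str) -> dict:
    """Normalize capitalization and flag names that need human review."""
    first_clean = first.strip().title()
    last_clean = last.strip().title()
    gender = KNOWN_FIRST_NAMES.get(first_clean.lower(), "unknown")
    return {
        "first": first_clean,
        "last": last_clean,
        "gender": gender,
        "needs_review": gender == "unknown",
    }

print(validate_name("  troy ", "tulowitzki"))
# {'first': 'Troy', 'last': 'Tulowitzki', 'gender': 'male', 'needs_review': False}
```

A real service would draw on a far larger name dictionary and also catch misspellings such as “Tulowizki,” but the flow is the same: clean the input, enrich it, and flag anything uncertain.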

If you equip your business with the right tools to support your branding, marketing and customer service efforts, you’ll reduce the time, energy and money spent on fixing errors.


Why Your CRM is Ineffective

Ever wonder why your CRM isn’t as effective as you had hoped it would be? Here’s the reason: Bad data! According to a recent article published on Health IT Analytics, the following takes place about every 30 minutes:

  • 120 business addresses change
  • 75 phone numbers change
  • 30 new businesses are formed
  • 20 CEOs leave their jobs

With data changing this quickly, it’s no wonder your CRM database is loaded with bad data.

How does bad data affect your CRM?

For starters, bad data is unnecessarily wasteful. Let’s say that you have a sales pipeline filled with hot B2B C-level leads along with a direct mail nurturing campaign. Some of those CEOs will leave their jobs while others will move to new addresses. If you don’t update your CRM database accordingly, your mailings will be a waste of paper, postage, and manpower. As time goes by, more CEOs will leave or move, rendering your hot lead database even more ineffective.

It’s not just marketing that suffers. Customer service can suffer too. For example, it’s not uncommon for customers to provide online retailers with incorrect or misspelled addresses. While the original error may have been the customer’s, who do you think they will blame when their package never arrives?

Unnecessary waste, increased costs, and reduced customer satisfaction erode operational efficiencies and adversely impact the bottom line — and it all begins with bad data.

How to improve your CRM’s effectiveness

If bad data is responsible for your CRM’s ineffectiveness, the solution is simple: improve your data quality. But how? Few companies have the luxury of hiring a full-time staff member to confirm and correct contact data. Not only would that be costly, it would also be a difficult-to-fill, tedious, and never-ending job. Fortunately, it’s a job perfectly suited to automation.

By integrating a data validation API from Service Objects into your point-of-sale or CRM software, it becomes possible to validate, standardize, correct, and append contact information in real-time. If a contact moves to a new address but doesn’t notify your company, your database will not degrade. This bad data will be corrected by the data validation API, which compares the existing contact information against a massive USPS database containing the latest address changes.
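To illustrate where such an API call fits in a CRM workflow, here is a minimal sketch: validate on save, so bad data never enters the database. The `validate_address` stub below is a hypothetical stand-in for the real API call, and its field names are assumptions for illustration.

```python
# A sketch of where real-time validation fits in a CRM save path.
# validate_address() is a hypothetical stub standing in for a call
# to a data validation API; a real call would return a standardized,
# USPS-checked address rather than this simple normalization.

def validate_address(raw: dict) -> dict:
    """Stand-in for an API call that standardizes an address record."""
    corrected = dict(raw)
    corrected["city"] = raw["city"].title()
    corrected["state"] = raw["state"].upper()
    corrected["validated"] = True
    return corrected

def save_contact(crm_db: dict, contact_id: str, address: dict) -> None:
    """Validate before writing, so bad data never enters the CRM."""
    crm_db[contact_id] = validate_address(address)

db = {}
save_contact(db, "c-101", {"city": "santa barbara", "state": "ca"})
print(db["c-101"]["city"], db["c-101"]["state"])  # Santa Barbara CA
```

The key design choice is that validation happens at the point of entry, not as a periodic batch cleanup, which is how a real-time API keeps the database from degrading in the first place.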

Service Objects offers several data validation APIs covering everything from address, phone number, and email address validation to geocoding, lead validation, order validation, BIN validation, IP address validation, and demographics.

Data changes at a rapid pace. Is your CRM able to keep up? Improve its effectiveness by combating bad data at the source.


Source: Health IT Analytics, “Battling Bad Data to Grow Market Share with Big Data”