Service Objects’ Blog

Thoughts on Data Quality and Contact Validation

Posts Tagged ‘Customer Support’

Why Providing Feedback is Important to Improving Software

Developing user-friendly software and an amazing user experience requires listening to users. As software developers, we rely on user feedback to continuously improve our data validation APIs. As a user, you may not feel compelled to provide feedback, but if you value a great experience, your role is an essential one.

Examples of User Feedback Collection Mechanisms

You’ve likely encountered user feedback collection features in various forms. For example, Microsoft Office prompts you to click a happy or sad face to share your impressions. This simple menu is available from the File tab, allowing you to tell Microsoft what you like, what you dislike, and what you’d suggest.

You may also find that user feedback is naturally baked into installed software, accessible via the Help menu.

If you’re a .NET developer, you’re probably familiar with the Visual Studio Experience Improvement Program. The little helper icon in the top right of the program has become the ubiquitous symbol for feedback, client support, and general help desk tasks. With just a click of a button, users can instantly share their experiences with the software.

What about when an application crashes? You’ll often be prompted to send a crash/bug report to the developer. These reports may even contain hardware/software configurations, usage patterns, and diagnostic information helpful to developers — and all you need to do is click a button.

These are but a few of the many ways that modern applications send information back to the software company. Obtaining user feedback as well as any crash/bug report information is crucial to the development of a piece of software or service. This information helps software developers isolate where and why problems occurred, leading to product updates.

User Feedback Challenges: Privacy Concerns

But what about “Big Brother” or other potential snoops? With the various means of providing feedback and the different collection schemes (opt-in or automatic), privacy concerns are valid. With these data collection tools baked into the software, it is hard to know how much information is actually being sent back to the company. It could range from harmless crash reports and diagnostic information to controversial GPS breadcrumb data.

Understandably, many people don’t want other entities collecting data on them or analyzing their usage patterns. While not all software is intentionally spying on you, it would be nice to know exactly what is collected. More often than not, it’s unclear what’s collected and how it’s used. This lack of transparency inevitably leads to unease, which is why many users opt not to participate in “Experience Improvement Programs” and other data collection schemes.


Another challenge for developers is that not all companies have software installed on clients’ devices, making data collection difficult even when users are willing to opt in. For example, the normal avenues for collecting data, such as hardware/software configurations, are not seamlessly integrated with web-based technologies like web services or certain SaaS applications. Many companies struggle with this and must use other means of getting user feedback.


Despite privacy concerns and a lack of openness, the bottom line is that user feedback is valuable. When utilized properly, the information can be used to fix existing problems in the software as well as lead to new features. The reason why subsequent versions of software are so much better than 1.0 is directly related to user feedback.

How Service Objects Gathers User Feedback

Service Objects does not collect data on clients, so the privacy concerns discussed above are moot. Potentially sensitive data processed through our services is not monitored or collected. This is a highly sought-after data validation “feature” for our clients, but at the same time, it presents a challenge for us to gather detailed user feedback.

We offer several ways for our customers to provide feedback: you can connect with us by phone (805-963-1700), by email, or via support tickets.

Any user feedback we receive is taken very seriously and can lead to bug fixes, updates, and even new services/operations. A great example of this is the FindAddressLines operation in DOTS Address Validation 3. The operation was initially born to help a particular client and has been utilized to great effect to clean up messy address data.

If you have any feedback you would like to share to help us improve our data validation services, we encourage you to reach out to us at any time.

Best Practices for List Processing

List processing is one of the many options Service Objects offers for validating your data. This option is ideal for validating large sets of existing data when you’d rather not set up an API call or would simply prefer us to process the data quickly and securely. There is good reason to have us process your list: we have high standards for security and will treat a file with the utmost care.

As part of our list processing service, we offer PGP encryption for files, SFTP file transfers, and encryption to keep your data private and secure. We also have internal applications that allow us to process large lists of data quickly and easily. We have processed lists ranging from tens of thousands of records to upwards of 15 million records. Simply put, we consider ourselves experts at processing lists, and we’ll help ensure that your data gets the best possible return available from our services.

That said, a few steps can help guarantee that your data is processed efficiently. For the best list processing experience – and the best data available – we recommend following these best practices.

CSV Preparation

Our system processes CSV files, and we will convert any other file format to CSV prior to list processing. If you want to deliver a CSV file to us directly, keep the following CSV preparation best practices in mind:

Processing international data – If you have a list of international data that needs to be processed, make sure the file has the right encoding. For example, if the original set of data is in an Excel spreadsheet, converting it to CSV format can destroy foreign characters in your file. When processing a list of US addresses this may not be an issue, but if you are processing an international set of addresses through our DOTS Address Validation International service, it could significantly impact your results. One workaround is to save the file as Unicode text through Excel and then set the encoding to UTF-8 with BOM through a text editor. Another option is to send us the Excel file with the foreign characters preserved, and we will convert it to CSV with the proper encoding.

Preventing commas from creating unwanted columns – Encapsulating a field containing commas inside quotation marks will prevent any stray commas from offsetting the columns in your CSV file. This ensures that the right data is processed when our applications parse through the CSV file.
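Both points can be handled programmatically. As a sketch (the file name and helper are just examples, not part of our tooling), Ruby’s standard CSV library quotes any field containing a comma, and writing a UTF-8 byte order mark first helps tools like Excel detect the encoding:

```ruby
require "csv"

# Write rows to a CSV file in UTF-8 with a BOM. The CSV library
# automatically wraps fields containing commas in quotation marks,
# so embedded commas cannot offset the columns.
def write_address_csv(path, rows)
  File.open(path, "w:UTF-8") do |file|
    file.write("\uFEFF") # UTF-8 byte order mark for encoding detection
    rows.each { |row| file.write(row.to_csv) }
  end
end

rows = [
  ["Name", "Address1", "City", "State", "Zip"],
  ["Acme, Inc.", "123 N Main St W", "Santa Barbara", "CA", "93101"],
]
write_address_csv("addresses.csv", rows)
```

Reading the file back with any CSV parser yields five columns per row, with “Acme, Inc.” intact as a single field.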

Use Multiple Files for Large Lists

When processing a list with multiple millions of records, breaking the file into multiple files of about 1 million records each helps our system more easily process the list while also allowing for a faster review of the results.

Including a unique ID for each of the records in your list helps when updating your business application with the validated data.
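Both suggestions can be sketched in a few lines of Ruby (the chunk size and file naming here are illustrative): split the file into parts, repeat the header in each, and prepend a RecordID column so validated rows can be matched back to your system.

```ruby
require "csv"

# Split a large CSV into files of at most chunk_size data rows,
# repeating the header in each part and prepending a RecordID column.
def split_csv(path, chunk_size: 1_000_000)
  header = nil
  out = nil
  part = 0
  record_id = 0
  written = []
  CSV.foreach(path) do |row|
    if header.nil?
      header = ["RecordID"] + row
      next
    end
    if record_id % chunk_size == 0 # start a new part file
      out&.close
      part += 1
      name = path.sub(/\.csv\z/i, "_part#{part}.csv")
      out = CSV.open(name, "w")
      out << header
      written << name
    end
    record_id += 1
    out << [record_id] + row
  end
  out&.close
  written
end
```

For example, `split_csv("big.csv", chunk_size: 1_000_000)` on a 2.5-million-record file would produce three part files.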

Configure the Inputs for the Service of Choice

Matching your input data to our service’s expected fields can speed up list processing time. For example, some lists parse address line 1 into separate fields (e.g., 123 N Main St W would have separate columns for 123, N, Main, St, and W). DOTS Address Validation 3 currently has inputs for BusinessName, Address1, Address2, City, State and Zip. While we can certainly manipulate the data as needed, preformatting the data for our validation service can improve both list processing time and the turnaround time for updating your system with freshly validated data.
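For instance, if address line 1 arrives as parsed tokens, a small preprocessing step (a hypothetical helper, not part of our API) can rejoin them into the single Address1 field the service expects:

```ruby
# Rejoin parsed address tokens (e.g. "123", "N", "Main", "St", "W")
# into a single Address1 value, dropping blanks and stray whitespace.
def join_address_tokens(tokens)
  tokens.compact.map { |t| t.to_s.strip }.reject(&:empty?).join(" ")
end

join_address_tokens(["123", "N", "Main", "St", "W"]) # => "123 N Main St W"
```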

These best practices will help ensure a fast and smooth list processing experience. If you have a file you need cleansed, validated or enhanced, feel free to upload it here.

How Constantly Changing Sales and Use Tax Rates Can Impact Customer Satisfaction

For businesses engaged in commerce, it is a real challenge to stay on top of ever-changing sales and use tax rates. One minor tax rate change can have a direct impact on your customers, costing your business significant time, money and resources.

In addition, the complexity of tax law continues to increase every year, with constant changes in tax rates and tax jurisdictions that often go beyond simple measures such as ZIP codes or municipal boundaries. And the risks to businesses for non-compliance are potentially severe.

Charging customers the wrong rate can have a significant impact on a business. Unhappy customers create customer service issues, hurt employee morale, and carry a substantial financial cost from processing refunds or collecting outstanding money owed.

All this means that maintaining tax compliance – particularly in today’s business environment, with everything from multiple distribution channels to e-commerce – requires planning and processes to become a smooth-running, cost effective part of your business.

Learn more about this important topic. Register for our upcoming webinar on May 23, 2017 and hear Geoff Grow, CEO and Founder of Service Objects, as he discusses:

  • What sales and use taxes are and how they are calculated
  • The concept of Nexus and when an out-of-state business is liable for collecting sales or use taxes
  • What the Streamlined Sales and Use Tax Agreement is and why it doesn’t work
  • Recent legal rulings that can affect your business
  • The important role geo-location plays in calculating rates
  • The impact on your business when your customers are charged the wrong rate
  • The benefits of leveraging a third party data provider who is an expert in providing the most accurate and up-to-date rates in the US and Canada

Taking Service Objects for a Test Drive

You’ve found Service Objects and read about our services, so now you want to take a test drive. Well, there are several ways to do just that, and as I go through the options you will also get a pretty good picture of how you can integrate our services into your applications or processes. Testing can be a tedious task, delving into the unknown of a third-party process, but we make it easy for you to jump right in by giving you several ways to test: our Quick Lookup Tool, DataTumbler, batch processing and our real-time API services.

Quick Lookup Tool
The Quick Lookup tool is used to test one-off examples on our website. It is as simple as navigating to the Quick Lookup page, selecting the particular service you are interested in, filling out the fields and clicking submit. You’ll receive a real-time response from the service containing the results of your test.

Since our services often offer multiple operations, the Quick Lookup pages will indicate which operation the form uses for the particular service. If there are other operations you are interested in testing, we have you covered there as well with links to the other operations.

DataTumbler
The DataTumbler is a PC-based desktop application you can download from our site to run tests on multiple records. If you have ever used Excel, this application will be easy for you to drive. It works like a spreadsheet where you can paste multiple records for processing in real time.

Here are the basic steps: Choose your service, choose your operation, drop your data in and click validate. Choosing the service and desired operation is important because often it will change the input columns needed for the records to process properly. In the screenshot above you can see that there are 5 input columns designated by the yellowish cell background. Here we have inputs for Address, Address2, City, State and Zip. If your particular purposes do not require Address2, for instance, then that column can be removed by simply clicking on the “Customize Input Columns” button and removing it from the input.  You can do the same thing for the output columns as well but in that case you would need to access the “Customize Output Columns” popup.  The output columns are designated by the cells with the greenish background.

You can also add additional columns that are not predefined by the application by right clicking a column and selecting “Insert Column”.  This is handy for situations where you want additional data to stay together with your test like a unique identifier or other data unrelated to testing the service.

Once one of the validation buttons at the bottom is pressed, the DataTumbler will make requests to our services in real-time and populate the output columns as the data is being processed.

To get the application please call Customer Support at 1.800.694.6269 or access Live Chat and we will get you started.

Batch Processing
Batch processing is another way you can test drive our services.  When you have a file ready to go for testing you can simply hand it over to us.  We will process it and send back the results to you along with a summary.

This is one of the more preferred ways to test drive our services for several reasons:

  • We can see the data you send us first hand and give you feedback about what the file looks like with respect to items like formatting.
  • By seeing all the fields of data we can quickly recommend the most beneficial service.
  • It gives us an opportunity to see the results with you and go over any interesting data points from a data validation/cleansing expert point of view.

All that is needed for this is a test file containing the records you want to try out. The file can come in several formats, including txt, csv and xls to name a few. You can send us the file directly via email, or for a more secure means we can provide a secure FTP for the file transfer. We can also work with encrypted data when an extra security layer is needed. An additional way to get us your test file is through our website itself: you can drag and drop a file and be on your way. Once we have the file, we will process it against the service you are testing and return the results along with a summary breakdown of the processing.

If your test run is a success and you’re ready to move forward with a larger file, we can also run a one-time paid batch. Clients often use this as an initial data scrub before switching to our real-time API or automated batch system, which can run batches virtually on demand.

Integrating the API
The last way you can test our services is by implementing our API in your code. Most clients use the API when they integrate with us so testing this way gives you the closest representation of how your production process will work with our services.

When it comes to doing a direct software integration test we have you covered. We make it easy to integrate and get testing quickly by means of sample code, code snippets, step-by-step integration walk-through blogs, developer guides and NuGet for Visual Studio.

We have sample code for C#, Java, PHP, Rails, Python, NodeJS, VB.Net, Classic ASP, Microsoft SQL Server, APEX and Cold Fusion. That doesn’t mean we stop there: additional sample code can be requested, and our team will review the request to find the best solution for you. When applicable, our sample code is available in both REST and SOAP, and all of our examples implement best practices and demonstrate failover.

If you are a C# developer and use Visual Studio, you will have us at your fingertips. Using the NuGet Manager in Visual Studio, you can have our API injected into your code and ready to go.

All of our walk-through tutorial blogs and documentation are presented in a straightforward, easy-to-understand format, and as always the Service Objects team is here to assist with any questions or help you may need.

When it comes to test driving our services, we give you options to make it easy. A trial key will give you access to all the options I mentioned, and going through them also gave me a chance to show you how you can integrate with our services. The beauty of the way we set this system up is that you can become fully integrated and tested before you even purchase a live production key. In all of these cases, usually only two things need to be updated when switching from trial testing to live production use. In the Quick Lookup Tool you will need to switch the “Select Key Type” to “Customer Production Key” and then use your live production key instead of your trial key. In the DataTumbler you will similarly swap those fields out. When it comes to a code integration, you just need to update your endpoints from the trial URLs to the production URLs and swap the trial key for a live production key.

Whether you want to test one or two records or a whole file, simply put, our team has you covered.

We’ve Raised the Bar to a 99.999% Uptime Guarantee

For many years, we’ve provided a Service Level Agreement with 99.995% availability guaranteed. This equates to less than 26 minutes of downtime annually, or less than 2 minutes and 11 seconds monthly. We’ve consistently achieved and exceeded this promise to our customers year after year, but wanted to take this commitment up a notch…

We’re excited to announce that we increased our Service Level Agreement to a 99.999% uptime guarantee, equating to less than 5 minutes of service downtime annually, or less than 26 seconds monthly!

What is a Service Level Agreement (SLA)?

SLAs make use of the knowledge of enterprise capacity demands, peak periods, and standard usage baselines to compose the enforceable and measurable outsourcing agreement between vendor and client. As such, an effective SLA will reflect goals for greater performance and capacity; productivity; flexibility and availability; and standardization.

At the same time, an SLA should set the stage for meeting or surpassing business and technology service levels, while identifying any gaps currently being experienced in the achievement of service levels.

SLAs capture the business objectives and define how success will be measured, and are ideally structured to evolve with the customer’s foreseeable needs. The right approach to SLAs results in agreements distinguished by clear, simple language, a tight focus on business objectives, and consideration of the dynamic nature of business to ensure evolving needs will be met.

How We Do It

Multiple data centers provide redundancy, using redundant components, systems, subsystems, and facilities to counter inevitable failures or disruptions. Our servers operate in a virtualized environment, each utilizing multiple power supplies and redundant storage arrays. Our firewalls and load-balancing appliances are configured in pairs, leveraging proven high-availability protocols to allow for instantaneous fail-over.

Compliance is an important benefit of professional data centers. In today’s business climate, data often falls under government or industry protection and retention regulations such as SSAE 16 standards, the Health Insurance Portability and Accountability Act, and the Payment Card Industry Data Security Standard. Compliance is challenging without dedicated staff and resources. With the third party data center model, you can take advantage of the data center’s existing compliance and audit capabilities without having to invest in technology, dedicated staff, or training.

Data Security & Management
We’ve invested in “bank grade” security. Several of our data centers are guarded by five layers of security, including retinal scanners. All systems are constantly monitored and actively managed by our data center providers — both from a data security and a performance perspective. In addition, we operate our own in-house alerting and monitoring suites.

Geographic Load Balancing
Another key factor for ensuring uptime has to do with geographic load balancing and fail-over design. Geographic load balancing involves directing web traffic to different servers or data centers based on users’ geographic locations. This can optimize performance, allow for the delivery of custom content to users in a specific region, or provide additional fail-over capabilities.

Ensuring a high level of uptime comes down to: redundancy and resiliency, compliance, geographic load balancing, great data security, and 24/7 monitoring. All of these factors are equally important and contribute to our 99.999% uptime results — guaranteed!

We Won’t Let Storm Stella Affect Your Data Quality

A macro-scale cyclone referred to as a Nor’easter is forecast to develop along the East Coast starting tonight and expected to continue throughout Tuesday. In addition to typical storm preparations, have you ensured your data is also ready for Storm Stella?

Although we cannot assist you directly with storm preparations (water bottles, canned foods, batteries, candles, backup generators, blankets, etc.), we will always ensure the integrity and reliability of our web services. Since 2001, we’ve been committed to providing a high level of uptime during all types of conditions, including storms, even Nor’easters. All of which comes down to redundancy, resiliency, compliance, geographic load balancing, great data security, and 24/7 monitoring, contributing to our 99.999% availability of service offerings with one of the industry’s only financially backed service level agreements. We take great pride in our system performance and are the only public web-service provider confident enough to openly publish our performance reports.

To ensure you are fully prepared for this storm in particular, it is important to note that our primary and backup data centers are in separate geographic locations. If an emergency occurs, you can re-point your application from our production data center to our backup data center.

The failover data center is designed to increase the availability of our web services in the event of a network or routing issue. Our primary data center hostname is: and our backup data center hostname is

You can also abstract the actual hostname into a configuration file, in order to simplify the process of changing hostnames in an emergency. Even in the case where your application handles failover logic properly, an easy-to-change hostname would allow your application to bypass a downed data center completely, and process transactions more quickly.
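As a sketch of that idea (the hostnames below are placeholders, not our actual endpoints, which come from your welcome materials), the hostname can live in a small YAML config so re-pointing the application is a one-line configuration change rather than a redeploy:

```ruby
require "yaml"

# A minimal inline config; in a real application this would live in a
# separate config file, with the real Service Objects hostnames.
CONFIG = YAML.safe_load(<<~YML)
  primary_host: primary.example.com
  backup_host: backup.example.com
YML

# Return the hostnames in the order the application should try them.
# Flipping use_backup re-points the app at the backup data center.
def host_order(config, use_backup: false)
  hosts = [config["primary_host"], config["backup_host"]]
  use_backup ? hosts.reverse : hosts
end
```

During an outage, setting `use_backup: true` (or editing the config file) sends traffic to the backup data center first, with no code changes.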

For most clients, simply updating their application to use our backup data center hostname should immediately restore connectivity. Your existing DOTS license key is already permitted to use our backup data center and no further actions should be needed.

Many of our clients with mission critical business applications take this action of configuring for failover in their application. We are available 24/7 to help with best practices and recommendations if you need any assistance before, during or after the storm!

Tech Support in the Age of Instant Gratification

We live in an age where an overabundance of information and resources is just a few clicks away. Even physical goods can be delivered to your front door the very same day you order. People want and expect similar convenience and response times when they need technical support.

We tech support experts here at Service Objects completely understand that. One of our core values is to offer outstanding customer support to our clients. We have a good day at the office when we can quickly and effectively answer questions about our services, resolve issues, and get the data-hungry masses up and running with their validation service of choice. To help ensure that our customers can get back to using their validation service for their business, we offer several avenues for seeking support.

24/7 Phone Support

Do you have a pressing issue after hours? We understand that this can be exceedingly stressful and frustrating, and we want to help you get it resolved as quickly as possible. If you ever run into an after-hours support issue, call our office phone number (1.805.963.1700) and follow the prompts. Once directed, leave a message with a detailed description of the issue you are encountering and the best way to contact you, and a member of our team will typically contact you within 20 minutes.

LiveChat Through our Website

Have a quick question that you want answered right away? What URL should you be using? What does a particular response from the service mean? Is it an expected response? Are there any current issues with the service? Is there a different endpoint for a different operation? Questions like these are examples of ones we would be happy to answer in our LiveChat on our website. Simply navigate to our website during business hours and an agent will start a LiveChat with you as soon as one is available. Once they do, simply state the question or issue you are experiencing, along with pertinent account information, and we will happily assist in any way we can.

Support Tickets

The primary method to address and keep track of all our support inquiries is through our support ticketing system.  Whether you call in, use LiveChat or send us an email, most technical support issues will get sent to our ticketing system and we’ll use it to quickly and effectively address any issues or questions you may have.  To create a ticket, simply email or click here and you can fill out the form to get a ticket created. Feel free to use any of the above channels to contact us and we’ll be glad to offer any support that we can!

A Commitment to Fanatical Customer Service Leads to “FindAddressLines”

Service Objects runs an agile Engineering team that likes to be ready to run with a great new idea at any given moment. We view it as one of the cornerstones of our fanatical customer service plan.

As soon as we learn about a challenge that a prospect or client is experiencing we’re excited to find a customized solution. A recent example of this is the release of a new operation for our Address Validation-US service called FindAddressLines.

DOTS Address Validation-US is one of our core services, which takes as input two address lines, city, state, and postal code, and does an excellent job of cleaning and standardizing even grossly misspelled addresses. In a perfect world, data is collected and properly placed where it needs to be to facilitate validation. But we know the world isn’t perfect, and one particular client had lists of addresses with extra address lines (sometimes up to 5) or even key pieces of data entered into the wrong columns altogether. FindAddressLines was born initially to help this client, and many others moving forward, clean up these types of problems.

Let’s take a look at how it works:

In the example above, the first three rows work as expected using Address1 and Address2 as the inputs. However, on row 5, the address returns a missing secondary number because the suite number fell into Address3.  When you get to row 6 there is nothing to go on unless you are looking at Address4 and Address5 specifically.  Our FindAddressLines operation allows you to submit up to 10 lines (columns) including a city, state and zip and we do the work to make sure the right data makes it into the right locations.

Here’s an even messier example:

As we can see in row 4, the data was pushed out past the zip column. This can easily happen when importing data from a database to a spreadsheet if care isn’t taken for potential delimiters like commas.  With FindAddressLines, we are able to assign the extra columns as inputs and let the service figure it out.  The example in row 5 has a completely jumbled address that might have occurred from a corrupted database or just extremely messy data collection.  Again, we can use FindAddressLines to solve this one as well.

We enjoy talking to both current clients and prospects alike to determine what their needs are and what new features and services we can put together to help them improve their unique processes.  Our team is 100% committed to our customers’ success and can often rapidly put together a new solution to solve almost any problem you’re experiencing.

Ruby on Rails Integration Tutorial

Ruby on Rails (or more colloquially, “Rails”) is a server-side web application framework that provides its users a powerful arsenal to create web pages, web services and database structures. Rails utilizes a model-view-controller (or MVC for short) framework that provides an easy way to separate the user interface, database model and controller.

One of Service Objects’ highest priorities is getting our clients up and running with our DOTS validation web services as quickly as possible. One way we do this is by providing sample code and, occasionally, step-by-step instructions. So if you are partial to using Rails, this tutorial will provide everything needed to get up and running.

What You Will Need

  • A DOTS license key (click here for a free trial key). We’re using DOTS Email Validation 3 for this tutorial.
  • Your favorite text editor (Sublime, Notepad++, etc.)
  • Ruby on Rails installed on your test machine
  • Familiarity with the command line

Creating a Default Project

First, navigate via the command line to the directory in which you would like the project created. Once the command line is in the desired directory, run the command “rails new EV3Tutorial”. You may call the project whatever you prefer, but this is the name we’ll use for this tutorial.

After the project creation is finished, you should have a brand new Ruby on Rails project. You can launch the server by simply being in the project directory and entering “rails server” into the command line. The default port number the project runs on is 3000. We’ll be using port 3001 for this project; the port number can be specified by entering the following command to launch the server: “rails server -p 3001”.

If you run the server and navigate to the applicable localhost URL where it has been launched, you’ll see the following page:

Ruby on Rails Tutorial Image One

This page is the default page for any new Rails project. From here, you can find more documentation about working with Rails.

Generating a Model and Controller

We’ll need a controller to make the actual call to the DOTS web service and a model to pass information to our controller. Simply type “rails generate controller request new show” and the necessary controller will be created, along with two new views: one titled “new.html.erb” and the other “show.html.erb”. The command line should look similar to the following.

Ruby on Rails Tutorial Image 1

After this command is run, the controller folder and the views folder will have some new files created. It should look like the following:

The “new.html.erb” file will be our default page, so to make the application load it automatically, open the routes.rb file in the config folder of the project and make the routes file look like the following.
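The routes file in the original post appeared as a screenshot; a minimal sketch along these lines (route and action names assumed from the generated controller) makes the form the default page:

```ruby
# config/routes.rb -- a sketch; adjust to match your controller and actions
Rails.application.routes.draw do
  root "request#new"        # load the input form by default
  get  "request/new"
  post "request/show"       # the form submits here to display results
end
```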

For this example, we’re using the “ValidateEmailAddress” operation, which has 4 separate inputs: EmailAddress, AllowCorrections, Timeout, and LicenseKey. These, in turn, will be part of our model that passes values from the view to the controller. So before we fill in the HTML page, we’ll create the model and include these values in its definition.

Enter the command “rails generate model request” in the command line. The values in the model can be specified via the command line, but it is usually easier to enter them in the model file. Locate the model schema in the following folder:

To add values to the model, make the model schema on your file look like the following.
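Since the tutorial’s screenshot of the schema isn’t reproduced here, a sketch of the generated migration with the four service inputs added as string columns might look like this (the exact migration superclass varies by Rails version):

```ruby
# db/migrate/xxxxxxxxxxxxxx_create_requests.rb -- a sketch; the four
# columns mirror the ValidateEmailAddress inputs described above.
class CreateRequests < ActiveRecord::Migration
  def change
    create_table :requests do |t|
      t.string :email_address
      t.string :allow_corrections
      t.string :timeout
      t.string :license_key

      t.timestamps
    end
  end
end
```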

In order to use the model it will need to be migrated. To do this enter “rake db:migrate” into the command line to commit the model.

Creating the Views           

Now that our model has been created and committed to the database, we can pass values from the view to the controller. We’re going to create a simple web form with the inputs to send to the DOTS Email Validation service. To do this, add the following code to the “new.html.erb” file.
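The tutorial’s code listing isn’t reproduced here, so the following is a hedged sketch of such a form. The field names mirror the model columns assumed earlier, and the `request_show_path` route helper is a hypothetical name that depends on your routes:

```erb
<%# app/views/request/new.html.erb -- an illustrative sketch %>
<h1>DOTS Email Validation 3</h1>
<%= form_for :request, url: request_show_path do |f| %>
  <p><%= f.label :email_address %> <%= f.text_field :email_address %></p>
  <p><%= f.label :allow_corrections %> <%= f.text_field :allow_corrections %></p>
  <p><%= f.label :timeout %> <%= f.text_field :timeout %></p>
  <p><%= f.label :license_key %> <%= f.text_field :license_key %></p>
  <p><%= f.submit "Validate" %></p>
<% end %>
```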

Now that our input page is set, we’ll make sure our “show.html.erb” file is ready to display some output values from the service. We will include a brief if statement in the show view that determines whether an error object is present in the response from the service. If one is present, we’ll display it; if not, we’ll display the valid results. Make your show.html.erb file look like the following.
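Again, the original listing isn’t reproduced here; a sketch of the if statement described above follows. The instance variables are set in the controller later in the tutorial, and the output field names are illustrative assumptions rather than the service’s exact schema:

```erb
<%# app/views/request/show.html.erb -- an illustrative sketch %>
<h1>Validation Results</h1>
<% if @everror %>
  <p>Error Type: <%= @everror["Type"] %></p>
  <p>Error Description: <%= @everror["Desc"] %></p>
<% else %>
  <p>Score: <%= @ev3info["Score"] %></p>
  <p>Is Deliverable: <%= @ev3info["IsDeliverable"] %></p>
  <p>Email Address: <%= @ev3info["EmailAddress"] %></p>
<% end %>
```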

These output fields haven’t been instantiated or set to any values yet, but that will happen in our controller.

Integrating Logic into the Controller

Now that our views are set up, we’ll need to put some code in the controller to make the actual call to the Service Objects web service. To make this call, we’re going to use the gem “httparty”, which allows us to make a RESTful web service call. Be sure to add this gem to your project’s Gemfile and run “bundle install” at the command line.
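The Gemfile addition is a one-line change:

```ruby
# Gemfile -- add the httparty gem, then run "bundle install"
gem "httparty"
```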

We won’t go over all the elements of the controller file in this tutorial, but rather just touch on some important parts of the logic to be integrated. The screenshot below highlights some important aspects of calling the web service.

The code above has a primary and a backup URL. Currently, both point to the Service Objects trial environment. In the event that a production key is purchased, the primary and backup URLs should be set to the corresponding production endpoints. These two URLs, along with the accompanying logic that checks for a fatal error (TypeCode 3 for DOTS EV3), will ensure that your application continues to run in the event that the primary Service Objects data center is offline or experiencing issues. If you have any questions about the failover logic, please don’t hesitate to contact Service Objects and we will be glad to assist further.
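The failover pattern described above can be sketched independently of the HTTP call itself. In this minimal version, the calls are abstracted as callables (in the real controller they would be httparty requests against the primary and backup URLs), and the response-hash keys are illustrative assumptions:

```ruby
# A minimal sketch of the primary/backup failover pattern. TypeCode
# "3" marks a fatal error for DOTS EV3, as noted above.
FATAL_ERROR_TYPE_CODE = "3"

# Returns true when the parsed response contains a fatal error.
def fatal_error?(response)
  wrapper = response["ValidateEmailResponse"]
  error = wrapper && wrapper["Error"]
  !error.nil? && error["TypeCode"] == FATAL_ERROR_TYPE_CODE
end

# Tries the primary endpoint first; falls back to the backup on a
# fatal error or any exception (timeout, connection refused, etc.).
def call_with_failover(primary, backup)
  response = primary.call
  response = backup.call if fatal_error?(response)
  response
rescue StandardError
  backup.call
end
```

In the real controller, `primary` and `backup` would wrap httparty calls against the trial (or production) URLs, so the same check covers both fatal service errors and network failures.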

The code above calls an internal function, “processresults,” to display the values to the user. Here is the accompanying screenshot illustrating some of the logic of this function.

Depending on what response comes back from the service, this code will display the appropriate values to the user. For your own application, you will likely want to create a class that automatically serializes the response from the service into something a bit easier to use, but we are showing the above logic as an example of how to parse the values from the service directly. Note: the @ev3response, @ev3info and @everror variables are present to help parse the response hash that httparty automatically creates for the response.
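The kind of parsing the processresults function performs can be sketched as follows. Since httparty parses the service’s response into nested hashes, the logic amounts to walking the hash and branching on whether an error object is present; the key names here are illustrative assumptions, not the service’s exact schema:

```ruby
# A sketch of parsing the hash httparty builds from the service
# response, along the lines of the processresults function above.
def process_results(parsed)
  ev3response = parsed["ValidateEmailResponse"] || {}
  if ev3response["Error"]
    # An error object is present: surface its description.
    { error: ev3response["Error"]["Desc"] }
  else
    # No error: pull the validation results out of the info object.
    info = ev3response["ValidateEmailInfo"] || {}
    { score: info["Score"], is_deliverable: info["IsDeliverable"] }
  end
end
```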

Final Steps and Testing

Our project is ready to test out. For this example, we’ll use a sample email address to get a valid response from the service.

Entering these values in to the text boxes, along with a valid license key, should result in the following output.

That completes our Ruby on Rails tutorial. If you have any questions about this or any of our other tutorials or sample code, please don’t hesitate to contact us! We would love to help you get up and running with a Service Objects web service.

3 Things to Consider When Signing a Cloud Computing Contract


Cloud computing entails a paradigm shift from in-house processing and storage of data to a model where data travels over the Internet to and from one or more externally located and managed data centers.

It is typically recommended that a Cloud Computing Contract:

  • Codifies the specific parameters and minimum levels required for each element of the service you are signing up for, as well as remedies for failure to meet those requirements.
  • Affirms your institution’s ownership of its data stored on the service provider’s system, and specifies your rights to get it back.
  • Details the system infrastructure and security standards to be maintained by the service provider, along with your rights to audit their compliance.
  • Specifies your rights and cost to continue and discontinue using the service.

In addition to the basic elements of the Contract listed above, here are three important points to consider before signing your Cloud Computing Contract.

1. Infrastructure & Security

The virtual nature of cloud computing makes it easy to forget that the service is dependent upon a physical data center. Not all cloud computing vendors are created equal. You should verify the specific infrastructure and security obligations and practices (business continuity, encryption, firewalls, physical security, etc.) that a vendor claims to have in place and codify them in the contract.

2. Disaster Recovery & Business Continuity

To protect your institution, the contract should state the provider’s minimum disaster recovery and business continuity mechanisms, processes, and responsibilities to provide the ongoing level of uninterrupted service required.

3. Data Processing & Storage

  • Ownership of data: Since an institution’s data will reside on a cloud computing company’s infrastructure, it is important that the contract clearly affirm the institution’s ownership of that data.
  • Disposition of data: To avoid vendor lock-in, it is important for an institution to know in advance how it will switch to a different solution once the relationship with the existing cloud computing service provider ends.
  • Data breaches: The contract should cover the cloud service provider’s obligations in the event that the institution’s data is accessed inappropriately. The repercussions of such a data breach vary according to the type of data, so know what type of data you’ll be storing in the cloud before negotiating this clause. Of equal importance to the breach notification process, the service provider should be contractually obligated to provide indemnification should the institution’s data be accessed inappropriately.
  • Location of data: A variety of legal issues can arise if an institution’s data resides in a cloud computing provider’s data center in another country. Different countries, and in some cases even different states, have different laws pertaining to data. One of the key questions with cloud computing is: which law applies to my institution’s data, the law where I’m located or the law where my data is located?
  • Legal/Government requests for access to data: The contract should specify the cloud provider’s obligations to an institution should any of the institution’s data become the subject of a subpoena or other legal or governmental request for access.

The Cloud Computing Contract is for the benefit of both the consumer and the provider. While it can be highly technical, the Contract will ultimately establish the partnership between the parties, and following these steps should help mitigate any potential problems.

Service Objects is the industry leader in real-time contact validation services.

Service Objects has verified over 2.5 billion contact records for clients from various industries including retail, technology, government, communications, leisure, utilities, and finance. Since 2001, thousands of businesses and developers have used our APIs to validate transactions to reduce fraud, increase conversions, and enhance incoming leads, Web orders, and customer lists.