Connecting Business Applications with Mobile Customers via Twilio’s SMS Service

In the constantly changing, ever-growing mobile technology market, businesses face the challenge of effectively communicating with target customers who are always on the move.

In this post, I’ll outline how you can integrate Twilio’s SMS service into a .NET web application in a few simple steps. We recently integrated Twilio into a client’s web application as a way to reach employees after hours, when staff have left for the day but a situation arises that needs further attention.

Before we get into the coding, let’s take a quick look at other methods that are used to send SMS messages from applications.

Email-to-SMS Gateway functionality has existed for a while and offers some of the same benefits, but with a few caveats.

  1. What if you don’t know the customer’s carrier? You need it to build the gateway address (for example, if your customer is a Verizon customer at 555-777-8888, you can text them by sending an email to 5557778888@vtext.com)
  2. You cannot control who the message appears to come from; for some carriers it’s the email address of the sender, for others it’s the email address of the server
  3. You cannot control which number delivers the message, so the customer may not be able to reply to it
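Caveat #1 is easy to see in code: building an Email-to-SMS address requires the carrier’s gateway domain up front (vtext.com is Verizon’s, per the example above; every other carrier has its own domain, which you would have to look up and keep current):

```vbnet
' Illustrative sketch only -- the gateway domain must be known per carrier
Dim phoneNumber As String = "5557778888"
Dim carrierGateway As String = "vtext.com" ' Verizon's gateway; other carriers differ
Dim smsEmailAddress As String = phoneNumber & "@" & carrierGateway
' smsEmailAddress is now "5557778888@vtext.com" -- the address you email to send the text
```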

Twilio’s SMS service overcomes these hurdles in simple fashion, allowing web applications to reach mobile customers with a few lines of code.

Let’s get started. First, you’ll need to add the Twilio library to your project (note that this example is for a .NET 4.0 application). The installation process can be found here: https://www.twilio.com/docs/csharp/install

Once you’ve installed the service, you’re ready to add your SMS code.

In this case, our function is set up to receive the phone number we are sending a message to and the body of that message.

Public Sub SendSMS(ByVal receivePhoneNumber As String, ByVal messageBody As String)

Dim accountSID As String = ""
Dim authToken As String = ""
Dim objTwilio As Twilio.TwilioRestClient = Nothing
Dim objMessage As Twilio.Message = Nothing
Dim sendPhoneNumber As String = ""

When you sign up for your Twilio account, you’ll receive the following credentials within their application.
authToken = "111111"
accountSID = "222222"
sendPhoneNumber = "5553211212"

Instantiate your Twilio object
objTwilio = New TwilioRestClient(accountSID, authToken)

And send!
objMessage = objTwilio.SendMessage(sendPhoneNumber, receivePhoneNumber, messageBody, "")

End Sub

So now we’ve sent our message. It’s sent to the end user via a telephone number, just as if you were sending a text message from your phone. What if the end user wants to respond back? They can, with no additional effort on their end. The message will be sent back to the Twilio number, and Twilio will redirect the response to your web application.

Within the Twilio console, you can set up a Messaging URL, which defines where Twilio POSTs the response (for example, http://www.yoursite.com/receiveTwilioResponse.aspx). Let’s set up that page.

When we set up receiveTwilioResponse.aspx, we’ll need to capture the page request and parse out the response message.

Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load

Dim fromPhoneNumber As String = ""
Dim messageBody As String = ""

First, we’ll get the Phone Number of the end user who is responding back to us
If Not Request.Form("From") Is Nothing Then

fromPhoneNumber = Request.Form("From")

End If

Next, we’ll get the Message Body from the page request
If Not Request.Form("Body") Is Nothing Then

messageBody = Request.Form("Body")

End If

And that’s it! From here, we can handle this data how we see fit.
Call LogDataToDatabase(fromPhoneNumber, messageBody)

End Sub

This is simplified for example purposes. You will want to set up error handling and logging around sending the message, and you will need to clearly define the business logic of when to send messages and to whom.
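A minimal sketch of that error handling, wrapping the SendSMS routine from above (LogError is a hypothetical helper; substitute whatever logging your application already uses):

```vbnet
Public Sub SendSMSSafe(ByVal receivePhoneNumber As String, ByVal messageBody As String)

    Try
        ' Reuse the SendSMS routine defined earlier
        SendSMS(receivePhoneNumber, messageBody)
    Catch ex As Exception
        ' LogError is a hypothetical helper -- plug in your own logging here
        LogError("SendSMS failed for " & receivePhoneNumber, ex)
    End Try

End Sub
```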

Learn more about the service here : https://www.twilio.com/sms


Mirth Connect – Not Just HL7

Mirth Connect is an open source application used for healthcare data integrations, specifically designed for HL7 interfaces. It is widely used to transfer health data between two parties in an order-to-results cycle. But what happens when you need to transfer data to a party who isn’t using an HL7 interface? Limiting your inbound/outbound capabilities to the HL7 format alone may prevent you from adding new customers, yet building a second interface or solution may not be worth the cost of the new business. The different Connector Types available within Mirth Connect resolve the need to invest in a second solution.

In this post, I am going to show you an example of how to interface HL7 data to a web service. We implemented this back in 2012 for a client who had an existing HL7 interface: they received HL7 orders, processed the claims, and sent back HL7 results. They had a potential customer who was also looking for an electronic data transfer, but who did not use HL7-formatted data; that customer’s application received data via a secured SOAP request. Because this was a single customer, building a separate interface specifically for the integration was unlikely to be the best business decision, so we explored using existing functionality (Mirth Connect) to send data to this web service.

Mirth has the ability to transform data between multiple formats. In this case, we were able to use the JavaScript writer to take the necessary data points (via Source Transformers) and complete the proper SOAP request. While Mirth Connect offers a Web Service Connector Type, we found the best flexibility in the JavaScript writer, where you can customize all the functionality and you are not limited by the default behaviors of the template.

Check out our example below, and find more information on Mirth Connect here: http://www.mirthcorp.com/products/mirth-connect

// open a connection to the Web Service
var url = new java.net.URL("https://testinterface.com/Document.asmx");
var conn = url.openConnection();

// connection timeout, in milliseconds
var connectTimeout = 30000;

Here we pull the message XML from the Transformers on our Source tab. This isn’t the only way to do it, but it worked best for us and our current solution.

var messageString = messageObject.getEncodedData();

You could also do something like this, building out the request XML yourself and inserting transformer variables as XML node text:

var messageString = "<SOAP:Body><Patient>" + messagePatient + "</Patient><MRN>" + messageMRN + "</MRN></SOAP:Body>";

conn.setConnectTimeout(connectTimeout);

Here we needed to use request headers, which we found easier to set up in the JavaScript writer than in the Web Service Connector.

conn.setRequestProperty("ID", "123");
conn.setRequestProperty("USERNAME", "ALGONQUIN");
conn.setRequestProperty("PASSWORD", "PASSWORD");
conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
conn.setDoOutput(true);

// Write our XML data to the Web Service URL
var outStream = conn.getOutputStream();
var objectStream = new java.io.OutputStreamWriter(outStream);
objectStream.write(messageString);
objectStream.flush();
objectStream.close();

// Read the Web Service response so it can be inspected or logged
var inStream = new java.io.BufferedReader(new java.io.InputStreamReader(conn.getInputStream()));
var response = "";
var responseLine;
while ((responseLine = inStream.readLine()) != null) {
    response += responseLine;
}
inStream.close();

Microsoft SQL Server 2012 – AlwaysOn

As technology advances, so does the need (and want) for instantaneous information. The question is always, “How can we get information as quickly as possible?” The same standard applies to disaster recovery: how quickly can we recover in the event of failure? The answer, in many cases, is seconds. One of the newest options for high availability is Microsoft SQL Server’s AlwaysOn technology. The name is self-explanatory: AlwaysOn is designed to keep data and information available even in the event of disaster, and recovery time is minimized by its ability to sync multiple databases across multiple database servers.

I had the recent experience of using SQL’s AlwaysOn technology to set up a redundant application environment. The purpose of this blog post is to summarize my experience with the technology. By no means is this a start-to-finish coverage of the technology.

Overall, the perception of AlwaysOn among users has been very positive (simply Google “SQL AlwaysOn” and you’ll see other posts touting its benefits). I, too, was impressed with its simple setup and configuration and how well it handles failover scenarios (verified during our pre-launch testing). As with all technology, there were some downsides, but I felt the positives of AlwaysOn far outweigh its drawbacks.

Availability Groups

  • Replication in AlwaysOn is no longer thought of as individual databases to individual servers. Complex systems can be replicated as a group, not as individual databases, minimizing replication setup and configuration time. The one downside is that you are limited to four secondary replicas, so one group can only be replicated to four other locations.
  • Synchronous and Asynchronous Setup
    • Synchronous Replication – Transactions are not completed until committed to all databases in the Synchronous Availability Group
      • My first thought on hearing about this process was that it might impact performance, since SQL Server must commit each transaction to multiple databases on multiple servers.
      • We tested this process to ensure synchronous replication to multiple databases would not cause performance degradation.
      • We used SQL’s query performance statistics to ensure that the performance of data read/writes were as efficient with the new environment as they were in the old environment.
      • As a side note, there were other factors that affected query performance times (for the better – new environment, new OS), so our test wasn’t a true comparison of SQL performance with and without AlwaysOn replication. For the purpose of our test, we needed to ensure that the AlwaysOn replication was not a detriment to application performance.
    • Asynchronous – Transactions are queued and committed to this Availability Group at a later time. Transaction completion is not dependent on committing to this database group. This is similar to Database Log Shipping in older versions of SQL Server.
      • In previous versions of SQL Server, Log Shipping would require the database to be inaccessible during restoration of the transaction logs. With AlwaysOn, the database is always online and accessible.
      • This allowed us to use the Asynchronous process for other purposes, in this case Read-Only reporting. We took long-running reporting queries and calculations and removed that processing need from the transactional database and server altogether, providing an indirect performance benefit from AlwaysOn.
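The setup described above can be sketched in T-SQL. This is a hedged illustration, not our production script: the server names, database names, and endpoint URLs are placeholders, and it assumes both instances are already joined to a Windows failover cluster with the HADR feature enabled.

```sql
-- One synchronous replica for automatic failover,
-- one asynchronous replica opened up for read-only reporting
CREATE AVAILABILITY GROUP [AppAG]
FOR DATABASE [OrdersDb], [ClaimsDb]
REPLICA ON
    N'SQL01' WITH (
        ENDPOINT_URL = N'TCP://SQL01.corp.local:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC),
    N'SQL02' WITH (
        ENDPOINT_URL = N'TCP://SQL02.corp.local:5022',
        AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
        FAILOVER_MODE = MANUAL,
        SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));
```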

Automatic Syncing

  • One of the most frustrating aspects of SQL Log Shipping is when the process gets out of sync and transactions stop being applied to the replicated database. This generally results in manually copying and restoring a database to the replication server and restarting the transfer process.
  • This headache is alleviated by the asynchronous replication of AlwaysOn, which polices its own data replication, automatically re-syncing the database if a network connectivity loss or other event disrupts the replication process.

Licensing

  • Unfortunately, AlwaysOn is only available with the Enterprise Edition of SQL Server 2012, so you’ll need justification to spend the extra money on the Enterprise license. This may not be the best DR solution for smaller applications.
  • However, licensing is set up on an active vs. passive server model, meaning you don’t need an additional license for a replication server that is only used in a failover scenario. This limits the number of licenses you will need.

Again, this was just an overview of my experience with Microsoft SQL Server’s AlwaysOn functionality. If you are setting up a new environment using Microsoft SQL Server, I highly suggest considering AlwaysOn to handle your High Availability/Disaster Recovery requirements. See Microsoft’s overview of AlwaysOn here: http://msdn.microsoft.com/en-us/library/ff877884.aspx

Database Organization

Great organization is the key to success in many aspects of our lives. Usability is a key feature in the applications we build, yet these qualities don’t always carry over into database organization, especially in growing systems. Just because your database isn’t the face of the application doesn’t mean it shouldn’t have the same sense of flow and organization that the application does. Maintain and organize your database as you would other areas of your life (e.g., your home). Organization should be purposeful, have a basic sense of flow, and be ready to show off when necessary.

Table and Column Naming Conventions

I often see table and column naming conventions that appear cryptic, coded or simply incorrect. Table and column names should reflect the data that they store. Find a standard in your application and stick to it; it will make maintaining and troubleshooting the application in the future a whole lot easier. You wouldn’t organize your office files and label them “x” or “col1,” so why do it with your database?
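As an illustration of the difference (the table and column names here are invented for the example):

```sql
-- Hard to maintain: the names reveal nothing about the data
CREATE TABLE tbl1 (col1 INT, col2 VARCHAR(50), x DATETIME);

-- Self-documenting: the names reflect the data they store
CREATE TABLE CustomerOrder (
    CustomerOrderId INT PRIMARY KEY,
    CustomerName    VARCHAR(50),
    OrderDate       DATETIME
);
```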

Related and Duplicate Data

Keeping redundant data can cause multiple issues with your application, including rapid growth in your database and confusion about which data is accurate.

In the same sense, storing related data in unrelated tables can cause the same issues, and it falls in line with the point about table and column naming conventions: data should be grouped together as it makes sense for the application. Doing this will most likely improve the performance of your application; databases with normalized, properly indexed data translate to higher-performing applications.

Routine Maintenance

Vehicle engines and home utilities require routine maintenance–your application database is no different. Applications grow and evolve over time and you need to ensure that your databases grow and evolve with them. Maintaining database indexes is the key to keeping your application performing at a high level. Scheduling weekly, monthly, or quarterly reviews of your database indexes will help maintain these efforts.
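A sketch of what that scheduled review might run, in T-SQL (the table and index names are placeholders, and the 30% rebuild threshold is a common rule of thumb rather than a hard rule):

```sql
-- Check fragmentation on a table's indexes
SELECT i.name AS index_name,
       s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.Orders'), NULL, NULL, 'LIMITED') s
JOIN sys.indexes i ON i.object_id = s.object_id AND i.index_id = s.index_id;

-- Then rebuild heavily fragmented indexes (> ~30%) and reorganize the rest
ALTER INDEX [IX_Orders_CustomerId] ON dbo.Orders REBUILD;
ALTER INDEX [IX_Orders_OrderDate] ON dbo.Orders REORGANIZE;
```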

Review the growth of your database tables; it’s easy to assume transactional table data needs to be stored indefinitely. Know how long your application needs to review and access historical data. Data can easily be archived out to a reporting database to minimize the size of your transactional tables.
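That archiving step can be as simple as the sketch below (the database, table, and column names and the two-year retention window are placeholders; in practice you would wrap this in a transaction and batch it to suit your system):

```sql
-- Move transactions older than two years into the reporting database,
-- then remove them from the transactional table
INSERT INTO ReportingDb.dbo.OrderHistory
SELECT *
FROM dbo.Orders
WHERE OrderDate < DATEADD(YEAR, -2, GETDATE());

DELETE FROM dbo.Orders
WHERE OrderDate < DATEADD(YEAR, -2, GETDATE());
```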

The simple everyday organization and guidelines that make you successful should be reflected in your application database. Remember, bad habits and sloppiness in setting up your application database are the quickest way to cause problems for your application.