I recently migrated a local SQL Server database to SQL Azure, Microsoft's "database in the cloud". It was a smooth process and I wanted to share my experience.
Bear in mind that I am not a "Microsoft guy" at all (or an "Azure guy" or "SQL guy" for that matter). I'm just a developer helping a friend. I have never sworn allegiance to a technology stack and I'm not trying to sell you anything.
A few years ago I helped a friend start a business by building her an invoicing application. Efficient record keeping and cash flow were her keys to success. She needed something quickly, completely custom, and on a tight budget of $0.00.
My natural urge was to build a web application, but hosting costs and some technical details meant a desktop app was the way to go. I ultimately built a Windows Forms application which used an on-site SQL Server Express database. It wasn't the prettiest app in the world, but it worked extremely well for her.
My data backup strategy was clunky but adequate. It consisted of scripted SQL exports/backups to different machines in the office and a small utility I wrote which backed up the same (encrypted) data off-site. I'm leaving out some details, but that was the gist of it.
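For illustration, the scheduled backup step might have looked something like this; the database name and path are hypothetical placeholders, not the real ones:

```sql
-- Nightly full backup, kicked off by a scheduled task running sqlcmd.
-- [Invoicing] and the disk path are placeholders for illustration.
BACKUP DATABASE [Invoicing]
TO DISK = N'D:\Backups\Invoicing.bak'
WITH INIT,      -- overwrite the previous backup in this file
     CHECKSUM;  -- verify page checksums while writing
```

Separate scripts then copied the file to other machines in the office, and the off-site utility handled the encrypted copy.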
I was never totally comfortable with the data backup situation once it was in production. The machine in the office running SQL Server was a single point of failure. Automatic replication/clustering was not in the budget, so the best I could do was keep a cold standby. I also found myself writing code to automatically notify me if any of the scheduled off-site backup scripts failed due to a network or power outage.
So many things can go wrong when your data is on-site. It's all probabilities and risk management of course, but when you're the only person who can fix things and you live several hours away, you begin to worry about needing to suddenly sacrifice a weekend to perform mysterious and heroic acts.
A few days ago I got a call from my friend saying she suddenly couldn't use the application. The root cause: her ISP sent a technician to fix their slow Internet and, in the process, he reconfigured the LAN. The database was probably still running but inaccessible. I couldn't walk her through it over the phone, and since I no longer had remote access, I cleared my schedule and prepared the family for an unexpected road trip.
The night before leaving it dawned on me that I could now use SQL Azure to solve this issue. I had an up-to-date copy of the data off-site I could use to migrate to Azure. The database could live in the cloud, get backed up in a myriad of ways, and all the end user would need is Internet access. For a small-scale database like this one (< 50 MB), a "Basic" instance is only ~$5/month so the price was right.
I had to answer a few questions before I committed...
Once you migrate to SQL Azure, aren't you "locking yourself in" to the platform? Sort of. It's not like a special API you're coding to, it's just remote SQL Server. I could get the data back out and restore it on a local SQL server if I really had to. I could instead do more of a "lift-and-shift" to the cloud (my own VM running my own install of SQL server) but that would be too expensive and time-consuming. Ultimately I wasn't worried about lock-in given my constraints.
On the flip side, what if the SQL Azure platform goes away? Microsoft has a track record of killing off technologies suddenly. This is very true! But if you haven't noticed recently, Microsoft is all-in on Azure, more than any other product in their history. They're now giving away Visual Studio Pro and making server technologies open source and cross-platform for the first time. They're not doing this because they're cool or nice. They are sacrificing traditional software licensing revenues to expose as many developers as possible to their seductive Azure on-ramps.
What about latency? I tested the latency between the office and the Azure data center I was interested in. It was obviously much higher than on the office LAN, but fortunately it didn't noticeably affect the user experience for this particular application. I noticed that, as with other Azure services such as web sites, the first operation tends to be slow because the service needs to warm up. After that everything seems nominal.
What about security? The Azure platform itself has nice security features, but nothing is "secure"; things are only secure in specific ways. For my needs, all client-server communication should be encrypted (TLS), which it is. I also needed to lock down access to the database to specific IPs, which the platform forces you to do. I have security measures built into the database itself, so it's not a risk if the data gets stolen.
If locking down IP addresses isn't good enough, I can always opt for a VPN connection from the office to Azure.
In a perfect world, the app would talk to an HTTP API of my own design with its own layers of security, which would in turn talk to a data store, but that would require more time and money.
Performing the migration
Here's how I did it. I'm sure there are other ways. I noticed this technique doesn't work in the "preview portal". I found a helpful blog post on the subject for guidance.
Sign up for Azure and create a blob storage account
This is straightforward. You'll need to create a new blob storage account (and a container inside it) only for the process of importing your local database.
Make sure your database is ready for SQL Azure
This seems to be a moving target, but I couldn't migrate my database as-is because one of my tables was missing a primary key (it didn't need one; the table had a single row for tracking the schema version :)). If your database has things like user-defined functions or other funky stuff, you may need to investigate further. I always keep my databases simple and my logic in application code, so I was fine.
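In my case the fix was a one-line primary key; SQL Azure at the time required a clustered index on every table, which a clustered primary key satisfies. The table and column names below are made up for illustration:

```sql
-- Hypothetical schema-version table that was missing a primary key.
ALTER TABLE dbo.SchemaVersion
ADD CONSTRAINT PK_SchemaVersion PRIMARY KEY CLUSTERED (Version);
```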
Upload the .bacpac to blob storage
In SSMS, right-click the database, then Tasks, then Export Data-tier Application. Select Save to Windows Azure and choose the blob storage container you want to upload it to.
Create the new SQL Azure instance
In the Azure portal, select New, SQL Database, From Import. Specify the blob storage location of the .bacpac file and supply a username and password for the administrator account of the server that will be created for you.
Allow IP addresses
Once the server and DB instance are created, select it and click the Configure tab. There you can add allowed IP addresses. Keep in mind that the firewall rules belong to the server, not the database instance, which makes sense but isn't immediately obvious in the portal. You can also get there by clicking the database instance and then "manage allowed IP addresses".
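If you'd rather script the firewall rules than click through the portal, SQL Azure also exposes them through stored procedures. The rule name and IP addresses below are placeholders:

```sql
-- Server-level rule: run on the master database as the admin account.
EXEC sp_set_firewall_rule
    @name = N'OfficeIP',
    @start_ip_address = '203.0.113.10',
    @end_ip_address = '203.0.113.10';

-- Or scope a rule to a single database by running this on that database.
EXEC sp_set_database_firewall_rule
    @name = N'OfficeIP',
    @start_ip_address = '203.0.113.10',
    @end_ip_address = '203.0.113.10';
```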
Configure additional backups

This is optional. The platform automatically backs up your data with a retention policy based on your service level (seven days' worth for Basic, for example). You can also configure it to store exports in blob storage for longer periods as additional backups.
I also plan on writing a small utility to automatically transfer those exports from Azure to some other secure location.
Configure SQL users
When you create a new SQL Azure instance you also create an admin account. This admin account is like the "sa" user in SQL Server: it has God-mode enabled. I didn't want my users to have that, so I created regular SQL Server users by running scripts on the remote database instance via SSMS, Visual Studio, or another tool. It's also a good chance to test your connectivity to the database.
Here's an example of creating a new user and giving them read/write access. Your mileage may vary.
-- run this on the master database as the admin account to create a login
CREATE LOGIN [SOME-LOGIN] WITH PASSWORD = 'SOME-PASSWORD';
GO
CREATE USER [SOME-USER] FROM LOGIN [SOME-LOGIN] WITH DEFAULT_SCHEMA = [dbo];
GO

-- run this on your target database to grant specific access rights
CREATE USER [SOME-USER] FOR LOGIN [SOME-LOGIN] WITH DEFAULT_SCHEMA = [dbo];
GO
EXEC sp_addrolemember db_datareader, [SOME-USER];
GO
EXEC sp_addrolemember db_datawriter, [SOME-USER];
GO
Set up your application
In my case, I just needed to change the connection string in the app.config. You can find the connection string for your Azure DB linked from the instance's "dashboard" page. Just remember to substitute the user account you created earlier.
Of course NEVER commit user names and passwords to source control. You'll need to devise a process to change the config at some point during your app's deployment process.
Also, since I was running a desktop application, I upgraded to .NET 4.5.1 to take advantage of the new resilient ADO.NET connections, which are very handy in a cloud scenario.
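To illustrate both points, here's roughly what the app.config entry ends up looking like. The server, database, and user names are placeholders, and ConnectRetryCount/ConnectRetryInterval are the .NET 4.5.1 connection-resiliency settings; keep the real credentials out of source control:

```xml
<connectionStrings>
  <!-- Placeholder values only; never commit real credentials. -->
  <add name="InvoicingDb"
       providerName="System.Data.SqlClient"
       connectionString="Server=tcp:SOME-SERVER.database.windows.net,1433;Database=SOME-DB;User ID=SOME-USER@SOME-SERVER;Password=SOME-PASSWORD;Encrypt=True;Connection Timeout=30;ConnectRetryCount=3;ConnectRetryInterval=10" />
</connectionStrings>
```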
Aside from converting my database to be Azure-compatible the only issue I ran into was a bizarre authentication error during one of the several times I rehearsed the database migration process. Yes, I rehearse things like this several times.
When I went to create the SQL Azure database from the import, I got an error in the portal that said something like: Login failed for user 'username'. This session has been assigned a tracing ID of 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'. Provide this tracing ID to customer support when you need assistance. (Microsoft SQL Server, Error: 18456).
Apparently, when I created the SQL Azure database, I had selected an "orphaned" server I created previously. That server must have had a different admin username/password than I expected, and the import failed due to an auth error.
I learned from this that whenever you create a new SQL Azure instance, you should probably create a new server each time instead of re-using an existing one. When you delete a database instance, the server it runs on remains; in my case, it was left over from a previous deployment rehearsal. I'm sure there are benefits to the one-to-many relationship between servers and database instances, but not in my case.
Migrating to SQL Azure was relatively painless. It's an example of an Azure on-ramp that Microsoft is so focused on these days. I had some reservations but in the end my friend's business is on more solid ground and that's what matters.