In this post we're going to dig into the code a little and look at how we're doing SQL synchronization across client and server. As I mentioned in my last post about our training software for the mine, we've got to be able to run these trainings underground, then sync that data back to the main server. We also need to take all data that gets saved to the main server and ensure it gets pushed back out to every single client DB.
The first thing to do was figure out how I wanted to set up my databases in the UWP app. After a bunch of digging on Google and plenty of quality time with the Microsoft documentation, I found this article, which did a really good job of walking through database setup.
I am going to change the concept and models for this example to protect anything that might be private to our client, so for our example we will go with the extremely original example of a notes app. A user on a tablet writes a note and saves it to a local SQLite database; then, when the client syncs, it saves all local notes to the master database and downloads any new records that were saved by other devices syncing.
So we will start by defining a very simple Note model for both the client and the server:
// Client-side model: LocalID is the local primary key; NoteID is
// assigned by the server and stays 0 until the record has been synced.
public class Note
{
    [Key]
    public long LocalID { get; set; }
    public long NoteID { get; set; }
    public string UserName { get; set; }
    public string Text { get; set; }
    public string Title { get; set; }
    public DateTime CreatedOn { get; set; }
    public int SyncVersion { get; set; }
    public bool Deleted { get; set; }
}
// Server-side model: NoteID is the master id across all databases.
public class Note
{
    [Key]
    public long NoteID { get; set; }
    public string UserName { get; set; }
    public string Text { get; set; }
    public string Title { get; set; }
    public DateTime CreatedOn { get; set; }
    public int SyncVersion { get; set; }
    public bool Deleted { get; set; }
}
Some things to note about the models. The client-side model does not use the same primary key as the server-side model. These records are only ever assigned a NoteID on the server side, since it serves as the master id across all databases. That makes it really easy to figure out which records are new when it's time to sync the client: simply pull all records where NoteID == 0. There are also a couple of fields on the model that exist specifically to provide information we need while syncing: SyncVersion and Deleted. Since all records live in multiple databases at the same time, in this system we can never delete a record directly out of the master database. If we did, the record would still be live and in play in all the client-side databases. So instead of doing a true delete, we set the Deleted flag to true for that record and filter out all deleted records when querying the databases locally.
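To make those two conventions concrete, here's a minimal sketch of the pattern in Python with an in-memory SQLite database. This is just an illustration, not the app's actual code or schema; the column set is trimmed down and the note titles are made up:

```python
import sqlite3

# Two conventions: NoteID == 0 marks a record never pushed to the server,
# and Deleted is a soft-delete flag that local queries always filter out.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE Notes (
    LocalID INTEGER PRIMARY KEY AUTOINCREMENT,
    NoteID  INTEGER DEFAULT 0,   -- 0 until the server assigns a master id
    Title   TEXT,
    Deleted INTEGER DEFAULT 0)""")
conn.execute("INSERT INTO Notes (NoteID, Title) VALUES (7, 'already synced')")
conn.execute("INSERT INTO Notes (Title) VALUES ('local only')")
# A "delete" just flips the flag; the row stays in place for syncing
conn.execute("UPDATE Notes SET Deleted = 1 WHERE Title = 'already synced'")

# Records that still need to be pushed to the server:
unsynced = conn.execute(
    "SELECT Title FROM Notes WHERE NoteID = 0 AND Deleted = 0").fetchall()
# What the app actually shows the user (soft-deleted rows hidden):
visible = conn.execute(
    "SELECT Title FROM Notes WHERE Deleted = 0").fetchall()
print(unsynced, visible)
```

The soft-deleted row never shows up in either query, but it's still sitting in the table, ready to carry the deletion to other devices when it syncs.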
The SyncVersion field is the big deal. It's the cornerstone the whole sync process is built around. The idea is to create, essentially, a version number for the entire Notes table. Starting at table version 1, any time a record is inserted or updated in our Notes table, we stamp the affected record with the table version by saving it in the SyncVersion field. For example, the first two Notes I save will have SyncVersions of 1 and 2. If I then edit the first record, when I save it, its SyncVersion is automatically updated to 3. My next inserted Note would get a SyncVersion of 4. So with every insert/update, we set the affected record's SyncVersion to one higher than the highest SyncVersion in the table. After a lot of research, I settled on a method for maintaining this SyncVersion column that leverages database triggers.
CREATE TRIGGER [dbo].[Notes_VersionUpdate]
ON [dbo].[Notes]
AFTER INSERT, UPDATE
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    DECLARE @mx int;
    -- One higher than the current highest version (1 on an empty table)
    SET @mx = (SELECT ISNULL(MAX(SyncVersion), 0) + 1 FROM Notes);

    UPDATE Notes
    SET Notes.SyncVersion = @mx, @mx = @mx + 1
    FROM Inserted i
    WHERE Notes.NoteID = i.NoteID
END
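If you want to play with the versioning behavior without standing up SQL Server, the same idea can be sketched with SQLite triggers from Python. This is only an illustration of the pattern, not the production trigger; the trigger names and trimmed-down schema here are mine:

```python
import sqlite3

# Each insert or content edit stamps the row with one more than the
# current highest SyncVersion in the table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Notes (
    NoteID      INTEGER PRIMARY KEY AUTOINCREMENT,
    Title       TEXT,
    SyncVersion INTEGER DEFAULT 0,
    Deleted     INTEGER DEFAULT 0
);

-- Stamp new rows with max(SyncVersion) + 1
CREATE TRIGGER Notes_VersionInsert AFTER INSERT ON Notes
BEGIN
    UPDATE Notes
    SET SyncVersion = (SELECT COALESCE(MAX(SyncVersion), 0) + 1 FROM Notes)
    WHERE NoteID = NEW.NoteID;
END;

-- Re-stamp rows when their content is edited. Scoping the trigger to the
-- Title column means the triggers' own SyncVersion updates can't re-fire it.
CREATE TRIGGER Notes_VersionUpdate AFTER UPDATE OF Title ON Notes
BEGIN
    UPDATE Notes
    SET SyncVersion = (SELECT COALESCE(MAX(SyncVersion), 0) + 1 FROM Notes)
    WHERE NoteID = NEW.NoteID;
END;
""")

conn.execute("INSERT INTO Notes (Title) VALUES ('first')")   # SyncVersion 1
conn.execute("INSERT INTO Notes (Title) VALUES ('second')")  # SyncVersion 2
conn.execute("UPDATE Notes SET Title = 'edited' WHERE NoteID = 1")
rows = conn.execute(
    "SELECT NoteID, SyncVersion FROM Notes ORDER BY NoteID").fetchall()
print(rows)
```

After the edit, the first row jumps to SyncVersion 3 while the second stays at 2, which is exactly the behavior described above: every write moves the affected record to the head of the version sequence.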
Now that I have a simple way of tracking what has changed at the database level, I can work on actually saving notes I've created in my UWP app to the master database. On the server side, I simply take a new Note, save it to the database, and send it back to the client with its NoteID now assigned. In the client code, I grab all Notes that have a NoteID of 0 and send them to the server. When the server sends a record back, I resave it with the NoteID set.
public async Task SaveNotes()
{
    using (ApplicationDbContext db = new ApplicationDbContext())
    {
        // Records with NoteID == 0 have never been pushed to the server
        List<Note> records = db.Notes.Where(r => r.NoteID == 0 && r.Deleted == false).ToList();
        foreach (var rec in records)
        {
            string endpoint = url + "SaveNotes";
            var content = new StringContent(JsonConvert.SerializeObject(new { record = rec }), Encoding.UTF8, "application/json");
            var result = await client.PostAsync(endpoint, content);
            var objectData = await result.Content.ReadAsStringAsync();
            var obj = JsonConvert.DeserializeObject<Note>(objectData);

            // Resave the local record with the server-assigned id
            rec.NoteID = obj.NoteID;
            db.Entry(rec).State = Microsoft.EntityFrameworkCore.EntityState.Modified;
        }
        db.SaveChanges();
    }
}
public ActionResult SaveNotes(Note record)
{
    db.Notes.Add(record);
    db.SaveChanges();
    // Send the record back so the client can pick up its server-assigned NoteID
    return Json(record);
}
Well cool, we now have a method of sending saved notes from our UWP app to our Azure server, but we're still missing one big piece: how do I get all the notes other people have saved into the local database on every device? This is where the magical SyncVersion field really brings it all together. The biggest SyncVersion value in our local SQL table tells us precisely what version the table was at the last time we synced. So if I ask the server for records with a SyncVersion greater than my biggest, I get back only the records that have changed since my last sync.
public async Task<List<JObject>> GetSyncData2(string entity, int since)
{
    string endpoint = url + entity + "?since=" + since;
    var content = new StringContent("", Encoding.UTF8, "application/json");
    var result = await client.PostAsync(endpoint, content);
    var objectData = await result.Content.ReadAsStringAsync();
    return JsonConvert.DeserializeObject<List<JObject>>(objectData);
}

public async Task SyncNotes()
{
    using (var db = new ApplicationDbContext())
    {
        // The highest SyncVersion we hold locally marks where we left off
        int maxVersion = db.Notes.Any() ? db.Notes.Max(n => n.SyncVersion) : 0;
        var data = await GetSyncData2("Note", maxVersion);
        foreach (var d in data)
        {
            long id = (long)d["NoteID"];
            var incoming = d.ToObject<Note>();
            var existing = db.Notes.FirstOrDefault(c => c.NoteID == id);
            if (existing != null)
            {
                // Keep the local primary key, overwrite everything else
                incoming.LocalID = existing.LocalID;
                db.Entry(existing).CurrentValues.SetValues(incoming);
            }
            else
            {
                db.Notes.Add(incoming);
            }
        }
        db.SaveChanges();
    }
}
public ActionResult Note(int since)
{
    // Return every record changed since the client's last sync,
    // including soft-deleted ones
    return Json(db.Notes.Where(l => l.SyncVersion > since));
}
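Putting the pull side together, the whole round trip can be sketched with two in-memory SQLite databases standing in for the Azure server and one client. This is a hypothetical illustration, not the app's code: the table contents are invented, and a direct query stands in for the HTTP call:

```python
import sqlite3

# "Server" with three notes at table versions 1..3; note 3 is soft-deleted
server = sqlite3.connect(":memory:")
server.execute("CREATE TABLE Notes (NoteID INTEGER PRIMARY KEY, "
               "Title TEXT, SyncVersion INTEGER, Deleted INTEGER)")
server.executemany("INSERT INTO Notes VALUES (?,?,?,?)",
                   [(1, "old", 1, 0), (2, "newer", 2, 0), (3, "newest", 3, 1)])

# "Client" that last synced at table version 1
client = sqlite3.connect(":memory:")
client.execute("""CREATE TABLE Notes (
    LocalID INTEGER PRIMARY KEY AUTOINCREMENT,
    NoteID INTEGER, Title TEXT, SyncVersion INTEGER, Deleted INTEGER)""")
client.execute("INSERT INTO Notes (NoteID, Title, SyncVersion, Deleted) "
               "VALUES (1, 'old', 1, 0)")

# Stand-in for the HTTP call: everything with SyncVersion > our local max
since = client.execute(
    "SELECT COALESCE(MAX(SyncVersion), 0) FROM Notes").fetchone()[0]
delta = server.execute(
    "SELECT NoteID, Title, SyncVersion, Deleted FROM Notes "
    "WHERE SyncVersion > ?", (since,)).fetchall()

for note_id, title, version, deleted in delta:
    # Upsert keyed on the master NoteID, never the local primary key
    existing = client.execute(
        "SELECT LocalID FROM Notes WHERE NoteID = ?", (note_id,)).fetchone()
    if existing:
        client.execute(
            "UPDATE Notes SET Title=?, SyncVersion=?, Deleted=? WHERE NoteID=?",
            (title, version, deleted, note_id))
    else:
        client.execute(
            "INSERT INTO Notes (NoteID, Title, SyncVersion, Deleted) "
            "VALUES (?,?,?,?)", (note_id, title, version, deleted))

merged = client.execute(
    "SELECT NoteID, Title, Deleted FROM Notes WHERE Deleted = 0 "
    "ORDER BY NoteID").fetchall()
print(merged)
```

Only the two changed records come down the wire, and the soft-deleted note arrives too but stays hidden from the local queries, so a deletion on one device propagates to every other device on its next sync.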
So we've now covered all the bases we need for a system that syncs notes across all devices by leveraging a master database server hosted on Azure. This is a hastily thrown-together toy example of what we are doing at a much larger and more robust scale in our app for training miners, but it should give you a strong idea of what's possible. In our real app, we've added syncing for files, as well as a lot more features to make the syncing process faster and more robust.