Using Cordova

I had only created one Android app before and that was done in Java as a learning exercise. It was a great learning exercise but a poor app.

Having created the Munro Bagger website with other students from CodeClan, I wondered how easy it would be to create an app with the same functionality. For those not familiar with the term, a “Munro” is a mountain in Scotland with a height of 3000ft or more above sea level. The Munro Bagger website allows the user to review the weather on all 282 Munros to help them decide which peak to target and when. When our group was creating the website we felt that we needed a good solution for people on the move, so an app seemed like a good idea.

Cordova seemed like a simple solution because it provides a wrapper for JavaScript which allows it to run on Android, iOS or Windows phones, providing you have the development environment for the target (or use PhoneGap). It seemed too good to be true so it had to be done. I have read criticisms of the technology from people who say that Cordova apps don’t look or behave like native apps. Well, take a look at the screenshots in this post. Does it look like an Android app or not? It even behaves like an Android app; you can go to the Play Store, download it and find out for yourself. I have come across a few things that don’t work quite the way I would like, but nothing that really gets in the way of the user experience.

If you follow the link to the Munro Bagger website, you will see that it has a very different look and feel from the app. I changed the look and feel by using React-MDL, which was the subject of an earlier post. It is React-MDL that gives the app its native look and feel; Cordova just wraps the JavaScript and provides plugins to allow the features of the phone to be accessed.

When I started using Cordova, I ordered a couple of books by John M. Wargo: “Apache Cordova 4 Programming” and the companion book “Apache Cordova API Cookbook”. They are good, useful books but, in the year and a bit since publication, Cordova has moved on to version 6 and the Cordova online documentation is pretty good. Nevertheless, the “API Cookbook” is still worth looking at for its example code.

When I first brought React-MDL and Cordova together everything looked great, but as soon as I used Dialogs, everything went wrong. On older phones, Android’s built-in browser just can’t render Dialogs. I explored using polyfills but eventually found Crosswalk, and Crosswalk is awesome!

Crosswalk for Cordova embeds a recent Chromium browser engine in your Cordova app and supports every version of Android back to version 4.0. As soon as I added Crosswalk to the project, Dialogs started to work, which is all I was after, but the debugging environment was a revelation. You can connect Chrome to your Crosswalk-enabled app, whether running on a phone or an emulator, and interact with it as if it were a normal JavaScript web app. Amazing stuff.

A Cordova/Crosswalk app is fairly large, certainly much larger than a similar native app would be but the simplicity of migration from Web to app is impressive; it only took a few weeks to replace the UI of the Munro Bagger website with a React-MDL/Cordova/Crosswalk UI and that included a lot of exploration and problem solving. The total size of the Munro Bagger Android app is 27MBytes which is approximately half the 50MByte maximum size permitted by Google.

One other area of app development caused some head scratching and that was the handling of network outages – an app is bound to go offline sometime – but that’s a subject for another day.

Node.js and localStorage

So you’re unit testing your JavaScript using Node, you make calls to window.localStorage, but Node has no equivalent. How do you satisfy the dependency?

I tried using node-localStorage and it worked, but as soon as I used webpack to create a bundle I ran into problems. In my application I make very little use of localStorage so I decided to throw out node-localStorage and create my own implementation:

var localStorage = {
  _storage: {},

  // Return null for unknown keys, as window.localStorage does
  getItem: function(sKey) {
    if (!sKey || !(sKey in this._storage)) return null;
    return this._storage[sKey];
  },

  // window.localStorage stores everything as strings, so coerce the value
  setItem: function(sKey, sValue) {
    if (!sKey) return;
    this._storage[sKey] = String(sValue);
  },

  removeItem: function(sKey) {
    if (!sKey) return;
    delete this._storage[sKey];
  },

  clear: function() {
    this._storage = {};
  }
};

Hardly industrial strength but it only took a few minutes and it gets me through the tests.
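In a Node test run, the shim can simply be assigned to a global so that code referring to localStorage finds it. A minimal sketch (the key names are just illustrative):

```javascript
// The shim, assigned where code under test expects to find it.
// (In a browser, window.localStorage would be used instead.)
var localStorage = {
  _storage: {},
  getItem: function(sKey) {
    if (!sKey || !(sKey in this._storage)) return null;
    return this._storage[sKey];
  },
  setItem: function(sKey, sValue) {
    if (!sKey) return;
    this._storage[sKey] = String(sValue);
  },
  removeItem: function(sKey) {
    if (!sKey) return;
    delete this._storage[sKey];
  },
  clear: function() { this._storage = {}; }
};

// Make it visible as a global, the way browser code sees it
global.localStorage = localStorage;

// Round-trip a value the way application code under test might
localStorage.setItem("lastPeak", "Ben Nevis");
console.log(localStorage.getItem("lastPeak")); // "Ben Nevis"
localStorage.removeItem("lastPeak");
console.log(localStorage.getItem("lastPeak")); // null
```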

TDD or not TDD

One of the obvious disadvantages of a short, but very intensive and wide-ranging, training course like CodeClan’s is that you don’t get much time to explore topics in detail. TDD is introduced and used, but the problems of access to external data are not discussed in any depth.

You can only get so far using Mocha to test the logic of your JavaScript before you need to access a database and, in all probability, that database will require a network connection. You could start at the bottom of the stack, testing each layer before adding the layer above but it becomes increasingly hard to force specific events to occur. This is integration testing rather than unit testing.

So what do you do when you’re short of time and your test framework doesn’t provide the support you need to isolate the units in your code? You can either take the time out to determine how to implement TDD or plough on using a combination of integration and ad-hoc tests. With the former approach you will have more confidence in the code you have written but you risk running out of hours to do the job, while the latter approach may result in a fully working solution that contains hidden technical debt.

When I left CodeClan, I gave myself a few weeks to implement a Cordova app based on code produced during my time there. As you might guess, a lack of time and appreciation of the available tools meant that the original code made poor use of TDD, and I was determined to have a working app. There was no time to go back and do things properly so I decided to plough on. The app was completed on time and works well, but I was worried. I had learned a lot but I had skipped over a very important subject, so I decided to go back and write the tests.

While not the TDD way, writing the tests after the fact is proving to be an interesting experience. I have often heard people say that TDD causes you to write code differently and that becomes obvious very quickly when you try and work out how to disconnect units. I like to think that I have chosen my classes well and followed SOLID principles so the coupling between units is low enough but it took me a while to find the right tools for the job.

“Dependency Injection” and “Inversion of Control” are two very grand titles for the concepts that underpin unit testing, and they are explored in great depth in too many places for me to mention. However, in practice all this means is that you have to be able to give the unit under test the tools it needs to access data from the test environment. In this way you can replace a database (or anything else) by supplying a new data-store that uses an approach of your choice.

I chose the Sinon JS tool to support my testing because it is well recommended and widely deployed. After a ton of reading I decided I needed to use a stub. Then I had to work out where to put it; somehow I needed to stub out the call to the network without changing anything else. Here’s a bit of code that calls a remote database to get a list of mountains.

Mountains.prototype.fetchMountains = function(onCompleted) {
  const url = baseURL + "munros";
  const apiRequest = new ApiRequest();
  apiRequest.get(url, function(rxMountains) {
    this._mountains = rxMountains;
    this._saveToStore(rxMountains);
    onCompleted(rxMountains);
  }.bind(this))
}

After some head-scratching it became obvious that I needed to stub out the apiRequest.get() without losing the code in the callback so this is how I reorganised the code. The function _fetch() can now be stubbed out without losing any functionality.

Mountains.prototype._fetch = function(resource, onCompleted) {
  const url = baseURL + resource;
  const apiRequest = new ApiRequest();
  apiRequest.get(url, function(rxResource) {
      onCompleted(rxResource);
  });
}

Mountains.prototype.fetchMountains = function(onCompleted) {
  this._fetch("munros", function(rxMountains) {
    this._mountains = rxMountains;
    this._saveToStore(rxMountains);
    onCompleted(rxMountains);
  }.bind(this)) 
}

So now the Dependency Injection is done in the test like this:

// Replace _fetch() with a stub
let stub = sinon.stub(mountains, "_fetch");
// When called with arg "munros" return dummy data stub_munros
stub.withArgs("munros").yields(stub_munros);
// Call the method to retrieve the dummy data
mountains.all(function() {});
// Remove the stub before the asserts in case something fails
mountains._fetch.restore();
// Check the stub was called
assert.strictEqual(stub.callCount, 1);
assert.strictEqual(mountains.updateInterval, 0);

Clearly this is not a generic DI mechanism but it will work very well for tests. I wish I had known about this a long time ago! It took a while to work out what I needed to do but using Sinon spies and stubs has been painless.
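If you’d rather see the mechanism without Sinon, the same injection can be hand-rolled. Below is a trimmed-down sketch of the Mountains class from above (the store-saving step is omitted), with _fetch replaced by a home-made stub that records its calls and yields dummy data:

```javascript
// A trimmed-down sketch of the Mountains class described above
function Mountains() {
  this._mountains = null;
}

// In the real code this issues an HTTP request; here it is the seam
// that the test will replace
Mountains.prototype._fetch = function(resource, onCompleted) {
  throw new Error("network access not available in unit tests");
};

Mountains.prototype.fetchMountains = function(onCompleted) {
  this._fetch("munros", function(rxMountains) {
    this._mountains = rxMountains;
    onCompleted(rxMountains);
  }.bind(this));
};

// --- test code ---
var mountains = new Mountains();
var dummyMunros = [{ name: "Ben Nevis" }];
var callCount = 0;

// Hand-rolled stub: replace _fetch on the instance, record the call
// and yield the dummy data, just as stub.withArgs(...).yields(...) does
mountains._fetch = function(resource, onCompleted) {
  callCount += 1;
  if (resource === "munros") onCompleted(dummyMunros);
};

mountains.fetchMountains(function(result) {
  console.log(result.length, callCount); // 1 1
});
```

Sinon does the same thing with less ceremony and restores the original method for you, but the hand-rolled version makes it obvious why _fetch had to be split out in the first place.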

There are some good examples on the Enterprise JS Blog. I hope you find this useful.

Releasing Your Cordova App

Having tested my first Cordova app to death, it was time to get it onto the Google Play store for Beta testing. You can’t use a debug build on the Play store and you can’t use an unsigned release build. Everyone has to sign their release builds, so there must be an authoritative description of the process somewhere, right? That was my expectation, but the Cordova documentation doesn’t even mention the step. Perhaps that’s not surprising; the signing process is really an Android thing, but a few pointers would have made things easier.

The best description of the process that I could find came from an Ionic web page. It’s a five-step process:

  1. Remove the console plugin
  2. Build the app for release
  3. Use keytool (which comes with the JDK) to generate a key to sign the APK file
  4. Use jarsigner (another JDK tool) to sign the APK file and
  5. Run zipalign to optimise the APK

I won’t go through the details here as the Ionic website does that very well, but I will underline the importance of the key: each version of the app must use the same key so, if you lose the keystore, you won’t be able to submit updates to your app.

Update 15-Feb-2017: It seems that there is some mention of signing the APK in the Cordova documentation. It’s not clear, to me at least, whether the Cordova build runs both steps 4 and 5 or only step 4. Further investigation is required but the Ionic web page provides more background information.

Mobile First?

When three of us created Munro Bagger back in our days at CodeClan, we didn’t give mobile clients or responsive design much thought. We knew there was a high probability that our users would use mobile devices because of the subject matter of the website, but we also knew that we had enough on our plate worrying about React, Rails, weather forecasts and user authentication.

By the end of the project the site was responsive and usable on a phone, but it doesn’t give the same user experience as an app. That got me wondering if it was possible to create a website that had the look and feel of an app, so I went looking for React tools that would allow me to dress up the website to look more like an app. I found a few and I tried a few, but it is a confusing landscape and documentation often seems to assume a high level of expertise, which impedes progress.

In the end I decided that Material Design was the way to go; after all, many Android apps had already been implemented using that style (or “design language”), so a website implemented using Material Design was halfway there. I needed a React implementation so I tried Material UI, which was very well reviewed, but I hit problems that I could not fix, so I went back to the drawing board and found MDL (Material Design Lite). For a while I contemplated wrapping the CSS and JavaScript provided by MDL in my own React components but it looked a bit tricky. Luckily, when digging around a little more, I found that someone had already done just that. I fell in love with React-MDL.

After I had installed React-MDL, I was able to make very rapid progress and within a few days the website looked a lot like an app. When things didn’t look quite right I was able to use the element inspector to find which CSS classes had to change. I found it very transparent and easy to use. I recommend it.

I stopped converting the website to MDL at the point it was going to need some major surgery and decided to start work on a Cordova app but that’s another story. The lesson from this post is that, with hindsight (and fewer time constraints), it would have been better for us to consider the needs of the mobile user first. React-MDL would have made many things easier.

Rails and Emails

Having just installed our Rails server on Heroku, we were looking for a way to send out emails to confirm email addresses. The line of least resistance seemed to be Gmail; everyone loves Gmail and there were quite a few articles on the subject. So we set it up and it worked… for a while.

It seems that Google have increased the security of the mail system and the option to use less secure apps no longer allows our Rails server to send emails. Instead Gmail warns that an unknown device in Virginia (i.e. a Heroku server) is trying to send email using the account and has been blocked. Even saying that the device belongs to me (which it does… kind of) didn’t improve matters. A different approach was required.

Rails had been configured to use SMTP to send email via Gmail so it made sense to adopt another email service that supported the same approach. Mailgun is offered by Heroku and seemed to fit the bill.

As soon as it’s installed using Heroku, a bunch of Mailgun environment variables are set up and a “Sandbox” email domain is created to get you up and running. The Gmail configuration was replaced by this:

config.action_mailer.smtp_settings = {
  :port => ENV['MAILGUN_SMTP_PORT'],
  :address => ENV['MAILGUN_SMTP_SERVER'],
  :user_name => ENV['MAILGUN_SMTP_LOGIN'],
  :password => ENV['MAILGUN_SMTP_PASSWORD'],
  :domain => 'yourdomain.heroku.com',
  :authentication => :plain
}

To get this to work you have to specify authorised recipients (i.e. email addresses) via the Mailgun console (which is accessed via the Heroku Overview page). Up to five authorised recipients are allowed, which is enough to demonstrate that everything is set up okay but nothing more. To get it working fully, you have to have your own domain.

Mailgun suggest that you set up a subdomain for sending emails. However, Mailgun can forward incoming emails to mailboxes of your choice (it does not offer IMAP or POP3 access) as well as send outgoing emails, so it may be easier for you to use the top-level domain. It certainly looks that way to me.

Once you have entered your email domain in the Mailgun console, you can see the settings that need to be entered on your DNS server. Once the DNS settings have propagated, you take the username and password that Mailgun has created for your new email domain and update the environment variables MAILGUN_SMTP_LOGIN and MAILGUN_SMTP_PASSWORD. Then everything should work.

Having done this, I’m not sure why I contemplated using Gmail in the first place. You live and learn.

Rails, Devise and APIs

As a newcomer to Rails, and having done a fair amount of background reading, it seemed to make sense to use Devise to secure access to the API that I was creating. However, in the end I gave up: I could not get the authentication to work reliably and I could not stop Devise from sending HTML out in response to a JSON request. I know that better men than I would have fixed the problem but, as I dug into the issues, I got the feeling that I was going to spend more time removing functionality than I would spend writing what I needed from scratch.

Devise does a lot but, to my mind, a lightweight API doesn’t need much. However, there isn’t much out there to tell you what you need to do. While I was digging around, I came across APIs on Rails, which gave me the confidence to remove Devise and start again. I started to work through the examples in the book and had problems with a few of them, but there is a GitHub repository where the issues are discussed and solutions posted. It was not the simplest of processes but it was much more transparent than using Devise.

When it came to Chapter 5, entitled “Authenticating Users”, I opted to use JSON Web Tokens (JWTs) because they can carry information in much the same manner as a cookie, which would, I believed, simplify the implementation server-side. Support for JWTs was added to the project via a Ruby gem which proved to be well documented and easy to use.

In simple terms the approach is this:

  1. When a client starts a session on the server, they provide their credentials and, in return, are provided a JWT which contains information to identify the client.
  2. The client includes the JWT in the authorization header of every HTTP request sent to the server.
  3. The server opens the JWT, checks the content and allows the request to proceed if everything checks out. If it doesn’t check out, a 401 (Unauthorized) error is returned immediately.

Much simpler than struggling with Devise! Once I got my head around it, it was easy enough to implement. If you are interested, you can take a look at the code on GitHub.

Rails and Devise are very powerful but so much happens under the hood that they are a minefield for new users who are trying to do something a little out of the ordinary.


Mountain Weather

One of the many projects that I worked on during my time at CodeClan was a group project to create a website showing all the Munros that would be sunny today, tomorrow and the day after. The idea was that most people would want to avoid bad weather on the hills and would gravitate to sunny weather if they could easily work out which Munros would have a share of the sunshine. For those unfamiliar with the term, a Munro is a mountain in Scotland that is over 3000ft tall. I’ve attached a screenshot to give you some idea of what I’m talking about.

The implementation uses JavaScript/React/Rails and has gone through many revisions but is now hosted by Heroku and you can find it at this address: http://www.munrobagger.scot.


The first version of the website used the OpenWeatherMap API to collect forecasts for the Munros. The API allows forecasts to be retrieved for a specified latitude and longitude, which seemed ideal because each Munro has a known latitude and longitude. However, analysis revealed that there were only 26 forecasts to cover a grand total of 282 Munros, and the forecasts were all for low altitude so unlikely to be of any real value.

The current version of the website uses the UK Met Office API. This API does not allow you to request a forecast by latitude and longitude. Instead, the Met Office use their own location ids to identify the forecasts for around 5000 separate locations within the UK. The first challenge is to find which Met Office locations are closest to the Munros. Thankfully the Met Office API provides a site list which contains the lat. and long. of each location, so it is possible to work out which of the 5000 Met Office locations relates to which Munro and therefore which location id to use to fetch the Munro’s forecast.

To get the list of sites I used this API call (API key omitted):
http://datapoint.metoffice.gov.uk/public/data/val/wxfcs/all/json/sitelist

Once I had the list of sites, I used the Haversine formula to calculate the distance between a mountain and a weather site. Here’s a code snippet to do that (I love the way that JavaScript handles the Greek characters):

Mountain.prototype.calculateDistanceTo = function( latLng ) {

  // Haversine formula:
  // a = sin²(Δφ/2) + cos φ1 ⋅ cos φ2 ⋅ sin²(Δλ/2)
  // c = 2 ⋅ atan2( √a, √(1−a) )
  // d = R ⋅ c
  // where φ is latitude, λ is longitude, R is earth’s radius
  // note that angles need to be in radians to pass to trig functions!

  function toRadians(x) {
    return x * Math.PI / 180;
  }

  var R = 6371e3; // in metres
  var φ1 = toRadians(this._latLng.lat);
  var φ2 = toRadians(latLng.lat);
  var Δφ = toRadians(latLng.lat - this._latLng.lat);
  var Δλ = toRadians(latLng.lng - this._latLng.lng);

  var a = Math.sin(Δφ/2) * Math.sin(Δφ/2) +
    Math.cos(φ1) * Math.cos(φ2) *
    Math.sin(Δλ/2) * Math.sin(Δλ/2);
  var c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1-a));
  var d = R * c;

  return d;
}

When I ran the code I found that the nearest weather site was always right on top of the mountain so there was never any doubt about which location id to use. Exactly what was needed! No other weather service provides this level of coverage for Scottish peaks.

To give an example: Ben Nevis (the highest Munro of them all) has a latitude of 56.796849 and a longitude of -5.003525. Using the site list retrieved from the Met Office and the Haversine formula we find that the nearest location id is 350377.  A five-day forecast for Ben Nevis is retrieved using a request like this (where the API key and forecast type have been omitted): http://datapoint.metoffice.gov.uk/public/data/val/wxfcs/all/json/350377
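As a standalone sketch of the matching process, here is the same Haversine calculation in free-function form, used to pick the nearer of two sites to Ben Nevis. Only Ben Nevis’s position comes from the text above; the site coordinates are made up for illustration:

```javascript
// Standalone version of the Haversine calculation shown earlier
function haversineDistance(a, b) {
  const toRadians = (x) => x * Math.PI / 180;
  const R = 6371e3; // earth's radius in metres
  const φ1 = toRadians(a.lat), φ2 = toRadians(b.lat);
  const Δφ = toRadians(b.lat - a.lat);
  const Δλ = toRadians(b.lng - a.lng);
  const h = Math.sin(Δφ / 2) ** 2 +
    Math.cos(φ1) * Math.cos(φ2) * Math.sin(Δλ / 2) ** 2;
  return R * 2 * Math.atan2(Math.sqrt(h), Math.sqrt(1 - h));
}

// Pick whichever site is closest to the mountain
function nearestSite(mountain, sites) {
  return sites.reduce((best, site) =>
    haversineDistance(mountain, site) < haversineDistance(mountain, best)
      ? site : best);
}

const benNevis = { lat: 56.796849, lng: -5.003525 };
// Hypothetical site coordinates, for illustration only
const sites = [
  { id: "A", lat: 56.79, lng: -5.00 },  // about a kilometre away
  { id: "B", lat: 55.95, lng: -3.19 },  // roughly Edinburgh
];
console.log(nearestSite(benNevis, sites).id); // "A"
```

In the real code the same search runs over all 5000 Met Office sites for each of the 282 Munros, which is still only about 1.4 million distance calculations.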

To have access to the API free of charge, utilisation must stay within the fair use limits which are:

  • No more than 5000 data requests per day and
  • No more than 100 data requests per minute

There are 282 forecasts which are updated at most once per hour, so 6768 requests per day would be needed to stay in sync with the Met Office. This would break the fair use limits, so we need to reduce the frequency of update, and we need the server, rather than the client, to get the forecasts; multiple clients using the same API key would break the fair use limit very quickly. In practice the Rails server updates the forecasts once every two hours and, while updating, requests a new forecast once a second to ensure that it only ever makes 60 requests per minute.
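The fair-use arithmetic can be checked with a little scheduling code. This sketch spaces 282 requests one second apart and confirms that no 60-second window ever holds more than 60 of them:

```javascript
// Offsets (in seconds) at which each forecast request is issued
function requestSchedule(count, secondsBetween) {
  const offsets = [];
  for (let i = 0; i < count; i++) offsets.push(i * secondsBetween);
  return offsets;
}

// Largest number of requests falling in any 60-second window
function maxPerMinute(offsets) {
  let max = 0;
  for (const start of offsets) {
    const inWindow = offsets.filter((t) => t >= start && t < start + 60).length;
    if (inWindow > max) max = inWindow;
  }
  return max;
}

const offsets = requestSchedule(282, 1);
console.log(maxPerMinute(offsets)); // 60 – comfortably under the 100 limit
console.log(offsets[offsets.length - 1]); // 281 – a full update takes under 5 minutes
```

Twelve two-hourly updates a day make 12 × 282 = 3384 requests, comfortably inside the 5000-per-day limit as well.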

The Met Office API is a little tricky to use when compared with something like OpenWeatherMap but for an application like this you can’t beat it.

A New Beginning

Having worked as a software development manager in the south-east of England for more years than I care to remember, I decided to return to my native Edinburgh and to get back into full-time, hands-on development.

To get back into the swing of things, I decided to join the CodeClan Digital Skills Academy. I cannot praise it highly enough; the course is a short four months but the opportunity for learning is exceptional. I graduated in Computer Science from Edinburgh University back in the days of Kernighan, Ritchie and the C programming language, so the move to using (rather than reading about) OO, TDD, Ruby, Rails, JavaScript, React and the like made the course a real challenge.

So this blog is going to look at the ideas, challenges and discoveries that I bump into along the way from this new beginning. I hope you find it all as interesting as I do.