Under the hood of my ‘Expose Yourself’ demo

I ran a live demo at Evening Of and WebDevCon16. The aim of this post is to cover the technology used to make it work.

Megan and I were asked to speak at WebDevCon16 at Solent University. The conference was aimed at bridging the gap between the University and the local digital scene.

As part of our talk, we touched on how to stay up-to-date with the latest technology. The industry is a moving target and learning how to keep up with it is vital. On this point, I recommended playing and experimenting with new tech regularly. It doesn’t have to be production-ready; a small demo or test to see how it works will do.


I thought I should practice what I preach, so I ran a live demo. This post covers the technology I used to make it work (I won’t go into what the demo itself was, sorry). I ran the demo again at the Evening Of event too.

Getting technical

To get me started, I reused the code from my wHack-a-mole game, which I made at Hacksoton a while back. It’s a Node.js server that uses Express to handle simple routing.

I then plugged one of my favourite technologies into it… websockets!

WebSockets is an advanced technology that makes it possible to open an interactive communication session between the user’s browser and a server. With this API, you can send messages to a server and receive event-driven responses without having to poll the server for a reply.

This allows me to do two-way communication between connected devices and the server. To save time, I used Socket.IO, which plugs into Express really easily.

The rest was just writing clients to send and receive data to/from the server. I made a really basic web app (I literally just hid/showed divs) which people used on their phones. Then a simple admin view, which I could access on my own phone to trigger certain actions. And, saving the best till last, I was able to build a presentational view directly into my slides, because my slideshow is just HTML, CSS and JS, built with reveal.js (p.s. it rocks!).

Browser APIs

During the demo, I used a bunch of really cool browser features. Here’s a quick write up on each one and roughly how I used them.

Battery

Believe it or not, you can actually read somebody’s battery information using nothing but JavaScript, via the Battery Status API.

It only works in Chrome, Opera and Firefox as far as I’m aware, but when it does work you can get all sorts of data. For example:

  • Current battery level
  • Whether you are charging or not
  • How long left until full charge
  • How long until battery is drained

These can have some really interesting use cases on the web. You could serve a low-contrast mode to low-battery (non-charging) devices to reduce their battery drain (and so keep them on your site longer?). Or you could turn off any fancy animations or canvas elements to save battery.
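That decision boils down to a tiny bit of logic. Here’s a sketch (the function name and the 20% threshold are my own inventions, not part of the API; `level` is the 0–1 value the Battery Status API gives you):

```javascript
// Hypothetical helper: should we switch to a low-power presentation?
// `level` is 0-1 as reported by the Battery Status API.
function shouldUseLowPowerMode(level, charging) {
    // Plugged-in devices don't need saving
    if (charging) {
        return false;
    }
    // Arbitrary cut-off: below 20% we drop the fancy stuff
    return level < 0.2;
}

console.log(shouldUseLowPowerMode(0.15, false)); // true
console.log(shouldUseLowPowerMode(0.15, true));  // false
```

You’d then use the result to toggle a CSS class or skip starting your canvas animations.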

I only used the current level and whether or not it was charging. Here’s the code I used to do it:

// Check for support
if ('getBattery' in navigator) {
    navigator.getBattery().then(function(battery) {
        // Send the battery data back to the server via websockets
        _this.socket.emit('battery', {
            level: battery.level,
            charging: battery.charging
        });
    });
}

Vibration

You can also physically move somebody’s device on the web. You can tell their phone to vibrate. This. Is. Really. Wicked.

The support for this is limited (Chrome & Firefox) at the mo, but it can still be used in really interesting ways, either as instant feedback or to enhance an animation (e.g. a rocket landing?).

The Vibration API is really simple:

// From http://www.sitepoint.com/use-html5-vibration-api/
navigator.vibrate = navigator.vibrate || navigator.webkitVibrate || navigator.mozVibrate || navigator.msVibrate || false;

// Check for support
if (navigator.vibrate) {
    navigator.vibrate([400, 100, 400, 100, 400, 100, 400]);
}

Audio

Unfortunately you cannot play audio at will on mobile devices: playback first needs to be triggered by a user interaction, such as tapping a button. Once the user has interacted, you’re free to play that audio file as much as you want.

So, to get around this, I added a ‘Join’ button as the first thing the user does. When tapped, I play the audio file, pause it immediately and rewind it. This bootstraps it silently, so I can then play it at will.


Rather than having an audio element on the DOM, I just used new Audio().

var chime = new Audio('/audio/chime.wav');

// Bootstrap audio file
$('.join').click(function(event) {
    event.preventDefault();

    chime.play();
    chime.pause();
    chime.currentTime = 0;
});

// ...at some point in the future
chime.play();

Device Orientation

The last piece of the puzzle is detecting the device’s orientation. Actually getting the data is pretty easy, doing something with it is the hard part.

To save time, I used the Orientation example from MDN as a starting point as it was roughly doing what I wanted.

The only problem was that the example wasn’t very performant, so I wrapped the handler in requestAnimationFrame():

You should call this method whenever you’re ready to update your animation onscreen. This will request that your animation function be called before the browser performs the next repaint. The number of callbacks is usually 60 times per second, but will generally match the display refresh rate in most web browsers, as per the W3C recommendation.

This is the best way to keep your JS animations at 60 frames per second: let the browser handle the timing for you. Here’s the rough code I used to achieve this:

var waiting = false;

var tick = function(event) {
    // Do the drawing
    handleOrientation(event);

    // End the tick
    waiting = false;
};

function handleOrientation(event) {
    // Similar code to the Orientation Example
}

var deviceorientationEventListener = function(event) {
    // The 'deviceorientation' event could have fired n times before requestAnimationFrame has
    // finished one tick. So we only do work when requestAnimationFrame has finished its previous
    // tick and is ready for another
    if(!waiting) {
        // Start the tick
        waiting = true;

        // Run tick() at ~60fps
        window.requestAnimationFrame(function() {
            tick(event);
        });
    }
};

// Bind the event listener
window.addEventListener('deviceorientation', deviceorientationEventListener, true);

This got the picture moving on the device, but I still needed to send the new position up to the server so I could display it in the slideshow. I didn’t know how many people would be connected, or whether my server could handle x connections all sending updated positions every 16ms. So I throttled how often the drawing position was sent to the server, to once every 200ms.

// Run this closure every 200ms
var emitPosition = setInterval(function() {
    // Send the socket ID and current X/Y position of the drawing to the server
    _this.socket.emit('drawing-position', {
        id: _this.socket.id,
        x: drawing.data('x'),
        y: drawing.data('y'),
    });
}, 200);

Being able to detect orientation opens the door to some really interesting UI possibilities. You could have a parallax background that moves with your device, to give a 3D effect. Or build custom gestures into your web app; for example: twist the device left/right to move between slides, or tilt it up to scroll to the top of the page.
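As a rough sketch of the twist idea (the function name and 30° threshold are mine, not from any API): the `gamma` value on a ‘deviceorientation’ event is the left-to-right tilt in degrees, so a simple gesture detector just compares it against a threshold.

```javascript
// Hypothetical twist-gesture detector. `gamma` is the left-to-right
// tilt in degrees (-90 to 90) from a 'deviceorientation' event.
function detectTwist(gamma, threshold) {
    threshold = threshold || 30;
    if (gamma <= -threshold) return 'left';
    if (gamma >= threshold) return 'right';
    return null;
}

// In the browser you'd wire it up something like:
// window.addEventListener('deviceorientation', function(event) {
//     var twist = detectTwist(event.gamma, 30);
//     if (twist === 'right') { nextSlide(); }
//     if (twist === 'left')  { previousSlide(); }
// });

console.log(detectTwist(45, 30));  // right
console.log(detectTwist(-45, 30)); // left
console.log(detectTwist(5, 30));   // null
```

In practice you’d also want to debounce it (ignore further twists for a moment after one fires), otherwise a single gesture would trigger on every event.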

In Summary

Thanks for all the great feedback on this demo, I’m glad people enjoyed it and I hope it’s inspired you to try something new!
