The day was split into morning sessions, which featured on-stage presentations in a theatre-like environment, and afternoon technical sessions, where delegates would break out into smaller groups to focus on specialised topics. The presentations were ably introduced by Microsoft UK Technical Evangelist Andrew Spooner.
Morning Sessions – Keynote Lectures:
The morning’s first session was a lecture on Open Data from internationally recognised expert Sir Nigel Shadbolt. Sir Nigel was co-lead, along with Sir Tim Berners-Lee, of the UK Government’s project to make many of its data sets available for public use (data.gov.uk).
Sir Nigel and Sir Tim went on to create the Open Data Institute, a non-profit organisation founded in the UK but now with an international presence, to help drive the evolution of open data systems.
Crowd-sourcing of data was a key topic within Sir Nigel’s lecture. He gave a compelling demonstration of crowd-sourced data as a force for good, citing Ushahidi, which was created to collect eyewitness accounts of the intimidation and violence suffered by voters in Kenya’s disputed 2007 presidential election. Ushahidi has since been used as a tool in several crisis situations, including the aftermath of earthquakes in Chile, Haiti and New Zealand.
Sir Nigel also highlighted the contribution made by OpenStreetMap following the Haitian earthquake of January 2010. There were no high resolution maps of the affected area at the time of the earthquake, which seriously affected disaster relief efforts. Rescue workers were able to rapidly crowd-source data using OpenStreetMap and build up the view of the area they needed to effectively co-ordinate their response.
A major take-away for me from this lecture was the concept of “linked data”, an example of which is the ability to access multiple data sets related to a specific location via the location’s postal code. The postal code effectively acts as a URI for these linked data sets. Linking data in this way makes it easier to discover, access, and reuse data in novel and potentially valuable ways. The Glasgow Future Cities Demonstrator referenced towards the end of this blog post provides examples of the benefits this approach can offer.
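The postcode-as-key idea can be sketched in a few lines. This is purely illustrative: the data sets, field names, and postcodes below are all invented, and real linked data systems use full URIs and RDF rather than in-memory dictionaries.

```python
# Hypothetical open data sets, each independently published
# but sharing the same postcode key (all values invented).
schools = {"G1 1AA": ["Riverside Primary"], "G2 2BB": ["Hillview Academy"]}
air_quality = {"G1 1AA": {"no2_ugm3": 28.0}, "G2 2BB": {"no2_ugm3": 41.5}}
transport = {"G1 1AA": {"bus_stops": 4}, "G2 2BB": {"bus_stops": 7}}

def linked_view(postcode):
    """Aggregate every data set that shares this postcode key."""
    return {
        "postcode": postcode,
        "schools": schools.get(postcode, []),
        "air_quality": air_quality.get(postcode),
        "transport": transport.get(postcode),
    }

# One key unlocks several otherwise unrelated data sets
print(linked_view("G1 1AA"))
```

The value lies in the shared key: none of the publishers needed to coordinate with one another, yet their data sets can be combined on demand.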
The Lotus F1 team make extensive use of computer and physical modelling and on-car telemetry, which means there are petabytes of data sloshing around the various systems that Taylor’s team are responsible for.
The team are constantly proposing, modelling and applying changes to the car. If my notes are correct, the rate at which design changes are made equates to 1 change every 7 minutes over the course of a season, though most don’t actually make it onto the car.
I found the parallels between the continuous delivery cycle to which an F1 team works and the context within which a DevOps team operates interesting and instructive:
- Success is predicated on how swiftly value is realised from change.
- Agility enables the right change to be enacted at the right time.
- Appropriate tooling and process mitigates the risk of a change introducing unwanted behaviour.
F1 seems like an amazing, if extremely challenging, place to be an IT professional. Taylor is a fine ambassador for his team and I suspect many of those in the room for this session will be taking an interest in Lotus F1’s race results going forward.
Andrew Spooner invited Or Arbel onto the stage to discuss Arbel’s messaging app, Yo. I’m perhaps not the target audience, but when I first read about the app earlier this year the word that sprang to mind was not “Yo” but “Why?” Arbel said during the interview that his intention was now to extend Yo, exposing an API so it could be used as a notification service for third-party apps.
I can get a better handle on this than on its current primary use case, though I’m not yet completely convinced. I’m not sure what Yo’s API gives you that you can’t already get from Urban Airship et al., though it seems the Miami Dolphins disagree with me.
To UK-based gamers of a certain age Dr David Braben will be forever synonymous with the space-based game Elite, which Dr Braben co-authored with Ian Bell in 1984. Elite quickly acquired legendary status due to its immersive nature and compelling gameplay, and within gaming circles it continues to be spoken of in hushed and reverential tones to this day.
Dr Braben is a founding member of the Raspberry Pi Foundation and CEO of Frontier Developments, and the majority of his session saw him discussing development of Frontier’s new game, and most recent sequel to Elite, Elite Dangerous. While ostensibly a game demo, what we actually got verged on an astronomy lecture, as Dr Braben showcased the game’s realistic modelling of the Milky Way galaxy; players are free to explore any one of the galaxy’s 400-odd billion star systems*, for example.
Elite Dangerous is able to model this astronomical number of star systems (sorry, pun intended) within the constraints of the host machine by continuing the franchise’s tradition of using procedural generation when constructing the in-game galaxy. The results are extremely impressive, both technically and aesthetically.
I was fortunate to snag an access code to the playable demo via a prize draw later in the day and look forward to exploring the game myself, prior to its official release in December.
* 400 billion is in the upper range of the galaxy’s estimated stellar population, in case you’re interested. The lowest estimate is around 100 billion, so the development team certainly can’t be accused of taking the easy option. 😉
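The trick behind procedural generation is that star systems are derived deterministically from a seed rather than stored. The sketch below illustrates the principle only; Elite’s actual generator is vastly more sophisticated, and every field and range here is invented.

```python
import random

def star_system(seed):
    """Deterministically generate a star system from its seed.

    Nothing is stored: the same seed always regenerates the same
    system, which is how a galaxy of hundreds of billions of systems
    can fit within the constraints of a home machine.
    """
    rng = random.Random(seed)  # seeded PRNG -> reproducible output
    return {
        "star_class": rng.choice(["O", "B", "A", "F", "G", "K", "M"]),
        "planets": rng.randint(0, 12),
        "x_ly": rng.uniform(-50_000, 50_000),  # position in light years
        "y_ly": rng.uniform(-50_000, 50_000),
    }

# Regenerating system 42 twice yields identical results -- no storage needed
assert star_system(42) == star_system(42)
print(star_system(42))
```

Because generation is pure function of seed, the game only ever needs to materialise the handful of systems the player is currently near.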
Professor Cox is well known to UK audiences as a science broadcaster and distinguished physicist. Inviting him to speak at the event seems a very smart move on Microsoft’s part, as his presence may well have been the draw that convinced some delegates to attend.
The lecture was a treat for those of us in the audience with an interest in physics, cosmology, engineering, or the history of science, as all four subjects were addressed in a thoroughly engaging and entertaining way. Professor Cox is a talented public speaker and gifted storyteller; his ability to take the audience on a journey is key to making the sometimes complex source material he deals with so widely accessible. For me the lecture formed a nice companion piece to Professor Cox’s recent TV work, featuring elements from his “Wonders…”, “Human Universe” and “Science Britannica” series.
The recreation of Galileo’s experiment in a NASA vacuum chamber from “Human Universe” was a particular highlight. I love the reaction from the scientists in the room, all of whom obviously knew the expected result but had the look of awestruck children when confronted with the reality.
If you’re a science geek, or have little science geeks in your house, and you have a chance to hear Professor Cox speak in person I strongly recommend the experience.
|Professor Brian Cox (left) and graph.|
During the lunch break I happened to walk past the halls in which the Appsworld mobile exhibition was being held. I launched my beacon scanner app and got the following hits.
| Location | UUID | Maj./Mnr. | RSSI | Proximity | Power | Timestamp |
|---|---|---|---|---|---|---|
| 51.509708, 0.0246251 | e2c56db5-dffb-48d2-b060-d0f5a71096e0 | 9-1 | -99 | Far | -59 | Nov 12, 2014 13:07:27.024 |
| 51.509708, 0.0246251 | e2c56db5-dffb-48d2-b060-d0f5a71096e0 | 6-4 | -96 | Far | -59 | Nov 12, 2014 13:07:28.146 |
| Unavailable | e2c56db5-dffb-48d2-b060-d0f5a71096e0 | 9-5 | -95 | Far | -58 | Nov 12, 2014 13:09:50.880 |
It’s still quite a rarity to detect anything other than my own development beacons when out and about so this was good to see. I’d have doubtless encountered more if I’d wandered into the exhibition hall and I’d have loved to explore further, but a quick look at the Moto 360 confirmed it would soon be time for the afternoon sessions to begin…
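For the curious, the “Proximity” field in those hits is typically derived from the RSSI and the beacon’s calibrated power (its expected RSSI at one metre). A common back-of-the-envelope approach is the log-distance path-loss model, sketched below; real-world accuracy is poor, which is why scanners report coarse bands like “Near” and “Far” rather than metres.

```python
def estimate_distance_m(rssi, measured_power, n=2.0):
    """Rough distance estimate from a beacon's RSSI.

    measured_power is the calibrated RSSI at 1 metre (the 'Power'
    field in the scan results); n is the path-loss exponent,
    roughly 2 in free space and higher indoors.
    """
    return 10 ** ((measured_power - rssi) / (10 * n))

# First hit: RSSI -99 with Power -59 works out to ~100 m in free
# space -- comfortably in the 'Far' band
print(round(estimate_distance_m(-99, -59), 1))
```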
Afternoon Sessions – Technology Tracks:
There was a large and diverse range of sessions available during the afternoon. The most popular seemed to be the “Modern Development with Visual Studio” track, which was significantly oversubscribed. When I walked past the room in which this track was hosted, people were queuing outside the room and were being admitted on a “one in one out” basis. The last time I’d seen such a long queue of my fellow geeks we were all waiting to meet Peter Mayhew.
The “Developing Solutions for the Internet of Things” track that I had chosen was less busy, but still well attended. There were three main sessions within the track:
|Check out Microsoft’s IoT resources online – note the “Internet of Your Things” tagline, which cropped up a few times during Jeff Wettlaufer’s session.|
Jeff Wettlaufer provided an overview of Microsoft’s Azure IoT solution. This was a fairly high level session, which yielded the following points of interest:
- There are 19 Azure datacentres, offering 600,000 cores.
- Once your data is in Azure, you can use Power BI or HDInsight (Microsoft Cloud Hadoop) to analyse it.
- Microsoft knows that not everyone is a .NET developer and is happy to provide support, documentation, and sample code for a variety of languages, including Python, PHP and Java.
- Works with any client device and any client OS.
- Any secure comms protocol is accepted.
- Power Map for Excel looks impressive; it’s a great way to visualise location-specific data, such as sales for specific retail locations. Definitely worth checking out.
|Power Map in Action – this slide may actually be from the following presentation, but I’m pretty sure Jeff Wettlaufer introduced the tool.|
Wettlaufer ran a real-time demo, which showed data generated from two Raspberry Pis on the lectern in front of him being pushed to Azure queues and then consumed by worker processes. Configuration seemed fairly straightforward and the demo worked as expected.
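The shape of that demo can be sketched with nothing but the standard library. To be clear, this uses Python’s in-process queue purely to illustrate the producer/worker pattern; the real demo used managed Azure queues via the Azure SDK, and the device names and readings below are invented.

```python
import queue
import threading

# Stand-in for an Azure queue: devices push readings, a worker consumes
readings = queue.Queue()
processed = []

def device(device_id):
    """Simulated Raspberry Pi pushing telemetry onto the queue."""
    for i in range(3):
        readings.put({"device": device_id, "temp_c": 20.0 + i})

def worker():
    """Worker process draining the queue until it sees a sentinel."""
    while True:
        msg = readings.get()
        if msg is None:  # sentinel: shut down cleanly
            break
        processed.append(msg)

w = threading.Thread(target=worker)
w.start()
device("pi-1")
device("pi-2")
readings.put(None)  # signal the worker to stop
w.join()
print(f"processed {len(processed)} readings")
```

The appeal of the pattern is the decoupling: devices never need to know how many workers exist, and workers can be scaled independently of the fleet producing the data.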
It was good to hear directly from someone in a senior position on Microsoft’s IoT team, and the fact that Wettlaufer had flown in from Redmond lent emphasis to the importance Microsoft are placing on Azure for IoT.
Here we dug deeper into the components that enable Azure for IoT to work at scale. There were some impressive numbers thrown around and Vasters generously shared his experiences of architecting large scale, highly available systems.
|Azure IoT principles|
Some observations and facts provided by Clemens (not verbatim, so any errors are mine):
- “IoT is when those with the least experience in building very large, secure, high-scale, high-availability multi-node cloud solutions are confronted with having to build them”.
- A Unit of Management in Azure is 420 instances max.
- IoT devices are peripherals to the system, in the same way printers are peripherals on office LANs.
- Do not scale further than you can test (common sense, but perhaps surprising how often this principle is violated).
- Security needs to be as close to the device as possible and communication needs to be one way, where possible. Devices should not listen for inbound traffic – they can’t protect themselves!
- Expecting devices such as domestic hubs to effectively handle security for a plethora of connected devices is likely to end in tears. UPnP is insecure and does not help; turn it off if you can.
- Service assisted communication can help to secure your solution (see Vasters’ slide on the subject below).
|Service Assisted Communication|
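The essence of service-assisted communication, as I understood it, is that the device never listens for inbound traffic: it makes outbound calls to a trusted service both to push telemetry and to pull any pending commands. The sketch below is my own illustration of that idea, with a hypothetical service interface; a real deployment would sit behind TLS-authenticated endpoints such as an Azure gateway.

```python
class CloudService:
    """Trusted intermediary holding per-device command queues."""

    def __init__(self):
        self.commands = {}   # device_id -> list of pending commands
        self.telemetry = []  # (device_id, reading) pairs received

    def enqueue_command(self, device_id, cmd):
        """Operators queue commands here; devices collect them later."""
        self.commands.setdefault(device_id, []).append(cmd)

    def device_checkin(self, device_id, reading):
        """One outbound call from the device does both jobs:
        deliver telemetry, collect pending commands."""
        self.telemetry.append((device_id, reading))
        return self.commands.pop(device_id, [])

service = CloudService()
service.enqueue_command("thermostat-1", {"set_point": 19.5})

# The device initiates every exchange -- it has no open inbound
# ports to attack, exactly as Vasters advocated
pending = service.device_checkin("thermostat-1", {"temp_c": 21.2})
print(pending)
```

Command latency is bounded by the check-in interval, which is the trade-off accepted in exchange for a device with no listening sockets.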
Working definitions of scale:
- Hyperscale – up to a million concurrent clients.
- Enterprise – typically 10,000 – 20,000 concurrent clients.
- Web – typically a few thousand or fewer concurrent clients (there are exceptions – e.g. Twitter & Facebook).
IoT will be at the high end of scale, particularly for telemetry and Command & Control systems.
Microsoft have copious documentation on Azure, which I won’t attempt to summarise, but the concept of Event Hubs is worth mentioning here. Event Hubs are the ingress point for IoT data into Azure, which is a key architectural difference between “regular” Azure and Azure for IoT. Once your data has been ingested by an event hub and passed into Azure, you’re then free to store or transform the data as required. Also of note is Stream Analytics, which when used in conjunction with Event Hubs allows incoming data to be analysed in real-time.
|Azure Event Hubs|
|Event Hubs with Stream Analytics|
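A toy model of the ingestion flow may help make this concrete. Event Hubs spread incoming events across a fixed set of partitions, keyed so that one device’s events stay ordered within a single partition; a query then aggregates the stream. Everything below is illustrative: the partition count, event data, and in-memory aggregation are invented stand-ins for the real service and Stream Analytics.

```python
from collections import defaultdict

PARTITIONS = 4  # real Event Hubs partition counts are configurable

def partition_for(device_id):
    """Key by device id so each device's events stay ordered
    within one partition."""
    return hash(device_id) % PARTITIONS

# Ingestion: spread the incoming event stream across partitions
partitions = defaultdict(list)
events = [("pi-1", 20.5), ("pi-2", 22.1), ("pi-1", 21.0), ("pi-3", 19.8)]
for device_id, temp in events:
    partitions[partition_for(device_id)].append((device_id, temp))

# Toy stand-in for a Stream Analytics query: mean temperature
# per device across the batch
totals = defaultdict(lambda: [0.0, 0])
for part in partitions.values():
    for device_id, temp in part:
        totals[device_id][0] += temp
        totals[device_id][1] += 1

means = {d: s / n for d, (s, n) in totals.items()}
print(means)
```

Partitioning is what lets the ingress point scale: consumers can be assigned per partition and work in parallel without ever reordering a single device’s stream.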
I got a lot out of this session; Vasters is an extremely effective and honest communicator who is more than happy to share his knowledge and opinions. I’d definitely recommend catching one of his sessions if you’re attending a conference at which he’s speaking.
Dr Birchenall presented a view of Glasgow’s Future City initiative, the result of a successful bid for funding from the UK Government’s Technology Strategy Board (now renamed Innovate UK). In a nutshell, the project leverages technology for the benefit of the city and its communities and provides a learning resource for cities looking to embark on similar schemes.
The city has created a state-of-the-art operations centre, where data from its CCTV network and traffic sensors is presented to the operations team. The operations centre opened in late 2013 and was immediately called into action supporting the emergency response to the tragedy at the city’s Clutha Vaults, where a police helicopter crashed into the building with devastating consequences.
Over 200 data streams have been identified and more than 100 have been publicly exposed via Glasgow’s Open Data Portal, which includes an interactive open map that stakeholders can use to view locations of various services and events, from allotments to traffic accidents. Developers are also encouraged to use the data to provide alternative views and build additional services.
I think this is an important and potentially influential initiative; if any of the above is at all interesting to you and you’d like to know more, I’d strongly recommend a visit to the Future City Glasgow website.
We started the day with a high level view of the possibilities presented by Open Data and this session described how these possibilities were being realised, making the two sessions natural bookends for the day.
|“Where we’re going we don’t need roads” – this DeLorean was parked near the entrance to the DLR station.|
I think I ended the day with a decent grasp of the high level architecture of Azure, both generally and specifically as deployed in an IoT scenario. I picked up a few tips that would probably serve me well and I felt inspired to follow up on some of the things I’d seen and heard.
My thanks and congratulations go to the organising team for putting together an interesting and entertaining event that delivered genuine value, at least to this delegate. Hopefully Microsoft will make Future Decoded a recurring fixture; I’ll definitely be attending if they do.