Is it Winter today? (summary presentation)

We started out with a very basic business idea: sell coats to cold people.

To maximise sales, we needed to identify where in Westeros the cold people reside, and
we thought that a shifting climate (winter is coming, after all) would give us an advantage.

To identify this shifting climate we deployed an IoT solution with temperature sensors
throughout Westeros that would help us react to the changes.

We also created a public webpage where the people of Westeros could get an answer to the question: is it winter yet? There we could also identify buyers for our coats.

Basic business idea:




Based on this idea, we created a goal system architecture diagram:

But after spending a day trying to get the CDS (Common Data Service) up and running, we discovered that the functionality is not production ready (or, in our case, development ready):

In CRM we loaded all the houses, the members of each house, and the region each house belongs to:








We created all the regions in CRM, where each region has a temperature, a threshold temperature and a winter status:












When the temperature drops below the threshold, the region's status changes to winter.
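The rule itself fits in a few lines. A minimal sketch of the logic (the field names are illustrative, not the actual CRM schema):

```javascript
// Sketch of the winter-status rule: a region flips to winter
// when its current temperature drops below its threshold.
// Field names are made up for illustration.
function updateWinterStatus(region) {
  return {
    ...region,
    isWinter: region.temperature < region.threshold,
  };
}

const north = updateWinterStatus({ name: 'The North', temperature: -5, threshold: 0 });
const dorne = updateWinterStatus({ name: 'Dorne', temperature: 28, threshold: 5 });
console.log(north.isWinter); // true
console.log(dorne.isWinter); // false
```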










When a region gets the status winter, an automatic job creates a marketing list of the house members in that region, so that we can start a targeted marketing campaign:
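The job behind this can be sketched roughly like so (the data shapes are hypothetical stand-ins for the CRM entities):

```javascript
// Sketch of the marketing-list job: when a region turns to winter,
// collect every member of every house in that region.
// The house/region shapes are illustrative stand-ins for the CRM entities.
function buildMarketingList(region, houses) {
  if (!region.isWinter) return [];
  return houses
    .filter((house) => house.region === region.name)
    .flatMap((house) => house.members);
}

const houses = [
  { name: 'Stark', region: 'The North', members: ['Jon', 'Arya'] },
  { name: 'Martell', region: 'Dorne', members: ['Oberyn'] },
];
console.log(buildMarketingList({ name: 'The North', isWinter: true }, houses));
// [ 'Jon', 'Arya' ]
```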












We are currently monitoring all temperatures in Power BI and have maps showing the current temperature for each region:













To make sure that we still have customers, we used machine learning to calculate churn based on data from Game of Thrones.







Unfortunately, we were not able to produce any projections in which our customers actually survived.

As a second way into our webshop, we have the three-eyed raven:









The three-eyed raven uses Cognitive Services on our website:














The services then do targeted marketing on you and direct you to the right categories of the webshop:
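As a rough illustration of the idea (the attribute names and categories are invented; the real mapping depends on what Cognitive Services returns about the visitor):

```javascript
// Hypothetical mapping from Face API-style attributes to webshop
// categories. The attributes and category names are illustrative only.
function pickCategory({ age, gender }) {
  if (age < 16) return 'kids-coats';
  return gender === 'female' ? 'womens-coats' : 'mens-coats';
}

console.log(pickCategory({ age: 10, gender: 'male' }));   // 'kids-coats'
console.log(pickCategory({ age: 30, gender: 'female' })); // 'womens-coats'
```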










On the way there we used a lot of epic technology:













And remember: Don’t be Joffrey

Integrations + Development = Fun

We wanted to give you an overview of some of the functionalities we have integrated over the last 2.5 days.

We have developed a site where we can register a new House (that needs our protection services), new Threats (mappings between IoT sensors and GoT attacks), and view the Attack History.

In the Attack History we can see data from Work Orders in CRM, and we show videos generated from real scenes in GoT episodes. And yes, we replaced the audio of the videos with parts relating to the text the House entered.

As an example:

To find the video scenes that mention the text of our choice, we used the Video Indexer API.

In addition, we use Artificial Intelligence to predict which threats the different houses are likely to face tomorrow. This way we can be prepared and have teams ready. We used Azure Machine Learning Studio for our predictive analysis.

We got complaints from our Kings: they have dirty hands, so we took on the challenge of integrating voice for self-service support. We used Amazon Alexa to check the status of a threat, and in case of a false alarm, Alexa can cancel the dispatched team.

To get an overview of the attacks, our team has a graph they can manipulate to explore the patterns. We used Cosmos DB:


Last but not least, we are using face recognition to simulate attacks. For this we used the Azure Face API.



Booking resources in our webshop

We have established process automation through Flow, in order to create a work order in CRM from our web shop. See the awesome flow we set up below:

To order, the user has to log in to the web shop, so the work order gets registered to their team. In Dynamics we get an overview of these orders under Field Service → Work Orders.

This work order will then be processed by one of us, using bookings in Dynamics.

We also have a workflow retrieving information from the bookings we enter, which we display in our internal solution – Back Channel.

High Performing Services using Azure Service Fabric

Integration and sharing of data have always been a challenge. There are many aspects that need to be covered to meet the demands of today's world of ATAWAD (Anytime, Anywhere, Any Device).

To address this, our integration backend utilizes Azure Service Fabric, a platform that provides solutions to many of these requirements.


With a node architecture and automatic failover (if, for example, a node goes down, a new instance gets provisioned on another node), we can ensure that our data is available at all times.

We also have the possibility to scale out if we notice that demand is high. This can also be scheduled, so that we increase our capacity during peak hours.

We can support several versions of the same service, and with defined upgrade domains, a rollout of new functionality does not cause any downtime.


To secure our data, the credentials needed for the integration are stored in an Azure Key Vault. This ensures separation of duties, and that access management is kept apart from development. No more accounts with the same password for years, and no hard-coded credentials in source code.

The services themselves utilize Identity Management in Azure AD. We can present different versions to different types of users, or we can block certain endpoints completely.


Performance is crucial, and we need to make sure that we can deliver processed data to thin clients. We don't want to deliver large datasets for processing to a tablet or phone, even though there is usually enough capacity to do so on a PC.

By using an in-memory cache, and a Timer Function that triggers the backend to fetch and process data, we can make sure we reach those goals.

Data is processed by one node, and when the processing is complete this data is then sent to the memory of the other nodes.

All this means that when a call is made to the backend, the result is presented with minimal latency, and for the user the experience is almost instant.
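In spirit, the caching scheme looks something like the sketch below. This is a simplified single-process version (the real solution runs on Service Fabric and distributes the processed result to the other nodes' memory):

```javascript
// Simplified sketch of the timer-driven cache: a timer periodically
// refreshes the processed dataset, and every request is served straight
// from memory with minimal latency.
function createCache(fetchAndProcess, intervalMs) {
  let cached = null;
  const refresh = async () => { cached = await fetchAndProcess(); };
  const timer = setInterval(refresh, intervalMs);
  return {
    prime: refresh,                  // initial fill before serving traffic
    get: () => cached,               // served from memory, no backend call
    stop: () => clearInterval(timer),
  };
}

// Usage with a stand-in data source refreshed every minute:
const cache = createCache(async () => ({ builtAt: Date.now() }), 60_000);
cache.prime().then(() => {
  console.log(cache.get() !== null); // true
  cache.stop();
});
```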


The API uses Swagger for discovery, so it is very easy to get a feel for what data you will get back and what the data model looks like, even for inexperienced consumers.

Well-formatted JSON makes it possible for Power Users to consume the data, and we make sure that the data is available for the right people, at the right time.

As Vesa stated during the keynote: it is a great time to be a developer.

Westeros services

Westeros is an online services hub where all teams can book favors from all around Westeros. This online services center is built using AngularJS and Bootstrap! And we can tell you, it is very RESPONSIVE! We hope we are doing Daenerys proud!

After you are authenticated with your AD account, we also use the Office Graph to pull some data … I hope we are securing our castle well enough!

Westeros online services is driven by speech: you navigate through speech and book through speech. It integrates with Dynamics 365, pulling data from Dynamics and pushing bookings to Dynamics.

Finally, the tracking data that has been collected for tracking targets is also sent through the online store!

Look how responsive it is!


The Dothraki translator is ready for production!

New site collections can now be provisioned with the Dothraki translator web part. The provisioned site will also be mobile friendly with Dothraki branding.

As this blog is being composed, the Dothraki translation service has already been used by another team: Acando. So feel free to use it 🙂

For future purposes this concept can also be used to translate other fictive languages such as Elvish, Valyrian, High Valyrian and Klingon.


Here is a screenshot from the mobile phone. When you press the button and start speaking English into the phone, the English output is shown as text in real time. So is the Dothraki translation, which is then also spoken out through the speaker.





Here is a picture explaining the architecture of how this was developed:

The technologies used to develop this solution are:

– Office Dev PnP
– React
– Azure
– Azure functions
– Bing Speech API





How is this possible, you may ask:

  1. We use the Bing Speech API from our SPFx web part to read the user's microphone, figure out what the user says and turn it into text.
  2. We send the text we get from the Bing Speech API to the Dothraki Azure Function, which does the actual translation. The Azure Function then sends the Dothraki text back to our SPFx web part.
  3. The SPFx web part uses the browser's native speech synthesizer to speak the Dothraki text back to the user.
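Stitched together, the three steps amount to a small pipeline. In this sketch the two service calls are stand-ins: `speechToText` represents the Bing Speech API call and `translateToDothraki` the Dothraki Azure Function:

```javascript
// Sketch of the web part's pipeline. The injected functions are
// stand-ins for the real services (Bing Speech API, Dothraki Azure
// Function, and the browser's speech synthesizer).
async function speakDothraki(audio, { speechToText, translateToDothraki, synthesize }) {
  const english = await speechToText(audio);           // step 1: mic -> English text
  const dothraki = await translateToDothraki(english); // step 2: Azure Function translates
  synthesize(dothraki);                                // step 3: speak it out loud
  return dothraki;
}

// Usage with fake services standing in for the real APIs:
speakDothraki('<mic audio>', {
  speechToText: async () => 'hello',
  translateToDothraki: async (text) => (text === 'hello' ? "M'athchomaroon" : text),
  synthesize: (text) => console.log('speaking:', text),
});
```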

Please contact Bouvet Penguins if you need access to the Dothraki translation API; we are happy to let you integrate the Azure Function in your app (you need an API key). Don't worry! Your data is never stored anywhere, your voice is safe 🙂 (GDPR)

Click here to see the source control on GitHub:

The Virtual Raven

When the King needs to send a message to His allies, He relies on a network of ravens. Recently, a new apparatus has been created that allows Him to draft a secret letter without the use of birds. This mystical device is called a virtual raven and will only work when used by the King.

When the King gazes into the device, it opens up a secret window that lets Him enter a secret message, assign a messenger and a recipient.

The message is safely hidden away in a magical cloud that is inaccessible to anyone but the intended recipient. It cannot be read while in transit, and so it cannot fall into the wrong hands.

The code itself is displayed on the apparatus, and I can assure you that it is unreadable to human eyes. Only by using a magical device entrusted to the rightful Kings can the message be deciphered.

What follows is a technical description of how the messaging system is created. It is of no particular interest to anyone but the practitioners of the arcane arts. It is included here only by the insistence of the device maker.

– King Sven

Technical description

The application is developed as an HTML5 / JavaScript app using a Koa server and the React framework. The source code is published to GitHub at the following address:

When deployed, it creates a Docker container using an Ubuntu image that I maintain and use for most of my web applications. The container is stored in Azure in the Container Registry.

In order to be able to easily deploy and scale the application, I set up a Kubernetes cluster using the Azure Cloud Shell and deployed it with two replicas. I also used a LoadBalancer service to expose the application with an external IP.

Setting up a Kubernetes cluster on Azure is amazingly straightforward. I used this guide to quickly get up to speed on the Azure specifics:

Using the Azure Container Registry required me to create a secret to hold my credentials:

kubectl create secret docker-registry <SECRET_NAME> \
  --docker-server=<REGISTRY_NAME> \
  --docker-username=<SERVICE_PRINCIPAL_ID> \
  --docker-password=<YOUR_PASSWORD> \
  --docker-email=<YOUR_EMAIL>

In my deployment.yaml I reference the secret like this:

  containers:
    - name: virtualraven
      ...
  imagePullSecrets:
    - name: <SECRET_NAME>

When building my docker image I tag and upload it like this:

PKG_VERSION=`node -p "require('./package.json').version"`
docker tag svena/virtualraven:$PKG_VERSION <REGISTRY_NAME>/svena/virtualraven:$PKG_VERSION
docker push <REGISTRY_NAME>/svena/virtualraven:$PKG_VERSION

On the Node backend I use Koa to connect to a set of Azure Functions that retrieve the set of valid Kings and messengers and encrypt any outgoing messages in combination with the face recognition API.

For instance, the message upload function looks like this (some parts of the receiver address are intentionally obfuscated):

A lot of work also went into the UX design to make it work efficiently on a cellphone. I wanted to use a serif font to fit with the theme and ended up using the free Cormorant Garamond font. I wanted to use a professional TypeKit font, but the licensing costs proved too extreme. The color scheme is carefully selected to project an image of power, while still being pleasant to look at.

Having a tight schedule? Fear not!

Hei bloggen!

In our last post, we presented our provisioning engine. Sadly, you don't always have your loyal servant Podrick Payne at hand, or a computer for that matter. Luckily, House Cloudborne has a solution for those who cannot wait to wage more war!

The PowerApp is pretty simple: it adds new items to a list. This action triggers a flow, which sends an HTTP request to the flow discussed in our previous blog post, which in turn provisions a group.


The old flow has been updated to check for an email address in the payload, and sends an email to said address with the relevant information.




House Cloudborne (first, second and third of the Andals).

Up and running!

The first message

Westeros has finally seen the sending of the first message between two kingdoms. The Kings are happy with the security validation and the magic behind the magic mirror “Surface” and the magic crown “HoloLens”.

King Sven Anders is ready to send a message using the magic mirror. It is locked; only a king can unlock it.

He looks directly at the magic mirror and it recognizes the king! It then asks for instructions, and he writes “Send a message”. The strange device answers with another question, “What message do you want to send?”, so the king writes it down.

Finally, the mirror asks for the destination and the messenger who will deliver it.

What is behind?

The first step is to authenticate the user. The device takes a picture every 5 seconds to check whether a face is present, and if so, it validates whether it is a king's face.

Once authenticated, a bot starts the conversation, asking simple questions. The user supplies the answers in handwritten text, which is sent to the Computer Vision service to convert it into data that the bot can handle. Once the text is written and the king and messenger are selected, the bot sends the info to an Azure Function that receives the parameters and starts the encryption process, which goes like this:

  1. A new record is created in an Azure table that holds the unique key of the sender (King A), the unique ID of the messenger, and the unique ID of the destination (King B).
  2. The original text is encrypted using RFC 2898, and a QR code image is generated and sent to printing. (The original and encrypted messages are never saved anywhere, only printed on paper; if the paper is lost, the message is lost.)

A lot of trial and error

In addition to handwriting, we also created a mobile app that runs the same logic but without a bot. It runs under Node.js in a Docker container. The QR code can also be sent this way, without paper.

The first receiving

King Carlos receives the messenger in his realm. He proceeds to wear his magic crown and looks at the messenger's face while saying “Validate messenger”. Magically, he receives information from a ghost called “The cloud” that can only be seen using this magic crown.

This info confirms that the message was sent by Sven Anders (King A) and also confirms the messenger's name.

What is behind?

By using the HoloLens, the user can issue voice commands to call certain functions on the device; some commands make holograms react, while others execute an HTTP POST to an Azure Function.

The first validation is of the messenger's face. Once the person is scanned, the request is sent to the defined endpoint, which returns information about who sent him and who he is.

The second validation is executed by sending the voice command “Show me the message”. It sends a second HTTP POST, this time asking for the decrypted text. The process goes like this:

  1. The message is sent to an Azure Function, which receives the byte array (the picture) and the receiver key (King B's unique ID).
  2. It proceeds to decrypt the message using the receiver ID (which is unique to each HoloLens, linked to the device name). This means that only that specific device (King) can get the decrypted text.
  3. Once the info is returned, it is shown in a ghostly hologram that we call “The cloud”.

As a cool addition, we added some voice commands that animate a couple of dragons if the messenger is not validated.

Dashboard view in SharePoint

To get a better overview of each team's status when it comes to points and money, we created a WebPart in SharePoint.


Here we have used the SharePoint Framework with React. We use the SharePoint theme color as the background for the heading area, so the WebPart doesn't stand out from the site. The WebPart calls a SQL database to collect the data for each team.


We have also created an overview of all the purchased services from the web shop.

Whenever a team makes a purchase in the web shop, the data is sent to a list in SharePoint, and the WebPart uses sp-pnp to retrieve it.
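The aggregation the WebPart performs once sp-pnp has returned the list items can be sketched like this (the item shape is an illustrative guess at the SharePoint list columns):

```javascript
// Sketch of the purchase-overview aggregation after sp-pnp returns the
// SharePoint list items. The item shape is an illustrative guess.
function purchasesPerTeam(items) {
  const totals = {};
  for (const { team, service, price } of items) {
    if (!totals[team]) totals[team] = { services: [], total: 0 };
    totals[team].services.push(service);
    totals[team].total += price;
  }
  return totals;
}

const overview = purchasesPerTeam([
  { team: 'Penguins', service: 'Raven post', price: 100 },
  { team: 'Penguins', service: 'Dragon ride', price: 500 },
  { team: 'Cloudborne', service: 'Wall repair', price: 250 },
]);
console.log(overview.Penguins.total); // 600
```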