Wednesday 23 December 2015

Do not give your manager a DOORS DXL Reference Manual

After reading my last post, my manager came back to me with a DXL utility he wrote in one and a half days.

"Here is my DXL utility which can be used to perform requirements analysis on a module:" (source)


Requirements Analyzer tool UI

His keyword selection was taken from various white papers on the subject - such as the one from NASA, Automated Analysis of Requirement Specifications.

This tool creates a view in the module for each theme. It builds a compound "or" filter for all the keywords against either Object Text or all attributes.

Additionally, only users with "manage database" can edit the themes.

You can see this UI in action in this video:


Well that's pretty cool!

Sure, there is room for improvement, and it's not bulletproof, but that's not the point. Imagine what a Requirements Engineer could do in DOORS 9 if given a bit more time...

Maybe something similar to the use case I mentioned in my last post: use Watson Natural Language Classifier Service from Bluemix.

The story!

Once you have your requirements filtered, you have a list of rules your device must implement.
If you continue watching the video above, you will see how I created a rule from a very quick UI I wrote in DXL (source).

UI for Rule Creation in IoT Real-Time Insights

The code is short and simple but shows you how to get schema information so you can:
  • view condition items
  • assign the rule to a schema
If the rule is successfully created, its ID is stored in a ruleId string attribute.

Of course, instead of an extra attribute holding the value, it could store an external link to:
https://iotrti-prod.mam.ibmserviceengage.com/api/v2/rule/{ruleId}

Such a link would support the 'follow' action, but the user name and password (the apiKey and apiToken from VCAP services) are not easy to remember ;)
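
For illustration, here is a minimal DXL sketch of that idea, assuming the ruleId attribute from above plus a hypothetical ruleLink string attribute to hold the URL:

// Sketch: compose the external RTI rule URL from the stored ruleId.
// "ruleLink" is a hypothetical string attribute - create it first, or just print the URL.
Object o = current
string rid = o."ruleId"
if (rid != "") {
  string ruleUrl = "https://iotrti-prod.mam.ibmserviceengage.com/api/v2/rule/" rid
  o."ruleLink" = ruleUrl
  print ruleUrl "\n"
}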

As an exercise you can extend this example to enable action selection. That way you will create a complete rule with an action which can trigger alerts.

Traceability is Power

If you are working with DOORS 9, I'm pretty sure you are familiar with the concept of suspect links and changed objects.

OOTB, DOORS 9 lets you check information about in-links or out-links to open modules or to all modules.

Well, that's great, but what about OSLC links, or even no links at all (if you chose to stay with the extra attribute from the previous section)?

With DXL you can easily do it: just retrieve your rule from IoT RTI and compare it to what you have. See the last part of the video to see it working.
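
A minimal sketch of that check, reusing the HttpRequest pattern from the earlier posts. The attribute names (ruleId, Authorization) follow the examples above, and the actual comparison is left as a comment because the exact JSON returned by the RTI rule resource is not reproduced here:

// Sketch: fetch the rule from IoT RTI and compare it with what is stored in DOORS.
Object o = current
Module m = current Module
string rid  = o."ruleId"
string auth = m."Authorization"      // Base64 of apiKey:apiToken, as in earlier posts
if (rid != "" && auth != "") {
  HttpHeader h = create
  add(h, "Authorization", "Basic " auth)
  string url = "https://iotrti-prod.mam.ibmserviceengage.com/api/v2/rule/" rid
  HttpResponse resp = HttpRequest(HttpGet, url, null, h)
  delete h
  if (!null resp && resp.isOk) {
    HttpBody b = resp.body
    string ruleJson = b.value
    print "Rule " rid " as stored in RTI:\n" ruleJson "\n"
    // compare the returned JSON with the values kept in DOORS (e.g. a threshold
    // attribute) and mark the object or link as suspect if they differ
  }
  if (!null resp) delete resp
}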

Suspect Linked Rule

As promised, the sources for the examples above.

Conclusion

If you do not like challenges, do not give your DXL Reference Manual to your manager - he might come back with some great ideas!

Friday 18 December 2015

2015 Blog Summary


It would be too easy to dismiss previous posts on "DOORS 9 and IoT" as disposable fun with no real value.

Yes, they are bite-size examples - deliberately so when you think about it - but at a high level they set out to prove a number of very serious points.

Those points are:


  1. There are no barriers preventing you from using DOORS 9 with BlueMix and IoT services today.
  2. There is no requirement for complicated, over-engineered solutions or add-ons. Forget that. You already have the most important piece - DOORS 9 itself.
  3. You can jump right in and immediately start to explore and take advantage of these new and exciting technologies.

So we thought it would be helpful if we took a step back and tried things from a different angle.

This time the aim is to walk through an end-to-end scenario. In doing so we hope to demonstrate how best practice RM in DOORS 9 can be used alongside IoT to sustain ongoing efficiency, improvement and success.


Scenario for a Story

Consider a company that makes a device.

Now it isn't particularly important what this device does: it could be large or small, shiny or matte, there may be one of them or there may be thousands, it could sit in your home or it could move around on water, roads or the air.

However, one thing is for sure: it is complicated, takes time to build, and involves both hardware and software.

Now the company feels very strongly about this device. It is the market leader. It makes them money. Of course they are going to be emotional about it.

The company isn't stupid either. They realise that over time this device will need to change in order to continue market dominance.

Change may come for any number of reasons, such as simple aesthetics, availability of new materials or increased regulation.

Change may also occur because life has a frustrating habit of throwing up problems.

In other words the company's device is built to exacting standards but it can still fail.

To help deal with this failure the device includes sensing, monitoring and communication capabilities. In real-time there is a continual flow of information back to the company and naturally the company wants to act fast in diagnosing and fixing any problem.

This historical data can also be used when considering a new variant of the device. The company wants to identify small changes that can be made incrementally and hopefully at low cost.

Phew... that took a while to set the scene. Might have been better to use a GIF to present a "Star Wars" crawl, or a Mad Men sales pitch.

So let's get back to DOORS and IoT.


How does DOORS 9 fit into this story?


We may have already mentioned this but the company isn't stupid and this means their requirements are written and managed in DOORS 9, so let's start there...

There has been a lot of work over the years into what makes a good requirement.

For example if you google "nasa requirement quality analysis" you will find excellent whitepapers on natural language processing. See Automated Analysis of Requirement Specifications by Wilson, Rosenberg and Hyatt.

This essentially boils down to scaffolding the requirements process with tools that maximise the conciseness, consistency and overall quality of the statements used to describe what a thing should do.

There are more involved utilities available, but using DXL it is a relatively straightforward exercise to parse your module looking for keywords and/or symbols, as in the sketch below. A filter can be applied to focus attention on the statements that are ambiguous or likely to cause confusion.
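
A minimal DXL sketch of that idea, assuming the current module and an illustrative hard-coded keyword list (replace it with your own weak phrases and symbols):

// Sketch: build a compound "or" filter over Object Text for a few weak phrases
// and apply it to the current module. The keyword list is illustrative only.
Module m = current
string keywords[] = {"as appropriate", "if possible", "adequate", "TBD"}

Filter f = contains(attribute "Object Text", keywords[0], false)
int i
for (i = 1; i < sizeof keywords; i++) {
  f = f || contains(attribute "Object Text", keywords[i], false)
}

set f           // apply the filter to the current module
filtering on
refresh m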

You could even consider using a service such as Watson Natural Language Classifier.

On November 1st I showed how to use the DXL HttpRequest perm to connect to Cloudant, so you already know how to use it from DXL, whether in layout, attribute or utility code.
So what's stopping our fictional company from feeding Watson NLC with their DOORS 9 data using DXL, or NodeJS and OSLC with the aid of the very first example I presented? With HttpRequest (a minimal sketch follows the list below) this company could use Watson NLC to classify requirements according to:

  • imperatives
  • continuances
  • directives
  • weak phrases
  • options
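
To make that concrete, here is a minimal DXL sketch of calling the NLC classify endpoint with HttpRequest. The endpoint URL follows the public Watson NLC documentation, while NLC_USER, NLC_PASSWORD and CLASSIFIER_ID are placeholders you take from your Bluemix service (VCAP); the returned JSON (top_class and the classes array) still has to be parsed or stored, and real requirement text should be escaped before being embedded in the JSON body.

// Sketch: classify one requirement statement with a (pre-trained) Watson NLC classifier.
string classify(string username, password, classifierId, text)
{
  HttpHeader h = create
  string auth = ""
  toBase64_(username ":" password, auth)
  auth = auth[0:length(auth) - 2]              // strip the trailing newline, as before
  add(h, "Authorization", "Basic " auth)
  add(h, "Content-Type", "application/json")

  Buffer payload = create
  payload = "{\"text\": \"" text "\"}"         // note: escape quotes in real text

  HttpBody b = create
  setValue(b, payload)

  string url = "https://gateway.watsonplatform.net/natural-language-classifier/api/v1/classifiers/" classifierId "/classify"
  HttpResponse resp = HttpRequest(HttpPost, url, b, h)
  delete b
  delete payload
  delete h

  string result = ""
  if (!null resp && resp.isOk) {
    HttpBody rb = resp.body
    result = rb.value
  }
  if (!null resp) delete resp
  return result
}

// e.g. run on the current object in a module
Object o = current
string classes = classify("NLC_USER", "NLC_PASSWORD", "CLASSIFIER_ID", o."Object Text" "")
print classes "\n"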


With this said, there is no reason why the same utilities cannot be used to identify requirements that are essentially "rules" in their own right. This act alone helps system engineers focus attention on the rules that will need to be added to Workbench or Real Time Insights.

In terms of our device this means that it would have a thing dedicated to capturing and reporting a specific behaviour or variable.

When RTI detects that a boundary condition is breached, an alert is raised.

The rule in RTI that controls this behaviour should be able to be extracted from a good requirement.

XXX must not exceed a maximum of YYY units.

So not only can we use DOORS 9 to increase confidence in our requirements - we can then use those requirements as a basis for the development of things and the subsequent reporting capabilities.

Did I forget to mention that DOORS 9 can easily create a rule (or an action, alert, whatever you need and the API allows) in Real Time Insights or IoT Workbench (currently an experimental service)?
Well, you know the drill: use your DXL HttpRequest to POST some data to a URL.
A URL you found in this blog, in a post that not only tells you how it fits into the IoT world but also tells you where the IoT Real Time Insights HTTP(S) API documentation is - good luck finding that with any search engine (well, apart from a link to this post).
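
Just to show the shape of such a call, a sketch only: the resource path and the JSON body below are illustrative placeholders, not the documented RTI payload, so take the real resource names and body schema from the Real Time Insights HTTP(S) API documentation mentioned above.

// Sketch only: POST something to the IoT Real Time Insights API with HttpRequest.
// The path "/api/v2/rule" and the JSON body are PLACEHOLDERS - check the RTI API
// documentation for the actual resource names and payload format.
string apiKey = "YOUR_RTI_API_KEY"       // from VCAP
string apiToken = "YOUR_RTI_API_TOKEN"

string auth = ""
toBase64_(apiKey ":" apiToken, auth)
auth = auth[0:length(auth) - 2]

HttpHeader h = create
add(h, "Authorization", "Basic " auth)
add(h, "Content-Type", "application/json")

Buffer payload = create
payload = "{\"name\": \"Max value exceeded\", \"condition\": \"value > 100\"}"   // placeholder body

HttpBody b = create
setValue(b, payload)

HttpResponse resp = HttpRequest(HttpPost, "https://iotrti-prod.mam.ibmserviceengage.com/api/v2/rule", b, h)
if (!null resp) {
  print "HTTP " resp.code "" "\n"
  delete resp
}
delete b
delete payload
delete h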


A thing in the wild


So the requirements are filtered and rules identified, implemented and introduced via things in the real world. Sooner or later something bad happens.

A message is sent to RTI from the thing.

We can easily report on this information from DOORS 9 - and contextualise it in terms of the requirements that perhaps could be a problem. Requirements are linked through to design, test cases and source code - so impact analysis of the change can be investigated and, more importantly, executed.

The Layout DXL columns posted earlier show this particular scenario playing out in DOORS 9.

Closing the lifecycle circle


But remember there was another use case.

The company feels it is time to consider a new variant of the product but they want to ensure maximum benefit from minimum change.

We've seen how errors can be handled, but what about other intangibles? This is where IoT can really begin to drive benefit from the unstructured data.

What if the device was a car that was intended to get a certain mpg at a certain mph, or accelerate to a given mph in a given number of seconds?

What if it never achieved that? The information to hand allows you to cross-check the weight of passengers, the softness of brakes and engine performance to determine areas of weakness.

Is this climate based - were environmental factors to blame? Is the car configured wrongly OOTB for the geographical area?


Conclusions

Hopefully my 2015 posts will have raised your awareness and interest in IoT.

It represents fantastic levels of potential, and my aim was to show that in a few relatively small steps you can begin to unlock that potential now.

At this moment you have everything you need in order to take advantage of DOORS 9 and the GB/TB of data it sits on top of.

Any further advice I could give is best summed up by IBM Fellow John Cohn when he stresses "The importance of play":




Tuesday 15 December 2015

DOORS 9 direct IoT support - built-in challenge #1

Monday morning is a good time for reflection and a quick chat with coworkers before throwing yourself into the work cycle.
That's how I had a quick chat session with my manager yesterday, which ended up with challenge #1 (DOORS and IoT); later on I had another quick chat with my friend from the DNG team, which ended up with challenge #2 (DOORS 9 itself).

Since challenge #1 is about IoT, let's leave the details of the other one for another post - but I like the idea, so I will definitely post something on that subject.


Challenge #1

It was simply: "DXL triggers and IoT."

Well, in my last post I used an external tool written in NodeJS to publish the DOORS 9 TRS feed to the IoTF broker.

You need a full DOORS Web Access stack with some extra settings (as TRS is not enabled by default) to be able to get the TRS feed, which you then consume with your external application before posting it to the IoT Foundation broker.

That's how complex the last solution was:
DWA stack used in the last post

There are no events in TRS so the last example was querying for changes every 10 seconds. That's far from real-time in IoT terms.


DOORS 9 Triggers

Have a look at the DXL Reference Manual if you are not familiar with the concept of triggers in DOORS 9.

Triggers are as close to real-time events as you can get in an RM tool. Yeah, but DOORS 9 doesn't have an MQTT client.

Does it have to have it?

Well NO!

Looking at the Internet of Things documentation, you can find a topic on HTTP(S) Connectivity for Devices. If you read my previous post you should be familiar with HttpRequest. So all your trigger DXL has to do is send an HTTP POST request to
<target server: org_id.internetofthings.ibmcloud.com>/api/v0002/device/types/{DeviceType}/devices/{DeviceID}/events/{eventID}

Looking at the URL format you can guess some event values will be set automatically:
{
  "device_type": {DeviceType},
  "device_id": {DeviceID},
  "evt_type": {eventID},
  "timestamp":
  {
    "$date": 1450194956425
  }
}
All you need to add is the evt itself.

So your entire trigger DXL will look like this:

Module m = current Module
Buffer msg = create
msg = ""
msg += "\"id\": \"" getResourceURL m"\", "
msg += "\"content\": \"" name m "\", "
msg += "\"group\": \"Open\""
msg += "}"

string ioturl = "https://org_id.internetofthings.ibmcloud.com/api/v0002/device/types/DeviceType/devices/DeviceID/events/trigger_update"

HttpHeader h = create
string auth = ""
toBase64_("use-token-auth:YOUR_TOKEN", auth)
auth = auth[0:length(auth) -2]
add(h, "Authorization", "Basic "auth)
add(h, "Content-Type", "application/json")

HttpBody b = create
setValue(b, msg)

HttpResponse resp = HttpRequest(HttpPost, ioturl, b, h)
delete b

if (!null resp && resp.isOk)
{
// no one really needs to see it
  print "got it " (!null resp ? resp.code "" : "null") "\n"
  delete resp
}

delete h
delete msg


That's all you need to do to publish events in real time to IoTF!

Conclusion

Again the HttpRequest perm proved to be very powerful and useful. Using it, the initial stack was reduced to a very simple form:
Simplified stack
Final stack in DOORS to IoTF communication

DOORS 9 has a lot of potential in it; you just need to know how to use it.

Friday 11 December 2015

DOORS reporting to IoT

So far we have had an example of how to use DOORS to read IoT historian data. Now it's time to publish something back.

There's no problem sending some useless data; the problem is making sense out of it. So let's think of some 'real world' usage.


"Would you like to know when your most favorite module was modified?" 


Why not!

This could look like this:
Someone is playing with my module!


If you read my previous posts, you know we have Tracked Resource Set (TRS) support in DOORS Web Access.
When enabled (it's not by default), TRS "tells" us about all Modification, Creation and Deletion changes in the DOORS database.

So we need to modify the TRS reader for its new purpose.

I decided I won't need a front end for my TRS translation service so I rewrote the app.js:

/* jshint node:true */
var trs = require('./trs')
var superagent = require('superagent');
var iotf = require("ibmiotf");

//this is used to hide self-signed certificate errors
process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0";

// IoTF setup
var deviceClientConfig = {
  org: 'quickstart',
  type: 'mytype',
  id: '001122334455',
  'auth-method':'token',
  'auth-token': 'secret'
};

var deviceClient = new iotf.IotfDevice(deviceClientConfig );
deviceClient.connect();
deviceClient.on("connect", function () {
 console.log("Connected to IoTF");
});

var user1 = null;
function updateTRS() {
 if (user1 == null) {
  user1 = superagent.agent();
  user1
    .post('https://DOORS_SERVER:8443/dwa/j_acegi_security_check')
    .type('form')
    .send({ j_username: 'DOORS_USER', j_password: 'PASSWORD'})
    .end(function(err, res) {
    })
 }
 else{
  trs.TRSUpdate(user1, deviceClient, function(err) {
    // just ping I'm alive
    console.log(". " +err); // one might want to check if err is defined
  });
 }
}

//query DWA for TRS changes
setInterval(updateTRS, 10*1000); //every 10 seconds
updateTRS()


Simple and quick. Now for the changes to trs.js, which as you can see now takes an additional deviceClient parameter.

The changes are really obvious:

1. First, the publishing method

function publishTRS(deviceClient, data) {
   //publishing event using the user-defined quality of service
   var myQosLevel=2

   deviceClient.publish("trs_update","json",'{"d" : { "trs" : '+JSON.stringify(data)+' }}', myQosLevel);
   console.log("send data ->" + util.inspect(data));
}

2. Modified function parameters.
Note that changes are no longer added to an array; instead they are sent directly to IoTF:

< function parseChangeLogs(clResults, client, last) {
---
> function parseChangeLogs(clResults, last) {
>     var changes = [];
97c88
< publishTRS(client, {start:dt, id:o.object, content:object, group:grp});
---
> changes.push({start:dt, id:o.object, content:object, group:grp});
108c99
< last(err);
---
> last(changes);
169c159
< exports.TRSUpdate = function(usr, client, last) {
---
> exports.TRSUpdate = function(usr, last) {
176c166
< parseChangeLogs(results, client, last);
---
> parseChangeLogs(results, last);

Now your TRS example is ready to talk to IoTF! Time to prepare an IoTF application to use this feed.

Internet of Things Foundation

Go to your Bluemix dashboard and add a new IoTF starter application. This will create a simple Node-RED application for you. Remember to add some credentials so only you will have access to your dashboard.

Once ready, navigate to your IoTF dashboard:
IoTF dashboard link

Here we will add a new device type to your IoT world.

New IoTF Device Type

Select Devices, then Device Types and Create Type (at the bottom of the page).

The wizard will guide you through providing:

  1. Name and description
  2. Define template
  3. Define template

Save the type and proceed to add a new device.


IoTF Device

Select Devices, then Create Device (again at the bottom of the page).
You will get a similar wizard. If you select the Type defined earlier, some fields will be populated.

  1. Device ID
  2. Security
  3. Summary
    Please note credentials are NOT recoverable!

Finally, the Node-RED end


Once your device is ready, you can work on your Node-RED consumer.
Here I will just indicate what it might look like. I'm not going to write your business logic ;) I'm just showing an example of what you can do.

So your full example might look like this:

Mine just shows that it works, so you can try it and do what you want.

I had to configure my IoT input:

Prepare my "business logic"

And observe the output if warning is set to true:


Conclusion

In these really simple steps we made the DOORS Web Access Tracked Resource Set a Thing!

We just 'thingified' DOORS! 

Friday 4 December 2015

IoT data in JavaScript

Introduction

In my previous post, Displaying real-time IoT data in IBM DOORS, I displayed IoT data in the DOORS Web Access hover-over. This involved an investigation into how I would access IBM IoT Foundation data from JavaScript. My initial thought was that this would be very easy - I would just use the Paho JS utility - though it didn't end up as straightforward as anticipated. Therefore the aim of this post is to make the process a little easier and highlight the gotchas I encountered.


Wednesday 2 December 2015

DOORS 9 IoT Report

Let's think about the following user story:
"As a Requirement Engineer I would like to know how many errors my devices report"

In a DOORS 9 module one can visualize this as:
Example report in DOORS9

So we would like to get the number of errors of a given type reported by each device.

Device Error Reports in Sample App

First I need a device which can send some error codes. The IoT Starter Application mentioned in one of my previous posts publishes three kinds of messages:

  • touchmove - user interaction with screen
  • accel - orientation of the device and its position
  • text - a text provided by user

I extended that application to be able to send error codes.
 

Pressing "Send Error" button and selecting error code sends a MQTT message to IoTF broker.

A typical error message looks like:
{
  "device_id": "doors9",
  "evt_type": "error",
  "timestamp":
  {
    "$date": 1449059192155
  },
  "evt":
  {
    "errorCode": "20"
  }
}

For a compiled version of this application follow this link.
Now I could start updating my module.

Design for Analytics


Requirements in my DOORS module have an "ErrorCode" integer attribute which links a requirement to an error code reported by my device.

Additionally I'm using DOORS module-level attributes to store values I do not want hardcoded in my DXL. Those are:
  • Authorization (string) - so I do not need to calculate the Base64 value on each call
  • DeviceType (string) - which of my devices this module describes
  • Organization (string) - my Bluemix organisation name
With all the information in place I can write some simple layout DXL (which can be converted to attribute DXL later on to improve performance).

Layout DXL 

I want a specific type of event from all devices of a given type from my organization. So I need to use a historic query for a device type:

buf = "https://internetofthings.ibmcloud.com/api/v0001/historian/" org"/" deviceType "?evt_type=" eventId

This will return a JSON string with a list of events. If you do not have a JSON parser ready, you can try to parse this data with regular expressions. Please remember this is a very simple example; in the real world one should not attempt to parse JSON with regular expressions.

My main worker code looks like:
if (!null obj) {
  int ival = obj."ErrorCode"
  if (ival == 0) halt
 
  string val = obj."ErrorCode"
  Module m = module obj
  string auth = m."Authorization"
  string dtype = m."DeviceType"
  string org = m."Organization"
 
  if (!null m && auth != "" && org != "" && dtype != "")
  {
    Buffer b = getData(auth, org, dtype, "error")

    if (!null b)  {
      string s = stringOf b
      Regexp re = regexp2 "\"device_id\":\"([^\"]*)\"[^}]+.[^}]+{\"errorCode\":\"([^\"]*)\""
      int i = 0
      string device = "", code =""
      //  temporary skip to hold names of devices which reported
      Skip erSkp = createString
      int allErrors = 0
      int numDevices = 0
      while (!null s && re s && i<100) {  // i is just a guard; we know there are no more than 100 results in one page
        device = s[match 1]
        code = s[match 2]
        int ireported = 0
        // if code matches attribute value
        if (code == val)
        {
          allErrors++ // increase number of errors
         
          if (!find(erSkp, device, ireported)) {
            put(erSkp, device, 1)
            numDevices++
          }
          else {
            ireported++
            put(erSkp, device, ireported, true)
          }
        }
        s = s[end 0 +1:]
        i++
      } // while
      // clean up
      delete b
     
      // report
      if (allErrors != 0) {
        for ireported in erSkp do {
          device = (string key erSkp)
          displayRich "Device with Id {\\b "device "} reported an issue " (ireported == 1 ? "once" : ireported" times")
        }
      }
     
      delete erSkp
    } // null b
  } // module setup
} // !null obj

You can find the full DXL here.

Conclusion


The above layout DXL works fine when there are not many devices. Once there are more of them, we no longer want to see how many times each device reported a given issue. Thus the DXL could be rewritten to show something like:
Example report


As you saw, in this example I'm using layout DXL, but for better understanding and feedback one should consider writing a DXL utility.
Maybe that utility could provide its own UI for easier navigation?

In the example above there's no need to send an HttpRequest for each object... It is enough to make one call per 100 events (the maximum page size) returned by the query and write slightly more complex Skip management. That, however, would require one top-level Skip, but I'm sure you all know how to do it; a sketch follows below.
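
For example, here is a sketch of that Skip management, assuming each parsed event gives you a device id and an error code (as the regular expression above does): the top-level Skip is keyed by error code and each value is an inner Skip keyed by device id.

// Sketch: aggregate one page of events into errorCode -> (device -> count).
// Call countEvent(byCode, code, device) from inside the existing while loop.
Skip byCode = createString                // key: error code, value: Skip of devices

void countEvent(Skip topSkip, string code, string device) {
  Skip devices = null
  if (!find(topSkip, code, devices)) {
    devices = createString                // key: device id, value: int count
    put(topSkip, code, devices)
  }
  int n = 0
  if (find(devices, device, n)) {
    put(devices, device, n + 1, true)     // overwrite the existing count
  } else {
    put(devices, device, 1)
  }
}

// Reporting: how many distinct devices reported each error code.
Skip devices
for devices in byCode do {
  string code = (string key byCode)
  int numDevices = 0
  int n
  for n in devices do { numDevices++ }
  print "Error " code " reported by " numDevices " device(s)\n"
  delete devices                          // clean up the inner Skips
}
delete byCode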

Monday 30 November 2015

Displaying real-time IoT data in IBM DOORS (Doors Web Access)

There are many ways in which we could imagine surfacing IoT data in the IBM DOORS interface.

One way that I would propose is to extend the rich hover so that you could see IoT data in real-time, displayed when you hover over an IBM DOORS link-end point. Imagine being an IBM RQM user looking at a defect and being able to hover over a link to a requirement and see the history of a thing displayed.

Extending the rich hover to display IoT data in IBM DOORS Web Access


Extending the rich hover to display IBM DOORS IoT data in IBM RQM


Showing display of real-time data in hover over



As part of the technical investigation I looked into the implementation of a JavaScript client which listens for data from a device in the IoT Foundation and then displays that real-time data in the UI. The functioning graphing component is shown in the examples above. I will blog about the JavaScript / IoT implementation in a future post.



Sunday 29 November 2015

Moving DOORS9 towards real-time IoT sensors

Last time I showed how easy it is to use historian data from your IoT-enabled device in DOORS using the IoT Foundation HTTP API. I thought 'it cannot be that hard to show real-time data', even though I know there are no MQTT callbacks in DXL and no JSON parser (sure, we could write one, but what's the point?).

Sounds challenging. So let's do it!

Let's play!

If you have a spare 15 minutes, please have a look at the wonderful presentation by Dr John Cohn (@johncohnvt) - "The importance of play".

Let's create a very simple static HTML page that you can open in any (recent!) browser and which will connect to the IoT Foundation.

Very (and I mean it) simple HTML page

The page will have just two p elements, one for static info and one updated dynamically from JavaScript. It will additionally include 3 scripts: jquery, paho and an MQTT connection helper class.
The entire page code is simply:
<!DOCTYPE html>
<html>
    <head>
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
        <script type="text/javascript" src="jquery.min.js"></script>
        <title>MQTT listener</title>
    </head>

    <body onContextMenu="return false">
        <p>Please do not close</p>
        <p id="pStatus"></p>
    </body>
        <script type="text/javascript" src="realtime.js"></script>
        <script type="text/javascript" src="mqttws31.js"></script>
        <script type="text/javascript">
          var realtime = new Realtime(org, apikey, apitoken, device_type, device_id);
        </script>
</html>

MQTT connection

As you know, IoT devices communicate over MQTT, a lightweight device-to-device protocol implemented in many languages. There are many JavaScript implementations available, which you can find at http://mqtt.org/tag/javascript. Personally I'm using Paho because it is very easy to use with IBM Bluemix services.

Using Paho is straightforward: you need to create a new Messaging.Client object providing host, port and clientId.
For the newly created client you should provide two message callbacks:
  • onMessageArrived - function(msg)
  • onConnectionLost - function(e)
client = new Messaging.Client(hostname, 8883,clientId);
client.onMessageArrived = function(msg) {
  var topic = msg.destinationName;
  var payload = JSON.parse(msg.payloadString);
  data = payload.d;
};

client.onConnectionLost = function(e){
  console.log("Connection Lost at " + Date.now() + " : " + e.errorCode + " : " + e.errorMessage);
  this.connect(connectOptions);
}

Then you construct a connection options object which is passed to the connect function of Messaging.Client. This starts the connection with the IoT MQTT broker.
You have to specify:
  • timeout - number
  • useSSL - boolean
  • userName - string - as usual this will be the API Key for your organization
  • password - string - and the API Token
  • onSuccess - function() - executed when the connection was successful
  • onFailure - function(e) - executed on failure
var connectOptions = new Object();
connectOptions.keepAliveInterval = 3600;
connectOptions.useSSL=true;
connectOptions.userName=api_key;
connectOptions.password=auth_token;

connectOptions.onSuccess = function() {
  console.log("MQTT connected to host: "+client.host+" port : "+client.port+" at " + Date.now());
  $('#pStatus').text("MQTT connected to host: "+client.host+" port : "+client.port);
  self.subscribeToDevice();
}

connectOptions.onFailure = function(e) {
  console.log("MQTT connection failed at " + Date.now() + "\nerror: " + e.errorCode + " : " + e.errorMessage);
  $('#pStatus').text("MQTT  connection failed at " + Date.now() + "\nerror: " + e.errorCode + " : " + e.errorMessage);
}

console.log("about to connect to " + client.host);
$("#pStatus").text("about to connect to "+client.host);

client.connect(connectOptions);

Once you establish the connection with the IoT MQTT broker, you should subscribe to a topic.

var self = this;
// Subscribe to the device when the device ID is selected.
this.subscribeToDevice = function(){
  var subscribeOptions = {
    qos : 0,
    onSuccess : function() {
      console.log("subscribed to " + subscribeTopic);
    },
    onFailure : function(){
      console.log("Failed to subscribe to " + subscribeTopic);
      console.log("As messages are not available, visualization is not possible");
    }
  };
  
  if(subscribeTopic != "") {
    console.log("Unsubscribing to " + subscribeTopic);
    client.unsubscribe(subscribeTopic);
  }
  subscribeTopic = "iot-2/type/" + deviceType + "/id/" + deviceId + "/evt/+/fmt/json";
  client.subscribe(subscribeTopic,subscribeOptions);
} 

DOORS9 HTML window

Now it's time to display this very simple page in DOORS 9:
bool onB4Navigate(DBE dbe,string URL,frame,body){return true}
void onComplete(DBE dbe, string URL){}
bool onError(DBE dbe, string URL, string frame, int error){return true};
void onProgress(DBE dbe, int percentage){}

Module refreshModuleReff = null
void moduleRefreshCB(DBE x) {
  if (!null refreshModuleReff) {
    refresh refreshModuleReff
  }
}

string surl="c:\\iot\\realtime.html"
DB iotUIdb = create("do not close")
DBE iotUI=htmlView(iotUIdb, 100, 50, surl, onB4Navigate, onComplete, onError, onProgress)
DBE t = timer(iotUIdb, 1, moduleRefreshCB, "ping")
startTimer(t) 
realize iotUIdb 

You probably noticed I'm using the timer functionality (page 48 in the DXL Reference Manual); that's because layout DXL runs when the module is refreshed, redrawn, etc. This simple code ensures the module is refreshed every second.
There is a lot of room for improvement in the above DXL; you could add a toggle to check whether the user wants refreshing, or add a noError/lastError block before calling refresh refreshModuleReff.

Now save the above code as %doors_home%/lib/dxl/startup/iot.dxl and restart DOORS. This will create a top-level window which can be accessed from any DXL context in your DOORS 9 session.

Layout DXL

Finally we can add some worker code. It uses the hidden perm, which has the following syntax:
string hidden(DBE, string)

string property = obj."accel_property"
if (property != "none") {
  noError()
  refreshModuleReff = current Module
  string strval = hidden(iotUI, "realtime." property "()")
  string err = lastError
  if (!null err) { halt }
  real val = realOf strval

  //enable the line below if your Module is similar to the one from the last post
  //obj."iot_data" = val
  display strval
} 

I put the whole code from the "MQTT connection" chapter into realtime.js and created additional helper methods to get the required values:
 
this.yaw = function() {
  if (data != null) return data.yaw;
  return -101;
}
this.pitch = function() {
  if (data != null) return data.pitch;
  return -101;
}
this.roll = function() {
  if (data != null) return data.roll;
  return -101;
}
var data = null;


Now the module is displaying real-time data from a device connected to the IoT Foundation. It has a 1 second delay; maybe you will want to synchronize the data object when it's retrieved from realtime.js.

Well, in the end I didn't need to write a JSON parser in DXL!

Conclusions

DOORS reading IoT real-time data is easy and possible! It was even simpler than I thought initially ;)

Remember, in this blog I'm just giving examples of what can be done; it's up to you to extend it with unsubscribing, changing the device type and device depending on the current module, and so on. There are many things you might want to add.

Thursday 19 November 2015

DOORS IoT data example

In my last post I demonstrated how easy it is to get data from the IoT Foundation. But many of you might think: why? Well, I just wanted to show how easy it is to report on IoT data using plain old DOORS 9, but there was no fun in the previous post, just boring numbers.

Example Scenario

Let's take a hypothetical example where we measure yaw/pitch/roll and have the following rules:
  1. Measurable parameters of an Android Device shall not exceed expected maximum values
  2. Each report of a value exceeding 0.77 of maximum should be indicated Orange
  3. Each report of a value exceeding 0.88 of maximum should be indicated Red

So we need something to read data from the IoT device, storage for a maximum value and some nice indicators.

Module Attributes

I just extended my module with extra attributes:
  • iot_data (real) - doesn't affect change bar, doesn't affect change dates and doesn't generate history
  • range_max (real) - a regular attribute of type range

Having added those, I can update my layout DXL column to render a color depending on the current value:

if (findPlainText(re, ":", l, o, false)) {
  string ss = re[l+1:length(re) -3]
  real r = realOf ss
  obj."iot_data" = r
  real m = obj."range_max"
  real rm = r/m
  DBE cnv = getCanvas
  realBackground(cnv, realColor_Green)
  if (rm >= 0.77 && rm < 0.88) {
    realBackground(cnv, realColor_Orange)
    realColor(cnv, realColor_White)
  }
  else if (rm >= 0.88) {
    realBackground(cnv, realColor_Red)
    realColor(cnv, realColor_White)
  }
  else {
    realColor(cnv, realColor_Black)
  }
  display ss
}

Code isn't perfect, could be faster, but that's not the point here ;)

Now let's have some fun with canvas DXL. Add a new DXL column and set its DXL to:

DBE canvas = getCanvas
if (canvas==null) halt

int rh = 50 du
int rw = 100 du

setHeight rh
setWidth rw

int normalize(real  x, max) {
  real  i = 180.0+ x/max * 180.0
  return intOf i
}

real x = (obj."iot_data")
real m = (obj."range_max")

if (m == 0.0) halt

int v = normalize(x, m)
int margin = 1

realColor(canvas, realColor_Green)

if (v >= 280 && v < 320) {
  realColor(canvas, realColor_Orange)
  margin = 2
}
else if (v >= 320) {
  realColor(canvas, realColor_Red)
  margin = 5
}

ellipse(canvas, 0, 0, rw, 2*rh)
realColor(canvas, realColor_White)
ellipse(canvas, margin, margin, rw-2*margin, (2*rh)-2*margin)

realColor(canvas, realColor_Black)
polarLine(canvas, rw/2, rh-1, rh, v)

The resulting view will show us (almost) real-time events and warnings from our device.

DOORS 9 View with almost real-time IoT data warnings

Remember, this could be a group of devices or a single device. It all depends on what you want to measure.

Conclusions

IoT Foundation and DOORS 9 are really cool and really powerful! Hope to show you some more fun stuff soon!

Tuesday 17 November 2015

DOORS in IoT World

Last time I showed a quick example of how to use the HTTP request perms (DOORS functions) to get data from a Cloudant database.
Today let's go a small step further and try to get more recent data.

Before you continue with the DOORS connection to the IoT Foundation API, please have a look at the IoT Foundation Starter application. There you will learn how to create a simple IoT device, deploy the IoTStarterApp to your mobile, and start talking IoT! That's something I'm not covering in this blog (yet).

IoT connectivity

IoT devices communicate over a machine-to-machine connectivity protocol - MQTT (Message Queue Telemetry Transport). This protocol is extremely lightweight, much lighter than HTTP. MQTT is becoming more and more popular, not only in the IoT world but also in mobile data exchange; even Facebook Messenger uses MQTT.

DOORS and IoT connectivity

The MQTT protocol requires a message broker and a message client. From the DOORS point of view, IBM IoT Foundation can be the message broker, but what about the MQTT client? Well, at the moment I do not see an easy/best solution; one could try writing an MQTT client in DXL, or using some external library's OLE interface. However, MQTT messages are great for real-time data, which can feed IoT Real-Time Insights or the IoT Foundation.

OOTB, DOORS can consume historic data. In order to get historic data from your device you can use the IBM IoT Foundation HTTP API, and you can do it using the HTTP perms available in DOORS.

You can get events for:
  • every device from your organization - so all device types, all devices
  • all devices of selected device types in your organization
  • a single device
Each of the possible historic data queries has a number of filters available; a sketch of the three URL forms follows below.
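
The organization-wide form is inferred from the same API pattern (double-check it against the IoT Foundation documentation); the other two appear in the layout DXL later in this post and in the previous one.

// Sketch: the three historian query scopes, from broadest to a single device.
string org = "your_org", deviceType = "your_type", deviceId = "your_device"
string base = "https://internetofthings.ibmcloud.com/api/v0001/historian/"

string allDevicesUrl   = base org                               // every device in the organization (assumed form)
string byTypeUrl       = base org "/" deviceType                // all devices of one device type
string singleDeviceUrl = base org "/" deviceType "/" deviceId   // a single device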

Authorization 

The IoT Foundation HTTP API uses Basic Authentication in combination with HTTPS. You need to create an API key for your IoT service.

IoT API Key generation

When authenticating, the API Key is used as the user name and its associated Auth Token as the password. These should be provided in the HTTP Authorization header of all requests.

The following DXL will set the authorization header:
  HttpHeader h = create
  string auth
  toBase64_(apiKey":"apiToken, auth)
  auth = auth[0:length(auth) -2]
  add(h, "Authorization", "Basic "auth)

toBase64_ is a perm which converts a string into its Base64 representation. It adds a newline character to the resulting string, which is why it is removed before use.

Historic query

Since I'm going to use my script in layout DXL, I'm not going to query for devices or their types. Let's assume I already know all this information (or that it can be stored in module-level attributes), so I will focus on a single query on a specific device. From the documentation I know I need to send a GET request (with an empty body) to a URL:

https://internetofthings.ibmcloud.com/api/v0001/historian/${org}/${type}/${id}?filter

where:
${org} is the organization
${type} is the device type
${id} is the device ID

The filter lets me narrow down the response:
top - selects the number of most recent events - selecting top=1 is almost equivalent to getting data in real time ;)
start/end - the interval of historical data to query
evt_type - narrows down the selected events
summarize and summarize_type - allow performing aggregate functions over the selected events.

My layout DXL will select an average over a selected attribute from the last 100 'accel' events:
Buffer buf = create
buf = "https://internetofthings.ibmcloud.com/api/v0001/historian/" org"/" deviceType "/" deviceId "?top=100&evt_type=accel&summarize={"attr"}"

'accel' is one of the events sent by the IoTStarterApp; if you use a different IoT device, please change that accordingly.

Proposed usage scenario

The scenario here is that you have requirement(s) for an Android device that is being measured in the real world. Those measurements may mean that error messages are passed back and ultimately you would want to report against these from your original requirements. The attribute "accel_property" in the following code is the key that identifies the thing and measurement that you are interested in (and is relevant to the requirement).

DXL IoT column in DOORS module


The whole DXL is:
Buffer getData(string apiKey, apiToken, org, deviceType, deviceId, eventId, attr)
{
  Buffer buf = create
  buf = "https://internetofthings.ibmcloud.com/api/v0001/historian/" org"/" deviceType "/" deviceId "?top=1&evt_type=" eventId "&summarize={"attr"}"
  HttpHeader h = create
  string auth
  toBase64_(apiKey":"apiToken, auth)
  auth = auth[0:length(auth) -2]
  add(h, "Authorization", "Basic "auth)
  HttpResponse resp = httpRequest(HttpGet, tempStringOf buf, null, h)
  delete h
  delete buf

  if (!null resp && resp.isOk)
  {
    HttpBody b = resp.body
    Buffer respBuf = create
    respBuf += b.value

    delete resp
    return respBuf
  }
  else {
    if (!null resp) {
      display "error getting response " resp.code""
      delete resp
    }
    else {
      display "connection error"
    }
  }

  return null
}

if (!null obj) {
  string val = obj."accel_property"
  if (val != "none") {
    Buffer b = getData("API KEY", "Auth Token", "org", "type", "id", "accel", val)

    if (!null b)  {
      string re = stringOf b
      int l,o
      if (findPlainText(re, ":", l, o, false)) {
        display val " last value " re[l+1:length(re) -3]
      }
      delete b
    }
  }
}

As you can see, I'm not using any fancy JSON parser; my query returns a single variable, so it can be easily extracted from the string.
I used an extra enum attribute I created for my module so each of my requirements can select a different variable from the query.

Conclusion

DOORS can really easily be connected to the IoT Foundation and access its data. With the help of the htmlView DBE one can add an extra DXL window displaying IoT data in real time.
There is a lot DOORS can do for you!

Bonus lines of DXL for Real-Time data

"Turn your mobile phone into an IoT device" is an interesting extension to the article on m2m I showed you in the very beginning of this post. In compare to preview post on IoT starter application it has "Step 6. Create a Bluemix app to visualize sensor data". Where you can see how to deploy IoT sample NodeJS application which will visualize real-time or historic data from your organization. Once you deploy this application to your Bluemix account run following DXL for a simple HTML viewer:

void iotDataView(string srcUrl)
{
    DB dlg = create("IoT real-time data", styleCentered)
    DBE html = htmlView(dlg, 1000, 800, srcUrl, onHTMLBeforeNavigate, onHTMLDocComplete, onHTMLError, onHTMLProgress)
    show dlg
}
iotDataView "http://(your_application_name).mybluemix.net"

This will create a window similar to the following:

Sunday 1 November 2015

DXL and Cloudant documents

It was a very busy month and I didn't have much time to write something interesting. Today I decided to do a quick tutorial on how to use the HTTP perms.

You can read more on the HTTP perms in the DOORS DXL Reference.

Reading Cloudant documents

We know DOORS integration is not only outgoing to the outside world; sometimes we need to get data into DOORS. Today I will show you a very short DXL script which lets you get data from an external database. Because of its ease of use and simple connectivity I decided to connect to Cloudant.

If you are not familiar with Cloudant, have a quick look at its Learning Center. In short, Cloudant is a NoSQL database-as-a-service (DBaaS) widely used with Bluemix mobile and cloud applications. But let's not jump ahead to a future post; let me show you how you can read Cloudant documents from DOORS DXL.

As you know Cloudant uses HTTP requests and my script will focus on DXL HTTP requests.

Buffer getData(string username, password, account, db, doc)
{
  Buffer buf = create
  buf = "https://" username ":" password "@" account ".cloudant.com/" db "/" doc
 
  HttpHeader h = create
  HttpResponse resp = HttpRequest(HttpGet, tempStringOf buf, null, h)
  delete h
  delete buf

  if (!null resp && resp.isOk)
  {
    HttpBody b = resp.body
    Buffer respBuf = create
    respBuf += b.value
   
    delete resp
    return respBuf 
  }
  else {
    if (!null resp) {
      print "error getting response " resp.code""
      delete resp
    }
    else {
      print "connection error"
    }
  }

  return null
}

Buffer b = getData("", "", "ae4582eb-c21f-4c1a-92b3-8da8589eea0c-bluemix", "doors9test", "aee5f8c1be606328ddedef1a546cb7ca")

if (!null b)  {
  print tempStringOf b
  delete b
}


I set read permissions on this database, so everyone should be able to get the following JSON:

{"_id":"aee5f8c1be606328ddedef1a546cb7ca","_rev":"1-f44612cc036915a9f12f1afb7ba2fc37","title":"DOORS 9 connection test","description":"You shall read this ;)"}

Writing documents back to Cloudant


If you would like to PUT something into a Cloudant database you can use the same HttpRequest, but this time you need to set the Content-Type HttpHeader. This is required by HttpRequest with the HttpPut verb, otherwise an error will be reported. Setting header values is done with the add perm:

HttpHeader h = create
add(h, "Content-Type", "application/json")  // minimum required header
add(h, "Accept", "application/json")

The setValue perm is used to set the HTTP body:
HttpBody body = create
Buffer buf = create
buf = "{\"_id\":\"aee5f8c1be606328ddedef1a546cb7ca\",\"_rev\":\"1-f44612cc036915a9f12f1afb7ba2fc37\"}"
setValue(body, buf)
HttpResponse resp = HttpRequest(HttpPut, url, body, h)
That's how you can simply put data into and get data from Cloudant!

Soon I will do some more fun stuff with DXL, Bluemix and IoT but for now...

Happy Halloween!

Thursday 1 October 2015

DWA headless authentication from Node.JS service

In my previous posts I was using dwaOauth.js to perform the OAuth dance with DOORS Web Access. This used the DWA-provided login page, which required the user to provide their DOORS credentials.

Then I showed you a TRS feed reader which was using the same dwaOauth module. But it was very inconvenient to start the reader and then manually authenticate with DWA. I would expect my server to connect to the remote server itself and then just view the changes in the feed.

It's time to separate the view from the model and make them two separate applications.

Form authentication

With the aid of the superagent library one can perform form authentication with DWA and then use the agent to handle further TRS calls:
user1 = superagent.agent();
user1
  .post('https://your.dwa.server:port/dwa/j_acegi_security_check')
  .type('form')
  .send({ j_username: 'doors_user', j_password: 'password'})
  .end(function(err, res) {
    if (err != null) {
      console.log(err)
      user1 = null
    }
    else if (res.status != 200) {
      console.log(res.status)
    }
    else {
      console.log('user logged in to DWA')
      trs.TRSUpdate(user1, storeTRS);
    }
  })

Now user1 is authenticated and can be used to get data from DWA:

user1
  .get('https://your.dwa.server:port/dwa/rm/trs/changelog')
  .set('Accept', 'text/turtle')
  .buffer(true)
  .end(function(err, res) {
      // use res.text for plain text response from DWA
  });

If you do not specify .buffer(true) res.text will be an empty string!

Monday 14 September 2015

TRS in DOORS Next Generation

DOORS Next Generation has two TRS specifications implemented: TRS and TRS 2.
With very small changes to the code from my previous posts you can read the TRS 1 feed from DNG.

Well, if you remember my first post, you will notice that my dwaOauth.js works just fine with DNG. So that little module not only gives you access to DWA-protected resources; you can also reuse it with DNG or other CLM servers.
Remember there were small changes made in order to make it work with the most recent Node.js. You can read about those changes in my OSLC TRS listener post.

DNG setup

You will need to prepare your DNG to provide TRS. Have a look at Register applications as TRS providers for the Lifecycle Query Engine in IBM Knowledge Center.

Usually DNG is ready to provide the TRS feed; all you need to do is assign a license to the user who will be used to read it.

You will also need to check the permissions on /lqe/web/admin/permissions.

Now you are ready to read TRS from DOORS Next Generation.

Changes in trs.js

There are only small changes in the function getChangelogFromDWA:
> var previous = '/rm/trs';
var previous = '/dwa/rm/trs/changelogs'

and
> if (previous == '/rm/trs') {
if (previous == '/dwa/rm/trs/changelog') {

So we just change the starting point for the triple feed.

You will probably notice the difference in the TRS changelog id in the DNG feed. It uses a uuid and looks something like:
urn:uuid:cdaf9dae-ad2d-4434-87a7-aea2488182d6

Thus we need to modify the parseChangeLogs parser:

async.whilst(
  function () { return rest != trs.nil && order > iLastOrder},
  // get ordered list or changes
  function (next) {
      var o = store.find(rest, trs.first, null)[0];
      if (typeof o !='undefined') {
        
        var timearr = o.object.split(':');
        if (timearr[0] == 'urn') {
            // update order
            order = parseInt(n3.Util.getLiteralValue( store.find(o.object, trs.order, null)[0].object) );
            idx++;
            //
            //  DO SOMETHING USEFUL HERE
            //
        }
        
          rest = store.find(rest, trs.rest, null)[0].object;
      } else {
          rest = trs.nil;
      }
      next();
  },
  function (err) {
    console.log("read " + idx + " changelog items");
      last(changes);
  });

The DWA change id has a useful format, so one doesn't need further requests to get some basic details of the change that occurred, but it is easy enough to write some callback functions that fetch change details from DNG. Of course, one should not rely on the format of the id, as it is not standardized, defined, nor promised not to change.

Conclusion

Well, TRS is TRS ;) With small changes to my initial TRS posts you can consume a DWA, DNG or even Bugzilla TRS feed. In the near future I will explain how to modify this parser for TRS 2.0.
Stay tuned!

Tuesday 8 September 2015

TRS sample usage

If you read my last two posts you should be able to parse DWA TRS data.

Here I want to show two possible use cases for TRS data.

The examples below use visjs.org to display graphs.

Change "Counter"

Simply show the number of each type of change per minute (or day, or week).

 

Data input

Each point has the following format:
{
  "x": dt, //date and time of the event
  "group":grp, // one of [module,project,object][Creation,Deletion,Modification]
  "y": i // number of group operations per time interval (minute)
}

Browser code

Very simple Graph2d initialization and usage.
    <script>
        var groups = new vis.DataSet([
            {id:'objectModification', content:'Objects Modified', value:0},
            /* ....  */
            ]);
        var items = new vis.DataSet([]);
        var container = document.getElementById('visualization');
        var timeline = new vis.Graph2d(container);
     
        timeline.setItems(items);
        timeline.setGroups(groups);
        timeline.setOptions({
            drawPoints: {style:'circle'},
            interpolation: false,
            defaultGroup: 'ungrouped',
            legend: true
          });
 
        var socket = io();
        socket.on('trsevent', function(msg){
            items.add(msg);
        });
    </script>

Server code

The code below goes through the changes from the newest to the last parsed one and calculates data for each minute present in the TRS feed.

The parser assumes the following format of the change id:
urn:rational::1-dbid-[M|P|O-\d+]-00000525:hostname:port:YYYY-MM-DD:HH:mm:ss.nnn:xx
and it does not parse other ids (like users:rational::...).

The updated trs.js parser:
function parseChangeLogs(clResults, last) {
  var store = clResults[1];
  var lastID = clResults[0] || '';
  
  // check for store
  if (typeof store !== 'undefined') {
    var changesHead = store.find(store.find(null,
                        trs.changes,
                        null)[0].object,
                   trs.first,
                   null)[0];
    if (typeof changesHead !== 'undefined') {
      if (changesHead.object != clResults[0]) {
        
        // remember head
        cache.putCache('trs_lastChangeLog', changesHead.object);
        
        // try to get last trs.order based on lastId
        var lastOrder = (lastID != '') ? store.find(lastID, trs.order, null)[0].object : '"0"^^http://www.w3.org/2001/XMLSchema#integer';
        var iLastOrder = parseInt(n3.Util.getLiteralValue(lastOrder));

        // process changes from the newest to last parsed (or last in list)
        var rest = changesHead.subject;
        var changes = [];

        var order = Number.MAX_VALUE;
        async.whilst(
            function () { return rest != trs.nil && order > iLastOrder},
            // get ordered list or changes
            function (next) {
              var o = store.find(rest, trs.first, null)[0];
              if (typeof o !='undefined') {
                
                // o [_b:xx, trs.first, urn:rational::1-dbid-M-00000525:hostname:8443:2015-09-03:14:45:58.971:39]
                var timearr = o.object.split(':');
                if (timearr[0] == 'urn') {
                  var d = timearr[6].split('-');
                  var dt = new Date(d[0], d[1]-1, d[2], timearr[7], timearr[8], timearr[9]);
                  dt.setSeconds(0);
                  
                  // update order
                  order = parseInt(n3.Util.getLiteralValue( store.find(o.object, trs.order, null)[0].object) );
                  
                  // timearr[3] |= 1-dbid-M-00000525
                  //        |= 1-dbid-O-9-00000525    
                  var tmp = timearr[3].split('-');
                  var target = (tmp[2] == "M") ? 'module' : ((tmp[2] == "P") ? 'project' : 'object');
                  
                  if (target != '') {
                    // type -> group
                    var type = store.find(o.object, trs.type, null)[0].object;
                    var grp = target + type.split('#')[1];
                    
                    // push
                    var bDone = false;
                    for (var idx in changes) {
                      if (changes[idx].x.getTime() == dt.getTime() &&
                        changes[idx].x.getDate() == dt.getDate() && 
                        changes[idx].group == grp)
                      {
                        changes[idx].y++;
                        console.log('=>'+grp);
                        bDone = true;
                      }
                    }
                    if (!bDone) {
                      changes.push({x:dt, group:grp, y:1});
                    }
                  }
                }
                
                rest = store.find(rest, trs.rest, null)[0].object;
              } else {
                rest = trs.nil;
              }
              next();
            },
            function (err) {
              last(changes);
            });
      }
    }
  }
}

As you can see, all data processing is done on the server side; the browser just displays the points. The same approach is used in the second example.

What, when, how

A timeline graph showing when an item (project, module or object) was either created, modified or deleted:


Above is an iframe so you can try the Timeline. The data behind this iframe is static data used to show possible usage.

Browser code

index.ejs is similar to the previous one, but this time it uses a Timeline graph:
<script>
        var groups = new vis.DataSet([
            {id:'Modification', value:0},
            {id:'Creation', value:1},
            {id:'Deletion', value:2}
            ]);
        var items = new vis.DataSet([]);
        var container = document.getElementById('visualization');
        var timeline = new vis.Timeline(container);

        timeline.setItems(items);
        timeline.setGroups(groups);
        timeline.setOptions({
            groupOrder: function (a, b) {
              return a.value - b.value;
            },
            editable: false,
            type: 'point'
          });

        var socket = io();
        socket.on('trsevent', function(msg){
            items.add(msg);
        });
</script>

Server code

Below is just the working part of the parser:
async.whilst(
  function () { return rest != trs.nil && order > iLastOrder},
  // get ordered list or changes
  function (next) {
      var o = store.find(rest, trs.first, null)[0];
      if (typeof o !='undefined') {
        
        // o [_b:xx, trs.first, urn:rational::1-dbid-M-00000525:hostname:8443:2015-09-03:14:45:58.971:39]
        var timearr = o.object.split(':');
        if (timearr[0] == 'urn') {
          var d = timearr[6].split('-');
          var dt = new Date(d[0], d[1]-1, d[2], timearr[7], timearr[8], timearr[9]);
          
          // update order
          order = parseInt(n3.Util.getLiteralValue( store.find(o.object, trs.order, null)[0].object) );
          
          // timearr[3] |= 1-dbid-M-00000525
          //         |= 1-dbid-O-9-00000525    
          var tmp = timearr[3].split('-');
          var object = '';
          
          if (tmp[2] == "M") {
            var obj = store.find(o.object, trs.changed, null)[0].object;
            if (obj.indexOf('view') == -1) {
              object += 'Module ' + tmp[tmp.length-1];
            }
          } else if (tmp[2] == "P") {
            // Object
            object = 'Project ' + tmp[tmp.length-1];
          } else {
            // Object
            object = 'Module ' + tmp[tmp.length-1] + " object " + tmp[3];
          }
          
          if (object != '') {
            // type -> group
            var type = store.find(o.object, trs.type, null)[0].object;
            var grp = type.split('#')[1];
            
            // push
                  changes.push({start:dt, id:o.object, content:object, group:grp});
          }
        }
        
          rest = store.find(rest, trs.rest, null)[0].object;
      } else {
          rest = trs.nil;
      }
      next();
  },
  function (err) {
      last(changes);
  }
);

The same assumptions apply for the change id.

Data passed to Timeline looks like:
{
  "start": time of event,
  "id": id of event (urn:rational...)
  "content": what to show
  "group": modification, deletion or creation
}

Conclusions

TRS gives us an ordered list of changes to the resources in the working set, ordered from the newest change to the oldest. One can simply show them as a list of changes, generate some kind of report, or draw a graph.

One might want to request some additional information using DXL services and show those changes in the browser (see below), store them, etc. The possibilities are endless.

Showing a change made to the object with the aid of a DXL service and Timeline item templates: