Software Design Blog

Journey of Rinat Abdullin

Inject Business Intelligence Into .NET Software

Intelligent software brings in money and saves it. So let’s learn how to inject some business intelligence into a simple .NET application.

We’ll see how easy it is to model a simple application using the eXpress Application Framework, and then focus on enriching its analytical capabilities with some serious business intelligence by integrating with the Lokad Forecasting API to get predictive analytics for the published reports.

Basically, Business Intelligence (or BI for short) is a term used for the discovery, extraction and analysis of business data in order to provide better decision support for organizations. Better decisions make a business better: this information helps to make more money, reduce expenses and outperform the competition in the long run.

There are multiple forms of Business Intelligence, ranging from data mining and reporting up to KPIs and predictive analytics. Each form has unique capabilities, and each comes with requirements that must be met before it is of any use.

We’ll focus on how simple it is to add predictive analytics to a basic .NET line-of-business application that was not even designed for data mining.

Let’s consider a basic situation: a company sells products to customers, while keeping track of the inventory in stock and all the orders. There are lots of companies like that. Such a scenario can be modeled easily and rapidly with the eXpress Application Framework (XAF for short).
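
For illustration, here is a minimal sketch of what a couple of these persistent classes might look like in XAF’s XPO object model. The class and property names here are hypothetical, not taken from the original project; a real domain model would carry more fields and associations.

using System;
using DevExpress.Xpo;
using DevExpress.Persistent.BaseImpl;

// XAF generates both the database schema and the UI
// from persistent classes like these two.
public class Customer : BaseObject
{
  public Customer(Session session) : base(session) { }

  string name;
  public string Name
  {
    get { return name; }
    set { SetPropertyValue("Name", ref name, value); }
  }

  // collection side of the Customer-Orders association
  [Association("Customer-Orders")]
  public XPCollection<Order> Orders
  {
    get { return GetCollection<Order>("Orders"); }
  }
}

public class Order : BaseObject
{
  public Order(Session session) : base(session) { }

  Customer customer;
  [Association("Customer-Orders")]
  public Customer Customer
  {
    get { return customer; }
    set { SetPropertyValue("Customer", ref customer, value); }
  }

  DateTime orderDate;
  public DateTime OrderDate
  {
    get { return orderDate; }
    set { SetPropertyValue("OrderDate", ref orderDate, value); }
  }

  decimal total;
  public decimal Total
  {
    get { return total; }
    set { SetPropertyValue("Total", ref total, value); }
  }
}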

Given that we have customers, products and orders, the generated database structure might look like this:

Database for a simple LOB application

And the generated User Interface would look like this:

All screenshots are real, but they show completely mock data created with GenerateData.com, by googling up and parsing some old catalogs with the FileHelpers library, and with Random.Next. API keys are mocked as well.

For some simple business cases, after some tuning (and with a richer domain model), this application could even work as a real LOB application. However, I recommend some caution if you plan to use the default XAF architecture for anything other than simple applications that do not need scalability or high levels of concurrency. Efficiently managing rapidly changing business rules and the challenges of enterprise integration might also become tricky down the road.

Let’s get back to the prototype .NET application for now. If we turn on the Pivot Chart Module within XAF, the ability to do simple data mining is added, along with charting capabilities. So with a little bit of drag-and-dropping, users can get themselves some nice historical reports and analytics.
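
Enabling the module takes one line; here is a sketch, assuming the DevExpress.ExpressApp.PivotChart assembly is referenced (the module can also be added in the visual Application Designer). The module class name is illustrative:

using DevExpress.ExpressApp;

// in the constructor of your XAF module (Module.cs)
public sealed class MainDemoModule : ModuleBase
{
  public MainDemoModule()
  {
    RequiredModuleTypes.Add(
      typeof(DevExpress.ExpressApp.PivotChart.PivotChartModuleBase));
  }
}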

Business Intelligence and Pivot in XAF

So far so good, but we want more. Let us spend a little bit of time solving two challenges that come with this default XAF analytics module:

  • Running reports stresses the underlying database.
  • We want predictive analytics: to see how sales would look all the way up to June.

As it turns out, the solution to both could be rather simple and straightforward. We just need to define our own XAF module with a little bit of custom functionality.

We could optimize performance by capturing the visible report data from the Pivot and publishing it to a separate table. This table could live in another database or even in the cloud (SQL Azure being the simplest option, and some sort of big table the cheapest), for all we care. Even if it stays in the same database, the performance benefit would still be there, since we no longer need to run that 7-JOIN query.

Because of that, whenever some manager accesses an already published report (i.e. to make a presentation, export it to PDF or simply play with the numbers), the database is not stressed. In a way, that’s a simple case of command-query separation at work.

Once we’ve got the report with some data (i.e. sales), adding good forecasts requires either a development team with PhDs and statistical skills, or the use of some forecasting web service. The second option is much cheaper and faster, so we’ll integrate with a forecasting service to handle the complex math for us. We’ll send a copy of the published numbers to the Lokad Forecasting API and merge the results back into the published report afterwards.

Development-wise, we just need to create a controller for the Analysis object (available from the existing BaseImpl classes) and a “Publish” action. The action will grab data from our analytics report and publish it to a specialized type of business intelligence object, which actually inherits from the Analysis class.
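
The controller skeleton might look like the sketch below. The class and action identifiers are illustrative, not from the original code:

using DevExpress.ExpressApp;
using DevExpress.ExpressApp.Actions;
using DevExpress.Persistent.Base;
using DevExpress.Persistent.BaseImpl;

// a hypothetical ViewController that attaches a "Publish" action
// to any view displaying an Analysis object
public class PublishAnalysisController : ViewController
{
  public PublishAnalysisController()
  {
    TargetObjectType = typeof(Analysis);
    var publish = new SimpleAction(this, "Publish", PredefinedCategory.Edit);
    publish.Execute += (sender, e) =>
      {
        // grab the pivot data and save it to a Forecast object
        // (see the cell-grabbing code below)
      };
  }
}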

Database structure for simple report publishing

Within the “Publish” action we need to grab the cell contents and save them to another report. The simplest way to grab the data is:

// the locals below are used by the parsing code that follows;
// 'session' comes from the surrounding XAF controller context
var list = new List<ForecastValue>();
DateTime date;
string name;
double value;

var pivotCells = pivotGridControl.Cells;

for (int col = 0; col < pivotCells.ColumnCount; col++)
{
  for (int row = 0; row < pivotCells.RowCount; row++)
  {
    var cell = pivotCells.GetCellInfo(col, row);

    // skip grand totals, we only want the raw cell values
    if (cell.Item.IsGrandTotalAppearance)
      continue;

    var colValue = cell.GetFieldValue(cell.ColumnField);
    var rowValue = cell.GetFieldValue(cell.RowField);
    var cellValue = cell.Value;

Once we’ve got a cell, we simply figure out which side of it holds the date and which holds the name, parse the results and save them to a forecast value list:

    if (TryConvert(colValue, out date))
    {
      name = rowValue.ToString();
    }
    else if (TryConvert(rowValue, out date))
    {
      name = colValue.ToString();
    }
    else
    {
      Messaging.DefaultMessaging.Show("Problem", 
        "Either column or row should be convertible to a date.");
      return;
    }
    if (!TryConvertValue(cellValue, out value))
    {
      Messaging.DefaultMessaging.Show("Problem", 
        "Can't convert cell to a number.");
      return;
    }
    var forecastValue = new ForecastValue(session)
      {
        Date = date,
        Name = name,
        Value = value
      };
    list.Add(forecastValue);
  }
}

forecast.Values.AddRange(list);
forecast.ObjectTypeName = typeof (ForecastValue).FullName;
forecast.Name = "Forecast " + DateTime.Now.ToString();
forecast.Prefix = Guid.NewGuid().ToString().Substring(0, 6);
session.Save(forecast);

where Values is a collection of ForecastValue objects within the Forecast object (which derives from Analysis in the base implementation library).
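
The TryConvert and TryConvertValue helpers are not shown in the original code; a minimal sketch of what they might look like (the exact implementations may well differ):

// hypothetical helpers: tolerant conversion of pivot cell values
static bool TryConvert(object source, out DateTime date)
{
  date = DateTime.MinValue;
  if (source is DateTime)
  {
    date = (DateTime)source;
    return true;
  }
  return (source != null) && DateTime.TryParse(source.ToString(), out date);
}

static bool TryConvertValue(object source, out double value)
{
  value = 0;
  return (source != null) && double.TryParse(source.ToString(), out value);
}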

The only important thing is that the Forecast object must be smart enough to fetch its own published values when asked for analytics data. This is done with a criteria expression bound to the Oid. Since at creation time its primary key might still be empty, we’ll simply update the criteria the next time the object is loaded from the DB; there is no rush here:

protected override void OnLoaded()
{
  // rebuild the criteria if it is missing or still references
  // the empty (all-zero) Oid from the object's creation time
  if (string.IsNullOrEmpty(Criteria) || Criteria.Contains("00000"))
  {
    Criteria = CriteriaOperator.Parse("Forecast.Oid=?", Oid).ToString();
  }
  base.OnLoaded();
}

Although we could access the published report immediately, we still need the forecasts. This is done by uploading data to the Forecasting API and later merging the returned values into the published report. The ForecastSerie table (and its entities) will be used to keep the basic data needed for this process; a sketch of it follows.
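
Here is a minimal sketch of what that persistent class might look like. The property types are assumptions inferred from how the class is used later in this post:

using System;
using DevExpress.Xpo;
using DevExpress.Persistent.BaseImpl;

// keeps the mapping between a Lokad serie/task pair
// and the original name from the published report
public class ForecastSerie : BaseObject
{
  public ForecastSerie(Session session) : base(session) { }

  Guid serie;
  public Guid Serie
  {
    get { return serie; }
    set { SetPropertyValue("Serie", ref serie, value); }
  }

  Guid task;
  public Guid Task
  {
    get { return task; }
    set { SetPropertyValue("Task", ref task, value); }
  }

  string name;
  public string Name
  {
    get { return name; }
    set { SetPropertyValue("Name", ref name, value); }
  }
}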

The process could start like this:

using (var api = new LokadApiForm(forecast))
{
  api.ShowDialog();
}

where the form looks as simple as:

Lokad Forecasting API integration

Users just need to enter their Forecasting API key, select the forecasting options and press Publish. Lokad API keys can be created in the “Users” section of the web management UI:

Creating Lokad API Keys

Given all this, with the latest version of the Lokad Forecasting SDK, we need to:

  • Connect to the API;
  • Compose our scattered forecast values into regular time series;
  • Save the series to the API, while stripping the actual names (this keeps the data perfectly valid for forecasting, but makes it useless to others if we decide to share the Lokad account with other people in the company);
  • Save the API key and sync data into the database, so that later, when accessing the published report, we would be able to merge the forecasts into it.

The simplified core code (bound to the “Publish” button in the LokadApiForm) might look like this:

var service = ServiceFactory.GetConnector(apiKeyEditor.Text);

// wipe all old series for this report
service.DeleteSeries(service.GetSeriesWithPrefixLegacy(forecast.Prefix));

// compose loose values into time series and upload them
var lookups = forecast.Values.ToLookup(v => v.Name, v => 
    new TimeValue{ Time = v.Date, Value = v.Value }
  );

var nameToKey = lookups.ToDictionary(n => n.Key, n => Guid.NewGuid());
var keyToName = nameToKey.ToDictionary(n => n.Value, n => n.Key);
var series = lookups.ToArray(s => new SerieInfo()
  {
    Name = nameToKey[s.Key].ToString()
  });

service.AddSeriesWithPrefix(series, forecast.Prefix);
var serieIdToKey = series.ToDictionary(si => si.SerieID, 
  si => new Guid(si.Name));
service.UpdateSerieSegments(series.Select(g => new SegmentForSerie(
  g, lookups[keyToName[new Guid(g.Name)]].ToArray())));

// define forecasting tasks
var period = MaybeParse
  .Enum<Period>(_periodText.SelectedText)
  .ExposeException("Invalid period");

var interval = Convert.ToInt32(_intervalEdit.Value);
var tasks = series.Convert(si => new TaskInfo(si)
  {
    FuturePeriods = interval,
    Period = period,
  });
service.AddTasks(tasks);

// save sync information into our Database
forecast.Key = apiKeyEditor.Text;
forecast.Series.AddRange(tasks.Convert(ti => 
  new ForecastSerie(forecast.Session)
  {
    Serie = ti.SerieID,
    Task = ti.TaskID,
    Name = keyToName[serieIdToKey[ti.SerieID]]
  }));

The second part of the job is to retrieve the forecasts and merge them back into the published analytics report. The code below goes into a Simple Action that is bound to the Forecast business object:

// Retrieve the forecasts
var forecast = (Forecast)View.CurrentObject;
var service = ServiceFactory.GetConnector(forecast.Key);
var session = forecast.Session;

var forecasts = service.GetForecasts(forecast.Series.ToArray(s => 
  new TaskInfo()
  {
    SerieID = s.Serie,
    TaskID = s.Task
  }));

// merge forecast values with the published report
var dates = forecasts.SelectMany(f => f.Values.Select(d => d.Time))
  .Distinct().ToSet();
var intersections = forecast.Values
  .Where(v => dates.Contains(v.Date)).ToArray();
session.Delete(intersections);

var taskToName = forecast.Series.ToDictionary(fs => fs.Task, fs => fs.Name);

var values = forecasts.SelectMany(f => f.Values.Select(v => 
  new ForecastValue(session)
{
  Category = "Forecast",
  Date = v.Time,
  Name = taskToName[f.TaskID],
  Value = v.Value
}));
forecast.Values.AddRange(values);
session.Save(forecast);
Messaging.DefaultMessaging.Show("Ready", "Report Ready");

When we access the published report after it has been merged with the forecast values, we see that our data has been enriched with forecast estimates all the way up to June:

Sales reports with forecasted values

And since the published report with forecasts is just another Analytics object, we can also access it from the reporting section:

Business intelligence in XAF

This does not look that bad for a hack-technologies-together-in-a-day effort, does it? Still, there is a lot of room for improvement.

First, from the business viewpoint, the scenario already screams for a little bit of enterprise integration in order to automate some repetitive tasks related to handling orders, keeping track of the inventory, automatically and proactively handling stock shortages, or merely sending out notifications. We could use something like NServiceBus to jump-start the development here. Forecasts and predictive analytics could make the process even more efficient for the business.

Then there’s always room for an enterprise dashboard with some KPI indicators and drill-down reports, allowing the owner to keep track of the heartbeat of the business. Actually, if we keep a history of the important domain events, this also provides rich business audit logs for the stakeholders (what happened when, where and why). This information could also be used for better business intelligence feedback in the form of event analysis and detection of various scenarios.

There is an example of how the composite dashboard of an enterprise application might look on an iPad.

At the very least, we could automate report publishing and forecast retrieval, since “click-click-click” is too boring, and there aren’t many users working concurrently during the night anyway. Forecasts could be visualized better in our Pivot and Chart displays. We could also send more data from the Pivot (i.e. the values of the outermost fields) to the Lokad API in the form of tags, which would improve forecasting precision.

Then, how about some business rules that are editable by the end users, powered by an internal DSL and linked to the analytics (a sketch of how such a DSL could be hosted in C# follows the examples):

when product sales_estimates
   are_greater_than_supplies_by 25%
   then alert

when product sales
   increase_by_more_than 25%
   compared_to previous.week
   then notify

when customer orders
   increase_by_more_than 50%
   compared_to previous.month
   then make_customer_preferred
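
One way to host rules like these in .NET is a small fluent interface. Here is a minimal sketch, under the assumption that rules compile down to simple predicates and actions; all names below are hypothetical:

using System;

// hypothetical fluent rule builder for the DSL sketched above
public class Rule
{
  readonly string subject;
  Func<decimal, decimal, bool> condition;
  Action action;

  Rule(string subject) { this.subject = subject; }

  public static Rule When(string subject) { return new Rule(subject); }

  public Rule IncreaseByMoreThan(decimal percent)
  {
    // fires when the value grew by more than the given percentage
    condition = (current, previous) =>
      previous != 0 && (current - previous) / previous * 100 > percent;
    return this;
  }

  public Rule Then(Action whatToDo) { action = whatToDo; return this; }

  public void Evaluate(decimal current, decimal previous)
  {
    if (condition != null && condition(current, previous) && action != null)
      action();
  }

  public override string ToString() { return "when " + subject; }
}

// usage, mirroring the second rule above:
// Rule.When("product sales")
//   .IncreaseByMoreThan(25)
//   .Then(() => Console.WriteLine("notify"))
//   .Evaluate(current: 150, previous: 100);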

Obviously, these business intelligence features require more than a single day to implement. Yet they are feasible with the technologies and services we have at hand in .NET, if the business can use such capabilities to become more profitable.

The business intelligence approach can be taken even further if you capture and persist domain events in your solution. You can literally go back in time in order to find dependencies and project them into the future.

If you are interested in more articles on this topic, you can stay tuned by subscribing to this blog via RSS.
Note that this article outlines the scope of the challenges that the xLim 4 body of knowledge aims to solve efficiently.

All comments, thoughts, questions and any other feedback are welcome and appreciated.

So, what do you think?