Excel: A Business Intelligence Toy?


Over the years I have lost count of how many times I have heard the mantra “Excel is just a toy; you can’t use it for serious Business Intelligence”. Often from “IT experts” who have never had to work with a BI solution from a business user’s perspective.


In part this statement is true, particularly when Excel is (mis)used as a data store, keeping copies of similar data in 85 different versions where no one knows what the origin is or whether someone has (inadvertently…) changed the logic (formulas) or the actual data. We are all familiar with the resulting spreadsheet hell, and related horror stories are plenty: https://www.cio.com/article/2438188/enterprise-software/eight-of-the-worst-spreadsheet-blunders.html


But is this really Excel’s fault? Not really. It is simply the wrong approach to store data in the spreadsheet (in rare cases this can make sense, for example with Power Pivot, as I will point out later), as opposed to using it as a (very flexible and effective!) presentation layer that can cater for most analytical application requirements while keeping “power users” in their familiar spreadsheet environment.

Separation of data storage & presentation

The crucial factor is to clearly define the boundaries of how Excel is used. The key to avoiding these issues in a business intelligence context is, in our opinion, the separation of the data storage and presentation layers: data should not be pasted directly from a data source (or, even worse, rekeyed) into a spreadsheet, but made interactively accessible from consistent, “single version of the truth” data mart(s) that also include as many key parts of the logic as possible (especially calculations).

Exce(l)ptional New Capabilities

Excel offers great integration with a lot of data sources out of the box. One of them (although unfortunately not available in all editions) is Power Pivot (the VertiPaq/xVelocity engine), which enables users to connect to a data source and load millions of records into a model linked to an Excel workbook. This allows users to build reports as needed based on a consistent, single Power Pivot data model that can be updated automatically when the source data changes. In addition to data consistency, another crucial aspect is security: as long as security rights are properly defined in the underlying source and the user has to authenticate, she will only ever see what she is allowed to see, even when multiple users share the same report definition.

Some examples are below (and yes, this is all “just” Excel…).


The Next Level: A Two-Way Client-Server Solution

Power Pivot is a great solution, but it doesn’t cater for every use case, particularly when there are requirements to:

  • use the data model with different clients (e.g. web applications, mobile apps etc.) and multiple users need to work on the same model for different applications.
  • write back and collect data into the central model.

In these cases a real client-server based solution is needed. This can be, for example, Analysis Services or other complementary solutions (Managility offers two, Clear Jelly and Agility Planning, add-on components to MS SQL Server for business-driven modelling and writeback) that offer a server-based, two-way architecture for reading and writing. In conjunction with Excel Online, this approach also enables rolling out professional BI applications to users that only need a web browser.

At Managility we have used this architectural approach very successfully for more than 10 years in large-scale enterprise BI projects with hundreds of users and gigabytes of data. The advantages are plentiful:

  • consistency and security
  • the flexibility that Excel provides in regards to report definition
  • extensive visualisation options and unparalleled ad hoc analysis

and finally of course:

  • Excel knowledge is widely available.


Admittedly, Excel is not the answer in every case (for mobile deployment or interactive dashboards, Power BI is a brilliant complementary option), but if properly used in conjunction with a true central data model (these days often cloud based) it can be a key front-end option for tremendously powerful business intelligence solutions that cater for most analytical application requirements while keeping “power users” in their familiar spreadsheet environment.


Please don’t hesitate to contact us to arrange a hands-on experience of the new options in one of our Fast Start Workshops or the new Advanced Business Intelligence with Excel workshop. All in all, the improvements in Excel/Power BI are definitely a huge leap forward and will pose a substantial threat to currently very popular, read-only analysis tools like Tableau and QlikView, whose price tags will be closely analysed by organisations that have already invested in Microsoft licenses.

Power BI & The Austrian Election


Over the weekend I had a bit of time on my hands and wanted to try out some new features in the latest Power BI release.

While looking for an interesting data set, I thought about the Austrian election; as most of you know, Austria is the country where I was born before I moved to Australia. I came across an interesting website called “Neuwal.com“ that aggregates poll results and other election information, and the Neuwal operators are kind enough to make their data available in JSON format. So I started my shiny new latest version of the Power BI engine, connected to that data set, and put an initial simple analysis together in about an hour:

I tried the fairly new ribbon chart visual to show the aggregated average poll results; it is an interesting alternative to stacked bar charts for showing trends. The table visual on the right contains links to the publications that published the poll results, so with a click on the survey timeline the relevant links are automatically filtered and the user can navigate to the specific publications. Finally, at the top left I am showing a comparison of the last election to the average of all polls for a selected time frame.

Putting this analysis together took about an hour, with a live link to the data source, so new Neuwal results are updated automatically. If you would like to “play with the application”, please use the embedded report below:

To discuss how Managility can help with your analytics projects please contact us here: Contact Managility

Managility Logistics Optimization For Power BI


Managility just completed a project with a large retail company, analysing and optimising their shipment processes based on a Power BI solution. The relevant data set included 300,000 annual shipment transactions with all relevant details like pickup/drop-off location, weight, volume, provider and shipment cost, as well as pricing data from a variety of logistics companies.

The initial step involved gaining insights into the data and providing the client with a clear picture. The outcome is shown in the dashboard below (for confidentiality reasons all data is anonymised). It enables insights into developments over time, grouping of data into weight and volume buckets, rankings of top sender and drop-off locations, as well as a variety of filter options (e.g. dangerous goods). Helpful insight was gained through a new custom visual called “Flow Map” that visualises route details through the thickness of the link between two locations.

The next step in the project focused on realising tangible savings. Shipment rates of all relevant logistics suppliers were added to the model. This fairly complex requirement, with a variety of varying rate parameters like availability on the route and differentiations by weight, time of day, urgency and volume, was handled effectively with Power Query. DAX calculation logic was then implemented to determine the optimal provider/rate option to minimise cost.
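The idea behind that optimisation logic can be sketched in a few lines. This is an illustration only, not the project’s actual Power Query/DAX implementation; the rate fields, routes and prices below are hypothetical:

```python
# Given a shipment and a table of provider rates, pick the cheapest
# applicable rate (illustrative field names and values).

def applicable(rate, shipment):
    """A rate applies if it serves the route and covers the weight band."""
    return (rate["route"] == (shipment["origin"], shipment["destination"])
            and rate["min_kg"] <= shipment["weight_kg"] < rate["max_kg"])

def best_rate(shipment, rates):
    """Return the cheapest applicable rate, or None if no provider serves it."""
    options = [r for r in rates if applicable(r, shipment)]
    return min(options, key=lambda r: r["cost"]) if options else None

rates = [
    {"provider": "A", "route": ("SYD", "MEL"), "min_kg": 0,  "max_kg": 25,  "cost": 18.0},
    {"provider": "B", "route": ("SYD", "MEL"), "min_kg": 0,  "max_kg": 25,  "cost": 15.5},
    {"provider": "A", "route": ("SYD", "MEL"), "min_kg": 25, "max_kg": 100, "cost": 42.0},
]

shipment = {"origin": "SYD", "destination": "MEL", "weight_kg": 12}
print(best_rate(shipment, rates)["provider"])  # → B (cheapest applicable)
```

In the real model this “cheapest applicable rate” decision runs per transaction over the full 300,000-row fact table, which is exactly the kind of row-context calculation DAX handles well.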


The results of this process are available in an interactive dashboard that includes all providers and their shipping products. As shown in the screenshot below, savings of around $1m were identified through using an optimal mix of logistics products, which can then be further filtered using the criteria on the right.

The project was initiated using Managility’s Fast Start program, which included clarification of project deliverables and priorities, the building of an initial prototype, and a project plan for the entire engagement, which in the end took only 10 days from start to finish. Using the legacy BI solutions that Managility had been using beforehand, a project like this would have taken weeks.

For information on the Managility Fast Start program and product/supplier mix optimization please contact us here.

PowerApps-based Practice Management App


What Is It?

PowerApps is a relatively new service by Microsoft that enables business users to build mobile or web browser based apps on top of their corporate data, in an Office-like environment, with little or no code.
At Managility we have been using PowerApps (apps) and the related Flow (workflows) services mostly to implement customised budgeting/forecasting process steps, for example budget data collection and ongoing tracking of actual purchases against assigned budgets.
Recently we wanted to test how PowerApps could be utilised for our internal processes and built a practice management/time sheet system for our consultants. Within just a few days an initial version of the application was ready that runs as a mobile app or from a web browser.


How does it work?

Initially, our consultants are automatically authenticated using their Office 365 subscription without needing to log in separately. From that point onwards they can select a project and enter tasks and hours against it using an easy, touch-enabled interface.

Users with admin rights can access summary reports directly in the app, create new projects, or invoke a billing process that generates invoices to be sent via email with a summary of the billables, using our Xero accounting solution.



Finally, there is also the option to use the data of the app with an interactive Power BI dashboard:


Contact us to discuss how PowerApps mobile applications could streamline your processes: Contact Managility

Power BI Top 10 Learnings, Tip #1. GIGO: Garbage In, Garbage Out


Power BI is an awesome tool, but no tool will be of any use if your underlying data features the three “I”s:

1. incomplete,

2. incorrect,

3. inconsistent

For a simple ad hoc analysis of one source you can get by fixing #1 in Power Query and hoping to identify the “incorrect” during the analysis. It’s no secret that the initial benefits of many Business Intelligence projects typically lie in the identification of data issues that become apparent through improved insights and effective data visualisation.

As soon as you want to use multiple sources, #3 becomes an issue: “fixes in the front end” (i.e. Power Query) will often no longer resolve the problems, and you will require a broader BI repository strategy, a key part of which is the architecture of an integrated data storage approach for analytical purposes, often referred to as a “Data Warehouse”.

For us at Managility, the reporting and front-end side is typically the easiest part of a project, taking 10-20% of the overall implementation time. The major portion of the remaining time in our projects is spent on the design and implementation of the back end: loading, cleansing and ensuring consistency of data between the different sources in a database optimised for analytics.

A simple Star Schema used for ClearJelly.net

This process is referred to as ETL (Extraction, Transformation and Loading) in our tech speak. As much as software vendors try to position their tools as “really easy” and “self-service”, an inherent part of the data integration process is complex issues where data typically “doesn’t match”; mappings and rules will need to be established that no out-of-the-box process will be able to cover completely, and that typically requires specialist knowledge.

At a minimum, we recommend that as soon as you embark on more complex analytics projects with different sources, you make yourself familiar (or find someone to help you…) with multi-dimensional modelling and how to structure data into star schemas (in a nutshell, the separation of facts, records of what is happening, from dimensions, the details and hierarchies by which you want to analyse your data). Having data structured this way is more or less mandatory in Power BI if you want to analyse across different sources. At a minimum, you will require a dimension table with distinct elements by which the different sources and their “fact tables” can be joined. The simplest and probably most widely used example is a common date table, which we will cover in more detail in the next tips & tricks post here.
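To make the star-schema idea concrete, here is a minimal sketch using Python’s built-in sqlite3: two fact tables from different sources joined through a shared date dimension. All table and column names are made up for illustration:

```python
import sqlite3

# Minimal star schema: a conformed date dimension plus two fact tables
# that know nothing about each other, only about the shared date keys.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key TEXT PRIMARY KEY, year INT, month INT);
CREATE TABLE fact_sales (date_key TEXT, amount REAL);
CREATE TABLE fact_budget (date_key TEXT, amount REAL);
INSERT INTO dim_date VALUES ('2017-01-01', 2017, 1), ('2017-02-01', 2017, 2);
INSERT INTO fact_sales VALUES ('2017-01-01', 120.0), ('2017-02-01', 90.0);
INSERT INTO fact_budget VALUES ('2017-01-01', 100.0), ('2017-02-01', 100.0);
""")

# Because both facts share the date dimension, actual vs budget can be
# compared per period without merging the sources themselves.
rows = con.execute("""
SELECT d.year, d.month,
       SUM(s.amount) AS actual,
       SUM(b.amount) AS budget
FROM dim_date d
JOIN fact_sales  s ON s.date_key = d.date_key
JOIN fact_budget b ON b.date_key = d.date_key
GROUP BY d.year, d.month
ORDER BY d.month
""").fetchall()
print(rows)  # [(2017, 1, 120.0, 100.0), (2017, 2, 90.0, 100.0)]
```

The same pattern applies in a Power BI model: the date table plays the role of `dim_date`, and relationships to each source’s fact table replace the SQL joins.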

Easy Sentiment Analysis for Twitter


In this post we cover how the new Microsoft Flow can be used to insert data from Twitter, with sentiment analysis, into a Power BI streaming dataset.

Create streaming dataset in PowerBI:

Go to PowerBI.com –> “Streaming datasets” –> Create streaming dataset.

Now we will create a dataset of type API.

Name the dataset, then add the following fields:

Time – DateTime

Tweet – Text

Sentiment – Number

Sentiment values will be numbers between 0 and 1 (with 0 being negative and 1 positive).

Create a flow to push data from Twitter to Power BI:

Go to flow.microsoft.com –> My Flows –> Create from blank.

Enter your flow name and title and, on the screen below, click on the Twitter category.

Then select New step –> Add an action –> search for “Sentiment” and select “Text Analytics – Detect Sentiment”.

For example, select the tweet text as the text to be analysed by Text Analytics – Detect Sentiment.

In the last step we push the data into the Power BI streaming dataset.

Click on New step –> Add an action –> select Power BI –> then click on “Power BI – Add rows to a dataset”.

Then select your workspace, dataset, and table

Then map the following fields:

Time: Created at

Tweet: Tweet text

Sentiment: Score

Then select Create flow.

Now the data is inserted directly into your streaming dataset.
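For the curious, here is a rough Python sketch of what Flow’s “Add rows to a dataset” step does behind the scenes: Power BI push/streaming datasets accept rows as JSON posted to the dataset’s push URL. The URL below is a placeholder; a real one is shown on the streaming dataset’s “API info” page:

```python
import json

# Placeholder push URL: copy the real one from your dataset's API info page.
PUSH_URL = "https://api.powerbi.com/beta/<workspace>/datasets/<id>/rows?key=<key>"

def build_rows(created_at, tweet_text, score):
    """Shape one tweet plus its sentiment score into the push-API payload:
    a JSON array of row objects matching the streaming dataset fields."""
    return json.dumps([{"time": created_at, "Tweet": tweet_text, "Sentiment": score}])

payload = build_rows("2017-10-15T09:30:00Z", "Loving the new release!", 0.94)
print(payload)

# Sending it is a plain HTTP POST (left commented so the sketch runs offline):
# import urllib.request
# req = urllib.request.Request(PUSH_URL, payload.encode(),
#                              {"Content-Type": "application/json"})
# urllib.request.urlopen(req)
```

Flow simply performs this POST for every tweet the trigger picks up, which is why the new rows appear in the dashboard tile within seconds.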

Let’s see how to use this in a dashboard:

Go to Power BI, open your dashboard, then click on Add tile.

Click on the tile –> select “Custom Streaming Data” –> select your streaming dataset –> select your visual type, e.g. Line chart –> then add Time as Axis, Tweet as Legend, and Sentiment as Values.

Or you can build a report directly from your streaming dataset:

Go to the streaming dataset –> click on Create report.

Power Search


Inspired by Delta Master, a BI tool that we have worked with for many years and that combines advanced analytics with a great UI, I was looking to recreate one of its cool features in Power BI. Delta Master is a brilliant tool but unfortunately has an “enterprise” price tag. As we will see, we can get pretty close to one of its analysis methods using Power BI.

The Delta Master feature is called “Power Search” and does a ranking across all dimensions and levels in a multi-dimensional cube (e.g. Analysis Services).

The user can specify a measure, e.g. “Revenue”, and the method returns a ranking across all dimensions, for example as shown here:

This is a great starting point to see what is “driving” your revenue. In this case the first two results are trivial, as we are only operating in this market and with one company, but from rank 2 it gets more interesting: we see that a customer group has the biggest share of “Revenue” with 80%, followed by a product group classification with 78%, etc.

A ranking like this is also particularly interesting on a measure like actual-budget variance, as it avoids having to navigate through multiple dimensions to find where the variance is coming from. Using the Power Search approach I get the key factors immediately.

Taking this approach further, Power Search also allows you to include attribute combinations in the ranking, which is when it gets even more interesting:

To realise something similar in Power BI, you just add the Table visual to your dashboard and drag in the attributes that you want to use in the ranking:

Power Search


And voilà, we see an overview of what is driving revenues across all relevant attributes. To calculate the share of the total, just use this simple DAX calculation:

Share =
CALCULATE (
    SUM ( Xero_Sales_Data[LineAmount] )
        / CALCULATE ( SUM ( Xero_Sales_Data[LineAmount] ), ALLSELECTED () )
)



One click integration: OLAP/RDBMS/Online Service

A requirement that is very often communicated to us in client projects these days is “data mash-up”: the ability to integrate internal and external data sources more effectively, enabling more holistic insights.

One of the key challenges in the setup of analytical application processes is the side by side use of relational databases often referred to as the “data warehouse” and multidimensional data stores.

The data warehouse architecture is typically “data driven“, or in other words it involves integrating existing data from different sources. This architecture is suited to flexible analysis and often requires specific technical expertise from the analyst.

The other approach is “concept driven” i.e. enabling the analyst to enter data, conduct what if analyses and to modify structures (business modelling). This mostly involves multi-dimensional (also referred to as OLAP) databases that are optimised for use by non IT specialist business users. Often an “in memory” approach is used in this architecture that enables extremely fast response times when calculating query results across 3-20 dimensions with multiple aggregation hierarchies.

Neither of the above concepts is “better” than the other; the choice typically depends on the requirements at hand, and in most cases Managility recommends a combined approach that involves both architectures.

In recent times a third aspect to consider is the integration of commercial web based data services for example provided in the Azure Data Market or by other third party providers. This enables organisations to put their data in “perspective” for example by comparing it to temperature, population or financial markets details.

Historically, keeping these different “data worlds” in sync required extensive development of ETL (Extraction, Transformation and Loading) processes.

To address these integration challenges, Managility has recently developed a new approach that synchronises OLAP and relational sources: the PowerSync framework, a solution that we provide to our clients free of charge. PowerSync automatically converts Jedox cubes into a relational dimensional model with fact and dimension tables.

The benefits are much improved audit options (every OLAP transaction is a relational record that can be easily traced) and a comprehensive view of the data. The combined sources can now be analysed with existing data warehouse reporting solutions or any standard BI tool that supports relational access.
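The cube-to-fact-table conversion can be pictured with a toy example. This illustrates the concept only and is not PowerSync’s actual implementation; the cube, dimensions and values are made up:

```python
# Flattening OLAP cube cells into relational fact rows: each cell, keyed
# by its dimension members, becomes one traceable record.
cube = {
    ("2017", "Germany", "Product A"): 1200.0,
    ("2017", "Germany", "Product B"): 800.0,
    ("2017", "France",  "Product A"): 500.0,
}
dims = ("year", "region", "product")

# One fact row per cube cell: dimension members become foreign-key
# columns, the cell value becomes the measure column.
fact_rows = [dict(zip(dims, key), value=v) for key, v in cube.items()]
for row in fact_rows:
    print(row)
```

With each cell materialised as a row like this, every OLAP write can be traced as an individual relational record, which is where the audit benefit comes from.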

In addition, Managility PowerSync supports new technologies like Power Pivot, a component that enables analysis of large data volumes in Excel and easy integration of external sources. As per the example below, the Jedox Sales demo can easily be extended with population information to enable a relative metric measuring sales per population.

So far we have used the approach with a few early-adoption clients, with very good feedback. Please don’t hesitate to contact the Managility team or me with any further questions or to set up a demo with your data.




In this post, I would like to present the “Option Pricing n Greeks” model in Jedox.

Continuing from my previous post, I would like to highlight Jedox’s potential in terms of number-crunching capacity: I have designed a macro for the option pricing model most commonly used by front desk teams in investment banks.

For background on the option derivatives used in the financial industry and the measures (the Greeks) incorporated in this model, see the references below:





This model in Jedox computes option prices and their Greeks based on user inputs, and combined with the other benefits of Jedox the possibilities are enormous.


[code language="php"]

function dOne($S, $X, $T, $r, $v, $d){
    // d1 = (ln(S/X) + (r - d + v^2/2) * T) / (v * sqrt(T))
    $n1 = application()->Log($S / $X);
    $n2 = ($r - $d + 0.5 * pow($v, 2)) * $T;
    $d1 = $v * application()->sqrt($T);
    return ($n1 + $n2) / $d1;
}

function NdOne($S, $X, $T, $r, $v, $d){
    // standard normal density evaluated at d1
    $nd = dOne($S, $X, $T, $r, $v, $d);
    $n1 = -1 * (pow($nd, 2) / 2);
    $d1 = application()->sqrt(2 * application()->Pi());
    return application()->Exp($n1) / $d1;
}

function dTwo($S, $X, $T, $r, $v, $d){
    // d2 = d1 - v * sqrt(T)
    return dOne($S, $X, $T, $r, $v, $d) - $v * application()->sqrt($T);
}

function NdTwo($S, $X, $T, $r, $v, $d){
    return application()->NormSDist(dTwo($S, $X, $T, $r, $v, $d));
}

function OptionPrice($OptionType, $S, $X, $T, $r, $v, $d){
    if($OptionType == 'C'){
        // C = S*e^(-dT)*N(d1) - X*e^(-rT)*N(d2)
        $value = application()->Exp(-$d * $T) * $S * application()->NormSDist(dOne($S, $X, $T, $r, $v, $d))
               - $X * application()->Exp(-$r * $T) * application()->NormSDist(dOne($S, $X, $T, $r, $v, $d) - $v * application()->sqrt($T));
    } elseif($OptionType == 'P'){
        // P = X*e^(-rT)*N(-d2) - S*e^(-dT)*N(-d1)
        $m1 = application()->Exp(-1 * $r * $T);
        $m2 = application()->NormSDist(-1 * dTwo($S, $X, $T, $r, $v, $d));
        $m3 = application()->Exp(-1 * $d * $T);
        $m4 = application()->NormSDist(-1 * dOne($S, $X, $T, $r, $v, $d));
        $s1 = $X * $m1 * $m2;
        $s2 = $S * $m3 * $m4;
        $value = $s1 - $s2;
    }
    return $value;
}

function OptionDelta($OptionType, $S, $X, $T, $r, $v, $d){
    if($OptionType == 'C'){
        $value = application()->NormSDist(dOne($S, $X, $T, $r, $v, $d));
    } elseif($OptionType == 'P'){
        $value = application()->NormSDist(dOne($S, $X, $T, $r, $v, $d)) - 1;
    }
    return $value;
}

function Gamma($S, $X, $T, $r, $v, $d){
    return NdOne($S, $X, $T, $r, $v, $d) / ($S * ($v * application()->sqrt($T)));
}

function Vega($S, $X, $T, $r, $v, $d){
    // per 1% move in volatility
    return 0.01 * $S * application()->sqrt($T) * NdOne($S, $X, $T, $r, $v, $d);
}

function OptionRho($OptionType, $S, $X, $T, $r, $v, $d){
    if($OptionType == 'C'){
        $value = 0.01 * $X * $T * application()->Exp(-1 * $r * $T) * application()->NormSDist(dTwo($S, $X, $T, $r, $v, $d));
    } elseif($OptionType == 'P'){
        $value = -0.01 * $X * $T * application()->Exp(-1 * $r * $T) * (1 - application()->NormSDist(dTwo($S, $X, $T, $r, $v, $d)));
    }
    return $value;
}

function OptionTheta($OptionType, $S, $X, $T, $r, $v, $d){
    if($OptionType == 'C'){
        $value = -(($S * $v * NdOne($S, $X, $T, $r, $v, $d)) / (2 * application()->sqrt($T))
               + $r * $X * application()->Exp(-1 * $r * $T) * NdTwo($S, $X, $T, $r, $v, $d)) / 365;
    } elseif($OptionType == 'P'){
        $value = -(($S * $v * NdOne($S, $X, $T, $r, $v, $d)) / (2 * application()->sqrt($T))
               - $r * $X * application()->Exp(-1 * $r * $T) * (1 - NdTwo($S, $X, $T, $r, $v, $d))) / 365;
    }
    return $value;
}

[/code]
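As a quick sanity check of the pricing logic, here is an independent Python version of the d1/d2 calculation and the two price formulas. Put-call parity, C - P = S·e^(-dT) - X·e^(-rT), follows algebraically from the formulas, so the computed gap should be essentially zero:

```python
from math import exp, log, sqrt, erf

# Standard normal CDF (the role NormSDist plays in the Jedox macro)
def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def d_one(S, X, T, r, v, d):
    # d1 = (ln(S/X) + (r - d + v^2/2)*T) / (v*sqrt(T))
    return (log(S / X) + (r - d + 0.5 * v ** 2) * T) / (v * sqrt(T))

def option_price(kind, S, X, T, r, v, d):
    d1 = d_one(S, X, T, r, v, d)
    d2 = d1 - v * sqrt(T)
    if kind == "C":
        return exp(-d * T) * S * norm_cdf(d1) - X * exp(-r * T) * norm_cdf(d2)
    return X * exp(-r * T) * norm_cdf(-d2) - S * exp(-d * T) * norm_cdf(-d1)

# Sample inputs: spot, strike, time in years, rate, volatility, dividend yield
S, X, T, r, v, d = 100.0, 95.0, 0.5, 0.05, 0.2, 0.0
call = option_price("C", S, X, T, r, v, d)
put = option_price("P", S, X, T, r, v, d)

# Put-call parity check: should be ~0 up to floating-point noise
parity_gap = (call - put) - (S * exp(-d * T) - X * exp(-r * T))
print(round(parity_gap, 10))
```

The same check can be run against the Jedox macro’s outputs: if OptionPrice('C', …) minus OptionPrice('P', …) drifts away from S·e^(-dT) - X·e^(-rT), one of the formulas has a sign error.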



“Screenshots”, associated with the Windows keyboard’s “Print Screen” key, have been and still are used heavily to report errors occurring on the client side of an application. When clients do not capture or send the screen state at the time of the error, it becomes difficult to reproduce the issue in the development environment in order to resolve it.

To simplify this situation, I thought it would be great if the screenshot could be captured at the time the error occurs (in the error handling “catch” block) and sent to the server to be saved as an image, without the client going through the pain of capturing, saving, attaching and sending it via portal tickets or email.

Below is a small POC demonstrating this with the help of a JavaScript library and custom PHP code.

The functionality illustrated above allows you to capture a screenshot of the current spreadsheet in Jedox (the widget area will remain blank, as the JavaScript API excludes iframe HTML components from the rendered screen capture image). This technique is not restricted to Jedox; it can be part of any large-scale custom HTML application.

As interesting as this technique is, I would also like to highlight its dangers: it can be misused to target confidential client-side information, so please beware!

The process illustrated above has two steps: first, a screenshot of the target area is captured via “html2canvas.js”, converted and appended to the body of the page as an HTML “canvas” element; this is then picked up by the next JavaScript code block and sent to the server as raw blob content, where it is saved as a “PNG” file.

Sample Code:


[code language="html"]

<p>Screen Shot Widget</p>
<input type="button" value="Capture Screenshot" onclick="takeScreenShot();">
<input type="button" value="Post Screenshot" onclick="postImage();">
<script src="/pr/custom/html2canvas.js"></script>
<script type="text/javascript">
function takeScreenShot(){
    html2canvas(window.parent.document.body, {
        onrendered: function(canvas) {
            // append the rendered screenshot only if no canvas exists yet
            var cand = document.getElementsByTagName('canvas');
            if(cand[0] === undefined || cand[0] === null){
                document.body.appendChild(canvas);
            }
        }
    });
}

function postImage(){
    var cand = document.getElementsByTagName('canvas');
    var canvasData = cand[0].toDataURL("image/png");
    var ajax = new XMLHttpRequest();
    // the endpoint name is illustrative; point it at the PHP script below
    ajax.open('POST', 'save_image.php', true);
    ajax.setRequestHeader('Content-Type', 'application/upload');
    ajax.send(canvasData);
}
</script>

[/code]






[code language="php"]

<?php
// Get the raw POST data (the data URL string sent from the browser)
$imageData = file_get_contents('php://input');

// Remove the header ("data:image/png;base64,") part.
// A real application should use it, e.g. to check the image type
$filteredData = substr($imageData, strpos($imageData, ",") + 1);

// Decode before saving, since the data we received is base64 encoded
$unencodedData = base64_decode($filteredData);
//echo "unencodedData".$unencodedData;

// Save the file. This example uses a hard-coded filename for testing,
// but a real application could specify the filename in a POST variable
$fp = fopen('test.png', 'wb');
fwrite($fp, $unencodedData);
fclose($fp);

[/code]