Power BI Embedded: Step 3 - integrate your report using PHP and JavaScript

Using Power BI Embedded you can integrate your report into a website or web application. To do so, there are three steps you need to complete:

  • Configure your Azure environment
  • Add a Power BI workspace to this environment and upload a report
  • Implement the code to generate a token and embed your report

In this last post we will zoom in on the third part. In the previous post we created a workspace, uploaded our report, and saved the IDs of the workspace and the report. In this post we will use that information (in combination with the Access Key) to generate a token and embed the report.

Continue reading...

Power BI Embedded: Step 2 - Create a workspace with Node.js and add a report

Using Power BI Embedded you can integrate your report into a website or web application. To do so, there are three steps you need to complete:

  • Configure your Azure environment
  • Add a Power BI workspace to this environment and upload a report
  • Implement the code to generate a token and embed your report

In this post we will zoom in on the second part. In the previous post we created a Power BI Workspace Collection in Azure. We will use the Access Key of this collection to add a workspace and upload our Power BI report (.pbix file).

Continue reading...

Power BI Embedded: Step 1 - Configuring Azure

Using Power BI Embedded you can integrate your report into a website or web application. To do so, there are three steps you need to complete:

  • Configure your Azure environment
  • Add a Power BI workspace to this environment and upload a report
  • Implement the code to generate a token and embed your report

In this post we will zoom in on the first part: setting up your Azure environment.

Continue reading...

3 Important Lessons about Test Automation in Business Intelligence

As you might know, I'm keeping an eye on how I can automate my Data Warehouse testing. In "Automated Testing for the Data Warehouse" my observation was:

In essence, testing a Data Warehouse isn't that complex

It turns out I'm not the only one thinking about this (really? 😉). Some time ago I listened to episode 72 of the SQL Data Partners podcast, starring Lynn Winterboer. There is really deep insight into the requirements for test-driven Data Warehouse work there; I definitely recommend listening to the full podcast.

Continue reading...

Row level security on your SQL database

One of the cool new features of SQL Server 2016 and Azure SQL Database is that you can add row-level security to your database directly.

So you don't need a cube or a tabular model to implement row-level security. Let me show you how you can use it. The example is based on AdventureWorks, the sample database from Microsoft. In this example I am going to show how you can restrict access to certain colors of products (not a very realistic scenario, I guess, but it serves purely as an example).

First, we are going to create a table where we store, for each user, which colors they can see.
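
A sketch of what that could look like (the table name UserRights and its schema are my own example, not from the original post):

    CREATE TABLE dbo.UserRights
    (
        UserName sysname      NOT NULL,  -- database user name, as returned by USER_NAME()
        Color    nvarchar(15) NULL       -- matches Production.Product.Color in AdventureWorks
    );

    -- Example rows: this (hypothetical) user may see red and black products
    INSERT INTO dbo.UserRights (UserName, Color)
    VALUES (N'JohnDoe', N'Red'), (N'JohnDoe', N'Black');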

Second, we are going to create a function that checks whether a user can see a certain color:
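
A minimal version, written as the inline table-valued function with SCHEMABINDING that row-level security requires (the function name is my example):

    CREATE FUNCTION dbo.fn_ColorPredicate (@Color AS nvarchar(15))
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
    RETURN
        -- Returns a row (meaning "allowed") only when the current database user
        -- has an entry in UserRights for this color
        SELECT 1 AS fn_result
        FROM dbo.UserRights AS ur
        WHERE ur.Color = @Color
          AND ur.UserName = USER_NAME();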

Make sure to add yourself to the UserRights table, or add an exception to the function above to make sure certain roles can see all colors regardless.

And lastly, we need to enable row-level security on the products table:
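
Enabling it comes down to creating a security policy that binds the predicate function to the table; roughly:

    CREATE SECURITY POLICY dbo.ColorFilter
    ADD FILTER PREDICATE dbo.fn_ColorPredicate(Color)
    ON Production.Product
    WITH (STATE = ON);

    -- Quick test: impersonate a user and check what the filter returns
    EXECUTE AS USER = 'JohnDoe';
    SELECT Name, Color FROM Production.Product;  -- only the allowed colors come back
    REVERT;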

There you go!

You may want to restrict access based on roles rather than users, but you can build on this example and try it out yourself 🙂

Recover the lost query

Just recently I was writing a very complicated query. It took me two hours to write. In the end I saved it as a view in the development database. I had not yet synchronized it back to the solution in Visual Studio, because I wanted to add a few related views for the problem I was solving. Maybe it would have been better to develop from Visual Studio, but sometimes you just want to see the results of the query and merge it back to Visual Studio later.

At that moment, however, a colleague of mine did a publish from his version of Visual Studio to the database on the development server. Gone were all my views and my hard work. 🙁 Lessons learned, of course. But then I felt a mission: the mission to find my lost query. I would rather spend two hours trying to recover my query than rewrite it from scratch. In the end I found this gem:
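
It is a query against the plan cache, along these lines:

    -- Digs the text of recently executed statements out of the plan cache
    SELECT execquery.last_execution_time AS [Date Time],
           execsql.text AS [Script]
    FROM sys.dm_exec_query_stats AS execquery
    CROSS APPLY sys.dm_exec_sql_text(execquery.sql_handle) AS execsql
    ORDER BY execquery.last_execution_time DESC;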

(taken from: http://stackoverflow.com/questions/15034832/recover-unsaved-sql-query-scripts)

And I did find my lost query!

So if you ran a query against your SQL database and accidentally closed the session, you might want to try this to get it back.

Strangely enough I did not find the ALTER VIEW statement, but I did find my SELECT statement. So you might have to search a little. But at least there is hope.

Trim the fat (put your data warehouse on a diet)

The idea of organizational Business Intelligence was to create "one version of the truth", one Ring to rule them all... errr, well, that was the idea. Away with all those Excel files and bring on the conformed BI reports, right? Of course it is a great idea to make sure you have a data warehouse where your data is guaranteed "right". For example, if you want to know how much profit you made last year, you don't want multiple interpretations!

How, then, should we view the rise of Power BI and self-service BI? Hello, different versions of the truth?

The real question is: how many reports really need to be exactly the same across the whole enterprise? How much integration do you need? In the past, organizations tried to fit everything into a big Enterprise Data Warehouse. Just saying EDWH makes managers shiver, remembering huge costs and little payback. Or big BI competence centers handling a huge backlog, because everything has to come from the EDWH: lots of waiting time and saying "no" because there is too much work.

Sure, it helps to have master data available across the business, and data quality is not a bad idea either. But you don't have to push ALL the data into the data warehouse. I think there is more data for which you can give the data analyst the freedom to decide how they want to see it than we used to think.

Instead of this situation, where blue is the managed BI or enterprise data warehouse and purple is the self-service BI layer:

[Figure: a large managed BI / enterprise data warehouse layer (blue) with a small self-service BI layer (purple) on top]

We could "trim the fat" of the data warehouse and give users more freedom to go to this situation (taken from Motion10's data platform model):

 

The most important element here is to make sure you have a good data catalog, where users can easily find the datasets they want to use for their analysis. It will be interesting to see how Azure Data Catalog can play a role in this. At the moment I often use the datasets in Power BI, in combination with content packs, as a catalog. All datasets have clear, understandable names; that is as good a catalog as any other. If a report is complicated in any way, its first page contains a short, self-explanatory description.

Best Practice: no more queries in frontend tools

You may have experienced it: your report was working just fine, until you published a database change and oops! A report doesn't work anymore. Especially when you have 70+ reports running their own queries, this can be a pain.

Why would a report contain a query? Well, because it is easy and quick to develop. So why not?

The reason is that you don't get to manage all those queries at all. They are the hidden queries: long forgotten, never tuned, and unfindable in your database.

Please use views or stored procedures to feed your datasets in Reporting Services. If the query lives in the database and you use a daily build to check whether the code still works, you will be notified that a query is broken before you publish it to production. The same goes for all front-end tools, including Power BI. At the very least use a view; the tools going to the database should not need to know the data model at all.

The idea is that a view can remain consistent while the underlying data model changes as needed.
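
As a minimal illustration (the view and table names below are a hypothetical AdventureWorks-style example, not from the original post):

    -- The report only ever selects from this view, never from the tables
    CREATE VIEW rpt.ProductSales
    AS
    SELECT p.Name  AS ProductName,
           p.Color AS Color,
           SUM(sod.LineTotal) AS Revenue
    FROM Sales.SalesOrderDetail AS sod
    JOIN Production.Product     AS p ON p.ProductID = sod.ProductID
    GROUP BY p.Name, p.Color;

    -- When the underlying tables are refactored, only this view definition
    -- changes; the reports keep selecting from rpt.ProductSales unchanged.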

Azure Data Factory, first thoughts

I have been trying out Azure Data Factory.

You can wonder whether ADF is more like SSIS in the cloud or SQL Server Agent in the cloud. It does not quite feel like a pure ETL tool. Coming from SSIS on-premises, this is a completely different experience. My first impression is that the UI is not very intuitive: it takes a while before you realize where you need to be when solving a problem. Another thing I found is that there is not very much about it online. Stuck with a problem? You're likely to be on your own. Googling my error message got me nothing, which is quite refreshing for me (unthinkable for a SQL Server error) 🙂

It is quite easy to copy data from A to B. In other words, you can do the E and the L: Extract and Load. The T (Transform) is not very easily done. If you want to get some data from Azure Blob Storage and load it into your Azure SQL database, this is a good tool. But if you want to do complex transformations? Not so sure yet...