Power BI Crypto Trade Analysis in Real-Time – 3 Easy Steps

It’s been a few weeks since I first pondered what would make a suitable first post to kick-start this blog. The question was – which problem specifically could I address, and is it something I care about? Then it hit me. Combine a passion for trading with a passion for analytics! So without further ado, here is how to perform real-time Power BI Crypto trade analysis in 3 easy steps. In this demo we will be using an open-source client to connect to a cryptocurrency exchange. Subsequently, we will feed trades in real-time to a Power BI stream and display them on a dashboard.
Power BI Crypto Trade Data Pipeline
Note: The design pattern outlined below facilitates any use case where one requires real-time reporting. In particular, it is very useful where a reliable API data source is available for consumption. The source code for this tutorial can be found on GitHub.


Step 1 – Configure a Real-Time Crypto Trade Data Feed

Getting Started With Binance

Initially, we need to connect to a cryptocurrency exchange and pull down real-time trades via a websocket. In terms of a trading pair, let’s go with Bitcoin/USDT (USD Tether) as it is one of the highest volume markets on most crypto exchanges. However, neither the trading pair nor the choice of exchange is important. In other words, the focus will be on choosing a reliable data source and knowing how to set it up. We’re assuming that you have an account on Binance and know how to set up an API key. In any case, head on over to Binance and set up an account if needed. Afterwards, here is how to set up an API key. Make sure to disable trading and withdrawal permissions. Finally, make a note of the API Key & Secret.

Binance API

Binance offers a rich API for interacting with the exchange. Specifically, it includes a standard REST API as well as websockets. Data is generally classified as public or private. For example, public data feeds include:
  • Ticker information
  • Trading pairs
  • Historic spot prices
Examples of private data feeds include:
  • Account balances
  • Trade history
  • Order history
The good news is that a large and active community of developers maintains open-source clients that hide the complexity of the API. Therefore, today we are going to use a .Net wrapper for the Binance API.
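As an aside, public data needs no client library at all. As a quick illustration (in Python rather than the C# used for the project), here is how the latest spot price could be pulled from Binance’s public REST ticker endpoint, no API key required:

```python
import json
from urllib.request import urlopen

TICKER_URL = "https://api.binance.com/api/v3/ticker/price?symbol={symbol}"

def parse_price(payload: str) -> float:
    """Extract the price from a Binance ticker response body."""
    # The endpoint returns e.g. {"symbol": "BTCUSDT", "price": "43012.55000000"}
    return float(json.loads(payload)["price"])

def latest_price(symbol: str) -> float:
    """Fetch the latest traded price for a trading pair (network call)."""
    with urlopen(TICKER_URL.format(symbol=symbol)) as resp:
        return parse_price(resp.read().decode())
```

For streaming trades, however, the websocket API is the right tool, which is where the client library earns its keep.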

Visual Studio Project Creation

Fire up a new .Net Core console project in Visual Studio. I called it CryptoTradesPowerBI, but feel free to name yours as desired. Then, using NuGet or the PM console, install the following packages:
  • Binance.Net (JKorf, Nikkozp, CaptHolley) v6.0.2*
  • CryptoExchange.Net (JKorf) v3.0.14*
* At time of writing. The project is going to have 4 classes:
  • Program – Entry point of the application
  • APIConfig – Binance & Power BI API configuration
  • Trade – Representation of a trade
  • TradeStream – Contains a static method to initiate a stream from Binance to Power BI
In our APIConfig.cs class, let’s configure the API connection details. Note: The Binance client is shipped with the Binance Websocket connection string built-in. It is my understanding that the developers maintaining the client keep the connection string up to date and update the release notes accordingly.

APIConfig Class

using System;

namespace CryptoTradesPowerBI
{
    class APIConfig
    {
        // Binance & Power BI configuration data

        // Binance API
        private static string _binanceAPIKey    = Environment.GetEnvironmentVariable("BinanceAPIKey");
        private static string _binanceAPISecret = Environment.GetEnvironmentVariable("BinanceAPISecret");
        private static string _binanceSymbol    = "BTCUSDT";

        // Power BI API
        private static string _powerBIAPIURL    = "";

        public static string GetBinanceAPIKey()
        {
            return _binanceAPIKey;
        }

        public static string GetBinanceAPISecret()
        {
            return _binanceAPISecret;
        }

        public static string GetBinanceSymbol()
        {
            return _binanceSymbol;
        }

        public static string GetPowerBIAPIURL()
        {
            return _powerBIAPIURL;
        }
    }
}
Create the environment variables using the Binance API Key & Secret created earlier. Notice the Power BI Streaming Dataset API URL. This is one I previously set up and have altered for security. In Step 2, you will have the opportunity to create your own. For now, we need a class to represent a trade.

Trade Class

using System;

namespace CryptoTradesPowerBI
{
    public class Trade
    {
        // This class represents a Trade
        // A Trade object is instantiated using the stream event returned from Binance
        // The Trade data is then pushed to the Power BI Streaming Dataset

        public readonly string  Symbol;             // e.g. BTCUSDT
        public readonly string  Timestamp;          // Date & Time of the Trade
        public readonly decimal Price;              // Price Trade took place at
        public readonly int     TradeCount;         // # of Trades (including this) since stream began
        public readonly decimal AverageQuantity;    // Average Trade quantity
        public readonly decimal CumulativeQuantity; // Cumulative Trade quantity (including this)
        public readonly decimal Quantity;           // This Trade's quantity

        public Trade(string symbol, DateTime timestamp, int previousTradeCount, decimal previousCumulativeQuantity, decimal price, decimal quantity)
        {
            Symbol             = symbol;
            Timestamp          = timestamp.ToString("yyyy-MM-ddTHH:mm:ss.fffZ"); // Power BI expects this format
            TradeCount         = previousTradeCount + 1;
            CumulativeQuantity = previousCumulativeQuantity + quantity;
            AverageQuantity    = CumulativeQuantity / TradeCount;
            Price              = price;
            Quantity           = quantity;
        }
    }
}
We will use the same object schema above to create the Power BI Streaming Dataset in Step 2. Next, we define how to connect to Binance via a websocket, pull down live trades and send them to Power BI. We could have put this logic in the Program class, but it’s better to create a wrapper in a separate class for code maintainability. Also, we need to make sure that Program.cs does not have anything to do with the implementation detail of the stream.
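To make the wire format concrete before we write the streaming code, here is a quick sketch (in Python, purely illustrative and not part of the C# project) of the JSON array of rows that a Power BI streaming dataset push expects, with field names mirroring the Trade class above:

```python
import json
from datetime import datetime, timezone

def trade_row(symbol, timestamp, trade_count, cumulative_qty, price, qty):
    """Build one Power BI row dict matching the Trade class schema."""
    return {
        "Symbol": symbol,
        # ISO 8601 with millisecond precision, matching the C# format string
        "Timestamp": timestamp.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z",
        "Price": price,
        "TradeCount": trade_count,
        "AverageQuantity": cumulative_qty / trade_count,
        "CumulativeQuantity": cumulative_qty,
        "Quantity": qty,
    }

# The push API expects an array of rows, even for a single event
row = trade_row("BTCUSDT", datetime(2021, 1, 1, tzinfo=timezone.utc), 1, 0.5, 29000.0, 0.5)
payload = json.dumps([row])
```

This is exactly why the C# code below wraps each serialized trade in "[" and "]" before posting.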

TradeStream Class

using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Binance.Net;
using Binance.Net.Objects.Spot;
using CryptoExchange.Net.Authentication;
using CryptoExchange.Net.Logging;
using Newtonsoft.Json;

namespace CryptoTradesPowerBI
{
    class TradeStream
    {
        // Client required for posting to Power BI
        static HttpClient httpClient = new HttpClient();

        // Initiate stream
        public static void InitiateStream(string binanceAPIKey, string binanceAPISecret, string symbol, string powerBIAPIURL)
        {
            // Trade aggregations
            int tradeCount = 0;
            decimal cumulativeQuantity = 0;

            // Configure websocket
            BinanceSocketClient.SetDefaultOptions(new BinanceSocketClientOptions()
            {
                ApiCredentials = new ApiCredentials(binanceAPIKey, binanceAPISecret),
                LogVerbosity   = LogVerbosity.None, // Change to Debug if required
                LogWriters     = new List<TextWriter> { Console.Out }
            });

            var socketClient = new BinanceSocketClient();
            Console.WriteLine("Connecting to Binance Websocket @ " + socketClient.BaseAddress + "\n");

            // Binance exposes many streams and we can connect to whichever ones we want
            // Here we are connecting to trade updates for a specified pair
            var successSingleTicker = socketClient.Spot.SubscribeToTradeUpdates(symbol, data =>
            {
                // Create Trade instance from the previous aggregation readings
                var trade = new Trade(symbol, data.TradeTime, tradeCount, cumulativeQuantity, data.Price, data.Quantity);

                // Update running aggregations
                tradeCount         = trade.TradeCount;
                cumulativeQuantity = trade.CumulativeQuantity;

                // Convert to string
                var jsonString = JsonConvert.SerializeObject(trade);

                // Write to console
                Console.WriteLine(jsonString + "\n");

                // Post to Power BI Streaming Dataset
                try
                {
                    var postToPowerBi = PowerBIPostAsync(powerBIAPIURL, "[" + jsonString + "]");
                }
                catch (Exception e)
                {
                    Console.WriteLine(e.Message);
                }
            });
        }

        // Async post to Power BI method
        static async Task<HttpResponseMessage> PowerBIPostAsync(string pbiurl, string tradeEvent)
        {
            HttpContent         content  = new   StringContent(tradeEvent);
            HttpResponseMessage response = await httpClient.PostAsync(pbiurl, content);
            return response;
        }
    }
}
All the above code does is define a method that takes API config parameters and creates a data pipeline from Binance to Power BI. Once the Binance websocket is open, we subscribe to a stream. Specifically, we will subscribe to the BTC/USDT trading pair. After subscription, the stream sends us details of trades as they transact on the exchange. Out of the return object, the TradeTime, Quantity and Price are of interest to us.

Crypto Trade Running Aggregations

There are also running aggregations such as Average Quantity, Cumulative Quantity and Trade Count. As Cumulative Quantity and Trade Count are members of the Trade class, we need to pass in the previous readings to the constructor each time we generate a Trade object. Alternatively, we could separate the concerns of individual trades and running aggregations. However, since the Power BI Streaming Dataset is expecting a single schema, it’s convenient to let the Trade class calculate and store the updated aggregations. Once the Trade object is created, it’s converted to a JSON string and posted to the Power BI API endpoint. Additionally, we write the string to the console to make sure the app is running correctly and to get a feel for the volume of trades being generated. Now that the core of the application is complete, all that’s left is to read the API configuration into Program.cs and initiate the stream.
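The running aggregation logic is simple enough to sanity-check by hand. In Python pseudocode terms (illustrative only, with made-up quantities), the update rule per trade is:

```python
def update_aggregations(prev_count, prev_cumulative, quantity):
    """Apply one trade to the running aggregations, returning the new readings."""
    count = prev_count + 1
    cumulative = prev_cumulative + quantity
    average = cumulative / count
    return count, cumulative, average

# Three hypothetical trades of 0.25, 0.5 and 0.75 BTC
count, cumulative = 0, 0.0
for qty in (0.25, 0.5, 0.75):
    count, cumulative, average = update_aggregations(count, cumulative, qty)

print(count, cumulative, average)  # 3 1.5 0.5
```

The Trade constructor performs exactly this update, which is why the stream handler must feed each new Trade the previous count and cumulative quantity.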

Program Class

using System;

namespace CryptoTradesPowerBI
{
    class Program
    {
        static void Main(string[] args)
        {
            // Import Binance & Power BI config
            var binanceAPIKey    = APIConfig.GetBinanceAPIKey();
            var binanceAPISecret = APIConfig.GetBinanceAPISecret();
            var binanceSymbol    = APIConfig.GetBinanceSymbol();
            var powerBIAPIURL    = APIConfig.GetPowerBIAPIURL();

            try
            {
                // Start Stream
                Console.WriteLine("**** Powering up Binance to Power BI {0} stream ****\n", binanceSymbol);
                TradeStream.InitiateStream(binanceAPIKey, binanceAPISecret, binanceSymbol, powerBIAPIURL);
                Console.ReadLine(); // Keep the console open while the stream runs
            }
            catch (Exception e)
            {
                Console.WriteLine(e.Message);
            }
        }
    }
}

Step 2 – Create a Streaming Dataset in Power BI

Sign in to Power BI online or register here. Then, navigate to your preferred workspace and click on New -> Streaming dataset.
Streaming Dataset Creation
Choose API as the data source. Details on each data source are available here.
Streaming Dataset API
Now we will create the streaming dataset schema. Note, this needs to match the trade event strings we are posting to Power BI in Step 1. Hence we need to configure the schema in the New streaming dataset section as follows:
Power BI Crypto Edit Streaming Dataset

Note on Power BI Historic Data Analysis

Enabling Historic data analysis makes Power BI treat the data as a push dataset. This means the data will persist and be available for analysis even after it has left the stream cache. Critically, this also means that we can create reports on top of the dataset using the full range of Power BI’s visuals. For the purposes of this demo, we will disable Historic data analysis.

Power BI API Endpoint

Once the dataset is created, the Push URL will be displayed on screen. This is the endpoint that our application will use to post events to the stream. Therefore, head back and set the Power BI Streaming Dataset API URL in the APIConfig class we created in Step 1.
Power BI Crypto Push URL
The streaming dataset will show up in your workspace once complete and changes can be made if necessary. With the application ready to go and a streaming dataset created, all that remains is to create a dashboard in Power BI to consume our snazzy new data feed.
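For reference, the push itself is nothing more than an HTTP POST of a JSON row array to that URL, which is what the C# HttpClient does for us. A minimal sketch in Python (the URL below is a placeholder; substitute the Push URL shown when your dataset is created):

```python
import json
from urllib import request

def build_push_request(push_url: str, rows: list) -> request.Request:
    """Prepare a POST of a JSON row array to a Power BI streaming dataset."""
    body = json.dumps(rows).encode("utf-8")
    return request.Request(push_url, data=body,
                           headers={"Content-Type": "application/json"},
                           method="POST")

# Placeholder URL and a single hypothetical row
req = build_push_request(
    "https://api.powerbi.com/beta/<workspace>/datasets/<id>/rows?key=<key>",
    [{"Symbol": "BTCUSDT", "Price": 29000.0}])
# request.urlopen(req) would send it; a 200-level response means the rows were accepted
```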

Step 3 – Power BI Dashboard Creation

Back in your Power BI workspace, select New -> Dashboard and choose an appropriate name. We are going to place 5 tiles in the dashboard. To add a tile, select Edit -> Add a tile -> Custom Streaming Data. Then, select the streaming dataset created in Step 2. The tiles we will create are:
  1. Price Time Series (Line Chart)
  2. Trade Quantity & Average Quantity Time Series (Line Chart)
  3. Cumulative Quantity Time Series (Line Chart)
  4. Latest Price (Card)
  5. Trade Count (Card)
For each time series chart, choose Timestamp for the x-axis and the appropriate series for the y-axis. Also, choose the Symbol field where possible for the legend. For the cards, simply select the appropriate field from the dropdown. Feel free to tweak the display units.

Crypto Trade Analysis – Program Execution

Now that the dashboard is created, all that’s left is to run our program and observe the console & dashboard. So, back in Visual Studio, hit Ctrl + F5 to run the program. If everything is configured correctly, the following output will appear:
Power BI Crypto Websocket Connection
Success! Since BTC/USDT is a high-volume market, trades will start appearing within a second.
Power BI Crypto Live Stream
Nice work! And now for the money shot. Back in the Power BI dashboard, the tiles should start to consume the streaming data and update the visuals in real-time. Indeed, that is what we see!
Power BI Crypto Live Dashboard
That’s it! To summarize, we have successfully created a Power BI dashboard that displays cryptocurrency trades in real-time! If you enjoyed this post, please leave a comment below. Likewise, for more content and news, why not follow me on Twitter or subscribe to my YouTube channel. For direct contact, feel free to use the contact form.

Paul Russell
3 years ago

Coming from a background separated from the .NET stratosphere. Meaning university involved me learning: Java. Following that, a scripting language. Surely not Perl? Yes, PERL.

So, my first ever stint in industry, after nearly 3 years of Object Orientated Programming in university, was Perl (a scripting language(dated scripting language)language).

If you made it past the brackets and the counterintuity of the last sentence, you have successfully learned Perl (a task which I have not yet completed(and never will))

More nested brackets you say….

So after a week of Python, I already knew this was the language for Data Science and Machine Learning. (Sorry SQL, you’re the real god) – Long Lived Win

Then, after focusing on Python and SQL I began to find my base in Data Engineering, so I thought.

With Python and SQL I should surely be able to process data at scale on any cloud platform with no issues?

I’m halfway there, How will I scale?

Well, I haven’t talked about scale.

So cloud platforms, in this example: GCP (My Favourite)

For creating a streaming data platform, these are the components:

Airflow [WHEN TO IMPORT THAT DATA] (Daily/Monthly/Yearly)
**insert apache beam. kafka step**
Google Data Studio [HOW WILL I SHOW MY RESULTS/ANSWERS] (3D/ Bar Charts/Histograms) (Could be PowerBI, Google Data Studio, Looker)

The key element in creating a streaming data platform doesn’t lie in the cloud infrastructure, but lies in the scheduler (Airflow).

I know little about Airflow, but it operates on the premise (cron job) that you create a task for each operation and you can set out the order in which each task is executed within each operation. **The abstraction opportunities here allow for extension and adaption of pre-existing airflow DAGs** (Also with the use of docker we can further abstract any operation in an ETL process)

But after all that, through my high-level explanation of how you would process and visualize streaming data in Python on GCP. (remember I did not mention Apache Beam or Kafka)

It involves quite a few more steps than if you went the C# / PowerBI route.

I hope without me saying that, for your use-case (streaming data), the tool of choice may be an MS Azure stack rather than GCP or AWS. (You will avoid having to learn Apache Beam or any other cloud streaming frameworks)

Purely, for ease-of-use.

