Partner
How to get the largest bitcoin transactions with Blockchair's API
2020-11-06 09:50:28
Guillaume Souillard
Today, we are going to see how to use Blockchair's API to index the biggest bitcoin transactions. ClankApp has been using Blockchair for a year now. The quality of the service is awesome and the team is always available.
What is Blockchair?
Today, we use blockchain explorers every day to track our transactions or extract stats about cryptocurrencies. Blockchair is the most advanced search engine for exploring 17 blockchains (and more in the future).
Blockchair keeps your personal data secret and has a "Crypto-anarchist corner" that lets you use Blockchair through the Tor Browser, for example. They aspire to become the DuckDuckGo of the blockchain community and advocate for the adoption of this technology.
It is for all these reasons that we decided to work with Blockchair.
Blockchair's API ❤️
Blockchair's API represents 40% of our data, mainly for bitcoin, ethereum and ERC-20 tokens.
But today we will focus on bitcoin and see how to index the biggest bitcoin transactions.
API requirements, pricing and rate limits
As you can see in the official documentation, you can use Blockchair's API for free if you don't need many calls (up to 1,440 requests a day on the Free Plan).
If you need more calls, consult Blockchair Plans here.
If you are a student, you can use your GitHub student account to get premium access to Blockchair's API ✌🏼✌🏼.
Have you made your choice? Let's see how to use the API
As a first step, keep the documentation open so you can follow everything. We're just going to focus on the infinitable endpoints (SQL-like queries) for transactions.
This endpoint will allow us to make requests with conditions to find bitcoin transactions.
Our first request
# Transactions endpoint
GET https://api.blockchair.com/{:btc_chain}/transactions?{:query}
Remember the goal: we want to retrieve bitcoin transactions with a total output value greater than $1,000,000.
For that, we will replace {:btc_chain} with bitcoin and construct our query like this: q=output_total_usd(1000000..).
Which gives us:
GET https://api.blockchair.com/bitcoin/transactions?q=output_total_usd(1000000..)
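If you prefer scripting the call to pasting it into Postman, here is a minimal Python sketch. It assumes the third-party requests library is installed and uses the URL above verbatim, without an API key (which is fine on the Free Plan); everything else is placeholder plumbing of my own.

# Minimal sketch: fetch transactions whose total output exceeds $1,000,000.
# Requires the `requests` package (pip install requests).
import requests

url = "https://api.blockchair.com/bitcoin/transactions?q=output_total_usd(1000000..)"

response = requests.get(url, timeout=30)
response.raise_for_status()   # fail loudly on HTTP errors
payload = response.json()     # same structure as the JSON shown below

print(len(payload["data"]), "transaction(s) in this page of results")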
If you run this request, in Postman for example, you will get a result like this:
{
  "data": [
    {
      "block_id": 656595,
      "id": 586132324,
      "hash": "5efb8d3b2e67fb7984872bff49103fe59419a068cb0b9fec3c0196d699715a7a",
      "date": "2020-11-12",
      "time": "2020-11-12 14:37:46",
      "size": 3003,
      "weight": 9729,
      "version": 1,
      "lock_time": 656594,
      "is_coinbase": false,
      "has_witness": true,
      "input_count": 3,
      "output_count": 61,
      "input_total": 25000841362,
      "input_total_usd": 3923410,
      "output_total": 25000619406,
      "output_total_usd": 3923370,
      "fee": 221956,
      "fee_usd": 34.8318,
      "fee_per_kb": 73911.4,
      "fee_per_kb_usd": 11.599,
      "fee_per_kwu": 22813.9,
      "fee_per_kwu_usd": 3.5802,
      "cdd_total": 22.899962184847
    },
    ...
  ],
  "context": {
    "code": 200,
    "source": "A+T",
    "limit": 1,
    "offset": 0,
    "rows": 1,
    "pre_rows": 1,
    "total_rows": 2065884,
    "state": 656595,
    "cache": {
      "live": true,
      "duration": 60,
      "since": "2020-11-12 14:54:27",
      "until": "2020-11-12 14:55:27",
      "time": null
    },
    "api": {
      "version": "2.0.68",
      "last_major_update": "2020-07-19 00:00:00",
      "next_major_update": null,
      "documentation": "https://blockchair.com/api/docs",
      "notice": "Beginning July 19th, 2020 we start enforcing request cost formulas, see the changelog for details"
    },
    "time": 2.0017759799957275,
    "render_time": 0.01679706573486328,
    "full_time": 2.018573045730591,
    "request_cost": 1
  }
}
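Before digging into the individual fields, glance at the context block: code is the response status, total_rows is how many transactions currently match the query, and request_cost is what the call appears to count against your daily allowance (see the request-cost notice above). Continuing the Python sketch, a quick sanity check might look like this; the field names come straight from the response, the rest is my own naming.

# Sanity check on the response metadata (continuation of the sketch above).
ctx = payload["context"]
if ctx["code"] != 200:
    raise RuntimeError(f"Blockchair returned code {ctx['code']}")
print("Transactions matching the query:", ctx["total_rows"])
print("Cost of this request:", ctx["request_cost"])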
The data field contains an array of database rows. Each row is in the following format:
Column | Type | Description |
---|---|---|
block_id | int | The height (id) of the block containing the transaction |
id | int | Internal Blockchair transaction id (not related to the blockchain, used for internal purposes) |
hash | string [0-9a-f]{64} | Transaction hash |
date | string YYYY-MM-DD | The date of the block containing the transaction (UTC) |
time | string YYYY-MM-DD HH:ii:ss | Timestamp of the block containing the transaction (UTC) |
size | int | Transaction size in bytes |
weight † | int | Weight of transaction in weight units |
version | int | Transaction version field |
lock_time | int | Lock time: can be either a block height, or a unix timestamp |
is_coinbase | boolean | Is it a coinbase (generating new coins) transaction? (For such a transaction input_count is equal to 1 and means there's a synthetic coinbase input) |
has_witness † | boolean | Is there a witness part in the transaction (using SegWit)? |
input_count | int | Number of inputs |
output_count | int | Number of outputs |
input_total | int | Input value in satoshi |
input_total_usd | float | Input value in USD |
output_total | int | Output value in satoshi |
output_total_usd | float | Total output value in USD |
fee | int | Fee in satoshi |
fee_usd | float | Fee in USD |
fee_per_kb | float | Fee per kilobyte (1000 bytes) of data in satoshi |
fee_per_kb_usd | float | Fee per kilobyte of data in USD |
fee_per_kwu † | float | Fee per 1000 weight units of data in satoshi |
fee_per_kwu_usd † | float | Fee per 1000 weight units of data in USD |
cdd_total | float | The number of destroyed coindays |
More info about data fields here
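As a small illustration, and purely as a continuation of the Python sketch above, here is one way to print a human-readable summary of each row; the conversion simply divides output_total by 100,000,000 since it is expressed in satoshi.

# Summarise each returned transaction (continuation of the sketch above).
for tx in payload["data"]:
    btc = tx["output_total"] / 100_000_000   # output_total is given in satoshi
    usd = tx["output_total_usd"]
    print(f"{tx['time']}  {tx['hash']}  {btc:.4f} BTC  (~${usd:,.2f})")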
Now, we want to adjust our request so that it returns transactions with a total output value greater than $1,000,000 and an output count equal to 1, limited to a single result and ordered on the time field in ascending order.
GET https://api.blockchair.com/bitcoin/transactions?q=output_total_usd(1000000..),output_count(1)&limit=1&s=time(asc)
I'll let you try 😉
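And if you would rather start from code, here is the same refined request in the style of the earlier Python sketch; the URL is copied verbatim from above, the rest is plumbing of my own.

# Refined query: output over $1,000,000, exactly one output,
# a single row, sorted by time in ascending order.
import requests

url = (
    "https://api.blockchair.com/bitcoin/transactions"
    "?q=output_total_usd(1000000..),output_count(1)"
    "&limit=1&s=time(asc)"
)
rows = requests.get(url, timeout=30).json()["data"]
for tx in rows:
    print(tx["time"], tx["hash"], tx["output_total_usd"])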
To conclude
We have only reviewed a tiny fraction of what Blockchair's API is able to do; I'll let you form your own opinion. Be curious, and consult the official documentation for more information.
Disclaimer: This article is for informational purposes only and you should not construe any such information as investment or other advice. ClankApp services do not engage in any trading activities. We urge everyone to do their own research and draw their own conclusions.