Technical Considerations

Deep dive into setting up microservices to facilitate ledger interactions.

Proxy Signing Service

Sawtooth is built for enterprises, and that brings limitations that must be considered before allowing public write access to the ledger. To interact with Transaction Processors (TPs), the state addresses a transaction touches must be declared inside a signed batch of transactions. These complexities led to building a simple Relay Process Command (RPC), described below. It is built with Node.js and runs inside an Express server instance, 'app':

index.js { transaction proxying }
app.get('/adv-build', (req, res, next) => {
  var auth = false
  if (req.session) auth = authed(req.session.user)
  if (!auth) {
    delete req.session
    // respond with an error or redirect to login here
  } else {
    // Sections [1]-[4] below go here. At this point all state access is
    // permissioned by the Batch Processor (BP) signature (authorization is
    // important): the transactions are listed, and the batch is built,
    // signed, and "finished" into a package.
  }
})

This authorization check would normally live in middleware; it is inlined here for linearity.
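As a sketch of that refactor, the check can be hoisted into an Express-style middleware function. The `requireAuth` name is an assumption, and `authed` is stubbed here; in the service it verifies the session's JWT:

```javascript
// Hypothetical middleware form of the inline check above.
// authed() is stubbed; the real helper verifies a JWT against a secret key.
function authed(user) { return !!(user && user.token === 'valid') }

function requireAuth(req, res, next) {
  if (req.session && authed(req.session.user)) return next()
  delete req.session
  res.status(401).send('Unauthorized')
}

// Usage: app.get('/adv-build', requireAuth, (req, res) => { /* build batch */ })
```

Because middleware is just a `(req, res, next)` function, it can be unit-tested with plain mock objects, no Express required.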

Here a JSON Web Token is verified against our secret key. The token could be delivered by magic link.

function authed(user) {
  var auth = false
  try {
    if (user && localUsers[jwt.verify(user.token, sessionSecretKey).sub] != null) {
      auth = true // if the session maps to a known local user, proceed to the page
    }
  } catch (e) { console.log('JWT Error: ' + e) }
  return auth
}

Once the user is verified, the form data is handed over; here it arrives in the URL query string:

index.js [1]
var payload = {
  [req.query.vt || 'Verb']: req.query.verb || 'inc',
  [req.query.nt || 'Name']: req.query.action || 'test',
  ['Value']: req.query.desc || parseInt(req.query.num) || 1 // query key for this field is elided in the source; literal used
}
if (req.query.addr || req.query.rt) {
  payload[req.query.rt || 'addr'] = req.query.addr || null
}
if (req.query.rt === 'emre') { // selector elided in the source; 'rt' assumed
  payload = `${
    req.query.verb || '0002019CO2T8642'},${
    req.query.action || 'create'},${
    req.query.desc || '000_2019_CO2_T_8642_test_Qmtest'}`
}

This supports building any type of transaction. It was modeled on the intkey transaction family, but includes a way to switch to any transaction payload built from three strings. More complex transactions will need more work here, and refinement is suggested. By default this generates a transaction that increments an intkey value at the 'test' address. If emission records (emre) are requested, a test transaction is built for that family instead. From here we will walk through handling both of these types.
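To make the two shapes concrete, here is a standalone sketch of the payload selection above, with `req.query` modeled as a plain object of strings. The `rt` selector is an assumption where the source elides it:

```javascript
// Standalone sketch of the payload construction above.
function buildPayload(query) {
  var payload = {
    [query.vt || 'Verb']: query.verb || 'inc',
    [query.nt || 'Name']: query.action || 'test',
    ['Value']: query.desc || parseInt(query.num) || 1
  }
  if (query.addr || query.rt) {
    payload[query.rt || 'addr'] = query.addr || null
  }
  if (query.rt === 'emre') { // 'rt' as the family selector is an assumption
    payload = `${query.verb || '0002019CO2T8642'},${
      query.action || 'create'},${
      query.desc || '000_2019_CO2_T_8642_test_Qmtest'}`
  }
  return payload
}

// Default request → intkey object payload:
// buildPayload({}) → { Verb: 'inc', Name: 'test', Value: 1 }
// Emission-record request → comma-separated string payload:
// buildPayload({ rt: 'emre' })
//   → '0002019CO2T8642,create,000_2019_CO2_T_8642_test_Qmtest'
```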

index.js [2]
let payloadBytes
var inputs = [], // addresses to read
    outputs = [] // addresses to write
// crypto = require('crypto'); full-length hash assumed, since the source elides the body
const _hash = (x) => crypto.createHash('sha512').update(x).digest('hex').toLowerCase()
const _hash64 = (x) =>
  crypto.createHash('sha512').update(x).digest('hex').toLowerCase().substring(0, 64)
const INT_KEY_FAMILY = 'intkey'
const INT_KEY_NAMESPACE = _hash(INT_KEY_FAMILY).substring(0, 6)
const EMISSIONS_KEY_FAMILY = 'emre'
const EMISSIONS_KEY_NAMESPACE = _hash(EMISSIONS_KEY_FAMILY).substring(0, 6) // definition missing in the source; built by analogy
switch (req.query.rt) { // selector elided in the source; 'rt' assumed
  case 'emre':
    inputs.push(EMISSIONS_KEY_NAMESPACE + _hash64(payload.split(',')[0])) // test for allowed names?
    outputs.push(EMISSIONS_KEY_NAMESPACE + _hash64(payload.split(',')[0]))
    payloadBytes = Buffer.from(payload, 'utf-8')
    break // without this, the default case would overwrite payloadBytes
  case 'intkey': // intkey
  default: // intkey
    inputs.push(INT_KEY_NAMESPACE + _hash(payload.Name).slice(-64))
    outputs.push(INT_KEY_NAMESPACE + _hash(payload.Name).slice(-64))
    payloadBytes = cbor.encode(payload) // cbor = require('cbor')
}

TPs expect properly formatted payloads inside properly signed and addressed batches. How the TP is written determines the payload encoding it expects: IntKey expects Concise Binary Object Representation (CBOR), while the emission-records TP uses a standard UTF-8 buffer, available natively in Node.
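A quick sketch of the emre side of that encoding, using only Node's built-in Buffer (the intkey path would instead use `cbor.encode(payload)` from the 'cbor' package, not exercised here to keep the sketch dependency-free):

```javascript
// The emre TP reads a plain UTF-8 buffer; Node's Buffer handles this natively.
const payload = '0002019CO2T8642,create,000_2019_CO2_T_8642_test_Qmtest'
const payloadBytes = Buffer.from(payload, 'utf-8')

// On the TP side the bytes decode back to the same comma-separated fields:
const fields = payloadBytes.toString('utf-8').split(',')
// fields[1] === 'create'
```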

The intended TP likewise determines the addressing scheme. Both of these transactions read from a single input address and write to the same output address. Building reference tables or exploiting internal data dependencies under these addressing schemes would require adding further addresses to the input and output arrays.

Now that we have our payload encoded for each possible TP, let's look at how signing occurs.

index.js [3]
const transactionHeaderBytes = protobuf.TransactionHeader.encode({
  familyName: req.query.rt || 'intkey', // selector elided in the source; 'rt' assumed. Can these be arrays? This could be a governance limitation
  familyVersion: req.query.version || '1.0',
  inputs: inputs,
  outputs: outputs,
  signerPublicKey: signer.getPublicKey().asHex(), // public key for the signer
  batcherPublicKey: signer.getPublicKey().asHex(), // 'signer' is the batch processor key
  dependencies: [],
  payloadSha512: createHash('sha512').update(payloadBytes).digest('hex')
}).finish()
const signature = signer.sign(transactionHeaderBytes) // sign the header bytes
const transaction = protobuf.Transaction.create({
  header: transactionHeaderBytes,
  headerSignature: signature,
  payload: payloadBytes
})
const transactions = [transaction] // a batch can include multiple transactions
const batchHeaderBytes = protobuf.BatchHeader.encode({
  signerPublicKey: signer.getPublicKey().asHex(),
  transactionIds: => txn.headerSignature)
}).finish()
const bsignature = signer.sign(batchHeaderBytes)
const batch = protobuf.Batch.create({
  header: batchHeaderBytes,
  headerSignature: bsignature,
  transactions: transactions
})
const batchListBytes = protobuf.BatchList.encode({
  batches: [batch]
}).finish()

All that's left is to send the batch off to our Sawtooth REST API, normally running on port 8008:

index.js [4]{ // request = require('request')
  url: 'http://localhost:8008/batches', // elided in the source; default REST API endpoint assumed
  body: batchListBytes,
  headers: { 'Content-Type': 'application/octet-stream' }
}, (err, response) => {
  if (err) {
    res.setHeader('Content-Type', 'application/json')
    return res.send(JSON.stringify({ payload, err }, null, 3))
  }
  var link = JSON.parse(response.body).link
  if (link) {
    console.log('Followed: ' + link)
    request.get({ url: link }, (e, r) => { // follow the status link
      if (e) return console.log(e)
      res.setHeader('Content-Type', 'application/json')
      var body = JSON.parse(r.body)
      var statusLink =
      res.send(JSON.stringify({ payload, inputs, outputs, data:, link: statusLink }, null, 3))
    })
  } else {
    res.setHeader('Content-Type', 'application/json')
    res.send(JSON.stringify({ payload, inputs, outputs, body: JSON.parse(response.body) }, null, 3))
  }
})
Following that, the confirmation/status link is re-wrapped and returned to the user.