Stuck writing large and nested `if-else if-else` conditions? Trouble following how all these different conditions interact and flow together? Here's a tool that can help: *decision trees*.

Decision trees are a way to model conditional logic in a clear and composable way. Although commonly used in machine learning, they can be just as useful in more "typical" use cases, which we will explore in this article.

This article will provide a brief introduction to trees as a data structure and decision trees as a tool, as well as their application in software development.

We will create a binary decision tree in JavaScript step-by-step and apply it to decide whether a value is a positive number, negative number, zero, or not a number at all.

Read on to find out more!

## What is a tree data structure?

A **tree** data structure is a type of data structure in which data - represented by **nodes** - is connected in such a way that every node **branches out** into zero or more **child nodes**.

Visualising node connections gives the structure the appearance of a tree, hence the name.

## What is a binary tree data structure?

A **binary tree** data structure is a special type of tree data structure where **every node can have up to two child nodes**: a **left child node**, and a **right child node**.

A binary tree begins with a **root node**. The root node can then branch out into left and right child nodes, each child continuing to branch out into left and right child nodes as well.

Nodes that branch out into children are called **non-leaf nodes**. Nodes without children are called **leaf nodes**.

Going through the nodes of a binary tree - **traversing** the tree - gives us the choice of moving to *either* the left or right child node - a *binary choice* - earning this type of tree the name *binary tree*.
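To make the idea concrete, here is a minimal sketch of a binary tree node in JavaScript. The `node` helper is a hypothetical shape for illustration only - the decision tree built later in this article uses plain functions instead of node objects:

```javascript
// A minimal binary tree node: a value plus optional left and right children.
const node = (value, left = null, right = null) => ({ value, left, right });

// A small tree: 1 is the root, 2 and 3 are its children, 4 is a leaf under 2.
const root = node(1, node(2, node(4)), node(3));

console.log(root.left.value);      // 2
console.log(root.left.left.value); // 4
console.log(root.right.left);      // null - 3 is a leaf node
```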

## What is a decision tree?

A **decision tree** is a tool to help visualise *decisions* and the *consequences* of their *outcomes*.

At its simplest, a decision tree contains **decision nodes** and **outcome nodes** (also called *end nodes*).

Decision trees may also contain *chance nodes*. Chance nodes serve as "weights" to favour one family of outcomes over another under certain conditions.
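Chance nodes are outside the scope of the implementation below, but as a hypothetical sketch, one could model a chance node as a function that picks an outcome with a given probability:

```javascript
// A hypothetical chance node: yields `likelyOutcome` with probability `p`
// and `unlikelyOutcome` otherwise. Purely illustrative - the tree built
// later in this article contains no chance nodes.
const chance = (p, likelyOutcome, unlikelyOutcome) =>
  () => (Math.random() < p ? likelyOutcome : unlikelyOutcome);

// With p = 1 or p = 0 the "chance" is deterministic:
console.log(chance(1, "umbrella", "sunglasses")()); // "umbrella"
console.log(chance(0, "umbrella", "sunglasses")()); // "sunglasses"
```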

There are many different ways to visualise decision trees, one example being flowchart symbols.

## What is a binary decision tree?

A **binary decision tree** is a decision tree implemented in the form of a binary tree data structure.

A binary decision tree's *non-leaf nodes represent conditions* and its *leaf nodes represent outcomes*.

By traversing a binary decision tree we can decide on an outcome under a given context and conditions.

## What are decision tree applications?

Decision trees can be applied for predictive modelling in statistics, data mining, and machine learning.

Decision trees can also be applied in game development for building AIs and branching story paths, as well as general development scenarios where there is a need to handle large chains of interconnected conditional statements.

## How to turn conditional statements into binary decision tree nodes?

To turn conditional statements into binary decision tree nodes, we have to treat *conditional statements* and *outcomes* as *arguments* passed to *decision functions*.

We will begin designing our decision tree data structure API by looking at conditional statements in our example.

The `decideNumberSign` function takes in a parameter `x` and attempts to return its sign (`"+"` or `"-"`), `"0"` for zero, or `"?"` if the sign cannot be determined:
```javascript
function decideNumberSign(x) {
  if (x > 0) {
    return "+";
  } else if (x < 0) {
    return "-";
  } else if (x === 0) {
    return "0";
  } else {
    return "?";
  }
}
```

When it comes to making decisions based on a condition, we need to define an outcome for the case of the condition being true, and another outcome for the case of it being false. With that in mind, our decision node function would look like this:

```javascript
decision(x > 0, "+", "-");
```

Combining decision nodes would then look like this:

```javascript
decision(x > 0, "+", decision(x < 0, "-", decision(x === 0, "0", "?")));
```

To support more complex conditions and prevent evaluating conditions on nodes that won't be traversed, we can refactor our condition expressions into functions that will only be called when the node is reached:

```javascript
const isPositive = x => x > 0;
const isNegative = x => x < 0;
const isZero = x => x === 0;

decision(isPositive, "+", decision(isNegative, "-", decision(isZero, "0", "?")));
```

With the API finalised, we can implement the function:

```javascript
const decision = (conditionFunction, trueOutcome, falseOutcome) =>
  (context) => conditionFunction(context) ? trueOutcome : falseOutcome;
```

We can now build a decision tree out of decision nodes, but we can't traverse the tree just yet. To traverse the tree and reach a decision we must be able to test the conditions by calling them with a context.

## How to perform binary decision tree traversal?

To traverse a binary decision tree, we provide a context to the root node of the tree. The root calls its condition function, and every decision node produced as an outcome does the same with that context, until a final outcome is reached.

Let's again start by outlining the API:

```javascript
const context = "number";
const numberSignDecision =
  decision(isPositive, "+", decision(isNegative, "-", decision(isZero, "0", "?")));

decide(context, numberSignDecision);
```

We have to keep in mind that the outcome of our decisions - *our left and / or right child nodes* - will either be a *new decision node* (a function) or a *non-callable value* (anything but a function).

If the outcome of our decision is another decision node, we have to decide the new node's outcome for the current context until we reach a non-callable value.

If the outcome of our decision node is a non-callable value, we return the value.

By deciding the outcome of every decision node we reach in that way, we will effectively traverse the decision tree and reach a decision.

```javascript
const decide = (context, decision) => {
  const outcome = decision(context);
  return typeof outcome === "function" ? decide(context, outcome) : outcome;
};
```

That's it, we're done! That's all there is to creating a simple binary decision tree in JavaScript.

## JavaScript binary decision tree example code

```javascript
// Decision tree API
const decision = (conditionFunction, trueOutcome, falseOutcome) =>
  (context) => conditionFunction(context) ? trueOutcome : falseOutcome;

const decide = (context, decision) => {
  const outcome = decision(context);
  return typeof outcome === "function" ? decide(context, outcome) : outcome;
};

// Example
const isPositive = x => x > 0;
const isNegative = x => x < 0;
const isZero = x => x === 0;

const numberSignDecision =
  decision(isPositive,
    "+",
    decision(isNegative,
      "-",
      decision(isZero,
        "0",
        "?"
      )
    )
  );

const contextValues = ["number", 1, 0, -1, Number.NaN];

for (const value of contextValues) {
  console.log(value, decide(value, numberSignDecision));
}
```

## Homework and next steps

- Improve the `decision()` function to check whether the `conditionFunction` argument is a function before calling it. This will make the function more robust and provide us with a way to short-circuit our decision with truthy or falsy values, which can be very useful for debugging purposes.
- Try turning our binary decision tree into an m-ary decision tree. M-ary decision trees can have more than two child nodes per decision. In their case we may not have `true` and `false` as outcomes, but rather `1` and `0` - as well as any value in between - representing how certain we are in the outcome.
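As one possible sketch of the first exercise (assuming the `decision()` / `decide()` API from this article):

```javascript
// A more robust decision node: if `condition` is not a function, treat it
// as a plain truthy / falsy value. This lets us short-circuit a branch
// with e.g. `decision(true, "+", "-")` while debugging.
const decision = (condition, trueOutcome, falseOutcome) =>
  (context) => {
    const result =
      typeof condition === "function" ? condition(context) : condition;
    return result ? trueOutcome : falseOutcome;
  };

const decide = (context, decisionNode) => {
  const outcome = decisionNode(context);
  return typeof outcome === "function" ? decide(context, outcome) : outcome;
};

console.log(decide(5, decision(true, "+", "-")));       // "+"
console.log(decide(5, decision(x => x < 0, "-", "?"))); // "?"
```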

Thank you for taking the time to read through this article!

Do you have any experience creating decision trees? Have you tried implementing one yourself?

Leave a comment and start a discussion!

## Discussion

You can try to use a technique called currying that is widely used in functional languages. Essentially, it resolves an attribute or two and gives you a function.
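For readers who haven't met the technique: currying means supplying arguments one at a time, with each call returning a new function until all arguments are provided. A brief sketch of how it could replace the hand-written predicates above (the `greaterThan` helper is hypothetical, not part of the article's code):

```javascript
// Currying: supply arguments one at a time; each call returns a function.
const greaterThan = bound => x => x > bound;

// "Resolves an attribute and gives you a function":
const isPositive = greaterThan(0);

console.log(isPositive(5));  // true
console.log(isPositive(-3)); // false
```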

Hello Bruno, thank you for the comment!

There's definitely a lot more that can be done to improve this implementation. I wanted to keep this article simple and focus on explaining the building blocks of a binary decision tree. Leaning into more advanced programming concepts and techniques could have muddied the waters a bit so I decided to stay away. Hopefully that was the right choice! What do you think?

Let me know if you fork or create your own implementation. I'd love to see what you come up with!