This dashboard teaches you why multiple trees outperform a single decision tree.
The data is artificial: a diagonal line separates two classes. All green points belong to class B and lie in the upper left triangle; all red points belong to class A and lie in the lower right triangle. Recall that decision trees split the data only along vertical and horizontal lines, so they can only approximate a diagonal boundary with a staircase pattern.
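The dashboard's actual data generator is not shown; here is a minimal sketch of how such a dataset could be created (the seed, point count, and class labels are illustrative assumptions):

```python
import numpy as np

# Sample points uniformly in the unit square; the diagonal y = x
# separates the two classes (assumed setup, not the dashboard's code).
rng = np.random.default_rng(seed=0)
X = rng.uniform(0, 1, size=(500, 2))
y = (X[:, 1] > X[:, 0]).astype(int)  # 1: upper-left triangle, 0: lower-right
```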
You can see two graphs. The first one shows a single decision tree. It makes a few splits, but they are not enough to capture the complexity of the data. With the dropdown you can select different trees; you can see that they are all different: some are better in one area but worse in another.
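One way such a collection of differing trees could be produced is by training each on a bootstrap sample of the data, which is what a random forest does internally. A sketch, assuming the data from above and illustrative hyperparameters:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(seed=0)
X = rng.uniform(0, 1, size=(500, 2))
y = (X[:, 1] > X[:, 0]).astype(int)

# Each tree sees a different bootstrap sample, so each learns a
# different staircase approximation of the diagonal boundary.
for i in range(5):
    idx = rng.integers(0, len(X), size=len(X))  # bootstrap sample
    tree = DecisionTreeClassifier(max_depth=3, random_state=i)
    tree.fit(X[idx], y[idx])
    print(f"tree {i}: accuracy on full data = {tree.score(X, y):.3f}")
```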
On the right side you see the result of combining all the decision trees into one random forest model. Their individual weaknesses average out, and the final model has much better overall performance.
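To make the comparison concrete, here is a sketch that fits a single shallow tree and a random forest on the same data and compares their held-out accuracy (hyperparameters are assumptions, not the dashboard's actual settings):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(seed=0)
X = rng.uniform(0, 1, size=(500, 2))
y = (X[:, 1] > X[:, 0]).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single shallow tree versus an ensemble of 100 such trees.
single = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
forest = RandomForestClassifier(n_estimators=100, max_depth=3, random_state=0).fit(X_train, y_train)

print(f"single tree test accuracy:   {single.score(X_test, y_test):.3f}")
print(f"random forest test accuracy: {forest.score(X_test, y_test):.3f}")
```

Because each tree errs in a different region, averaging their votes cancels much of the individual error, which is why the forest's decision boundary hugs the diagonal more closely.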