Financial Modelling | Excel Hints & Tips
Updated: Jan 29
Whatever type of Excel model you are building, utilising the below tips will help you build more robust, efficient & accurate models.
Do not Embed Hard Coded Numbers within Formulas
Hard coded numbers should never be embedded within formulas, for two reasons:
It makes it very hard to update the model unless you are very familiar with it, as you would have to know exactly where the embedded figures were to update them (which would not be easy!)
There is no context as to what the embedded number represents. So, for example, "=A1 * 5000". What does the 5000 represent?
Always make sure that your hard coded numbers / inputs are separated into their own section (or tab) and then reference those cells within your formulas. This enables a clean and consistent approach to understanding and updating the inputs for future users.
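As a simple sketch (the cell and tab references here are purely illustrative), instead of:

=A1 * 5000

hold the 5000 in a labelled cell on an inputs tab (say Inputs!B2, labelled with what it represents) and write:

=A1 * Inputs!$B$2

The assumption is now visible, labelled and updatable in one place.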
This leads on nicely to the next section:
Separate Inputs, Calculations & Outputs
Financial models should have separate areas for inputs, calculations & outputs:
Inputs - Following on from the first point on hard coded numbers, containing all inputs in one area allows users of the model to easily see what assumptions and inputs are feeding into the model and also allows the user to update them in one place. This not only makes the model more robust, but also user friendly, and allows the input cells to then be referenced throughout the model.
Calculations - Containing calculations within the same area allows users who wish to examine the detail to easily see what components feed into the calculations, and also allows the output tab to easily reference the figures required whilst keeping the detail in the background.
Outputs - The output should be focussed on providing the audience with only the numbers and visuals required. Separating inputs & calculations allows only the essential data to be presented, whilst still allowing users of the model to easily understand where the output figures come from and how they are made up (as they will be linked to the calculation tab).
Ensure Inputs/Assumptions are Clearly Defined
Extending on the above advice, inputs should contain sufficient detail to understand exactly what they are and where they come from. This enables a more efficient update process and adds to the robustness of a model, as it removes any uncertainty or potential ambiguity.
Avoid a 'Daisy-Chain' Effect with Formulas
Wherever possible formulas should refer to the original source. For example, if you need to link to a calculated figure in five different places then make sure all of those formulas point back to the same original calculated cell and that they are not linking to a link of a link (of a link!)
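For example (cell references illustrative), if the calculated figure lives in Calculations!D10, each of the five formulas should reference that cell directly:

=Calculations!$D$10

rather than one formula referencing the calculation and the other four chaining off each other. If the source cell moves or changes, a direct reference updates (or breaks) in one visible place, not somewhere down a chain.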
Use VLookup / HLookup Excel Functions with Caution
Widely loved by many Excel enthusiasts and undeniably flexible, the VLookup & HLookup functions have a coveted role in many Excel users' toolboxes. Although there's no denying the usefulness of these functions, they also come with a serious health warning when used within financial models.
To touch on this briefly (as the subject warrants its own topic, which I have added here), apart from some of the limitations of the functions, they also pose various risks, such as returning the nearest match by default (as opposed to an exact match), and something as simple as a user inserting a row or column can make an entire financial model incorrect.
Although there are steps that can be taken to prevent the above, I believe the risks far outweigh the benefits and there are much better alternatives that can be used (again, I'll be creating a post soon that will go into this and the above in greater detail).
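To illustrate the exact-match risk (the table and references are illustrative): by default,

=VLOOKUP(A2, Data!$A$2:$C$100, 3)

performs an approximate match, so a missing value can silently return the nearest figure rather than an error. Adding FALSE as the fourth argument forces an exact match (and returns #N/A if the value isn't found):

=VLOOKUP(A2, Data!$A$2:$C$100, 3, FALSE)

Even then, the hard coded column number (3) will silently point at the wrong column if someone inserts a column into the table, which is one reason alternatives such as INDEX/MATCH are often preferred.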
Use 'Volatile' Functions Sparingly
Volatile functions in Microsoft Excel are those that automatically recalculate every time Excel recalculates.
Whilst this sounds harmless enough, in reality it can actually create issues such as making a model so slow you have enough time to make a cup of tea between each key press or mouse click (well, probably not quite that long, but you get the idea!)
The issue lies in the way Excel treats the cells it recalculates: under normal circumstances Excel will only recalculate a cell if it can see that there have been changes in the precedent cells that its formula uses. This means that at any one point, only those cells that need to recalculate do so, which results in a fast and efficient calculation process.
However, volatile functions such as NOW(), TODAY(), INDIRECT(), CELL(), OFFSET(), RAND(), RANDBETWEEN() etc. calculate every time a recalculation occurs, which means if you have a lot of these functions, or data linking to them, then this is almost guaranteed to bring the speed of your model to a grinding halt.
Avoid volatile functions as much as possible. If you do need to use something such as 'TODAY()' then use it once and refer to it in every calculation.
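For example (cell reference illustrative), enter =TODAY() once in a dedicated input cell, say Inputs!B3, and then base every date calculation on that cell:

=Inputs!$B$3 - A2

That way only one volatile cell exists in the workbook, rather than one per formula.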
On a related note, simple things like inserting/deleting/hiding/unhiding a row, using auto-filter or renaming a tab causes a re-calculation. There are many other examples, but the majority of slow workbooks and models I've ever seen are caused by volatile functions, so treat them with caution!
Manual Calculation - Use with Caution
This may be a slightly contentious point, as I understand users have different reasons for using this; however, following on from the above, I would say that manual calculation mode should very rarely be used.
If a model is structurally sound and does not include a mass of volatile functions then I rarely see the need to turn on manual calculation. In fact, turning off automatic calculations can often pose a greater risk when figures are being used or presented after updating inputs or data, only to later realise that Excel has not re-calculated (and therefore updated) the figures you were using.
I appreciate there are exceptions to the rule, but I'd strongly recommend only using manual calculation if you have no other option to do so and it is not due to the structural build or performance of the Excel model.
Named Ranges - To Use or Not To Use
Another contentious point in the Financial modelling world is the use of named ranges. I personally think they can be very valuable (and almost essential if ever used in combination with VBA), however, I do appreciate the arguments against using them.
I have written a separate article on Named Ranges that goes into this in more detail, but used correctly they can help ensure the correct structure is used, make the model and formulas more readable and also help prevent errors. (You can read more on named ranges here)
Of course, for the above to work there must be consideration for other users of the model, such as naming conventions and detailing what the ranges do if it's not blindingly obvious from the name (there are options within Excel to assist with that if needed), but my view is that, used correctly, Named Ranges can certainly be a useful addition to your modelling toolbox.
Formatting - Presentation is Important
The above statement may sound obvious, but it's surprising how often models are produced without taking this into account!
If the first thing that the audience of the model sees is garish colours, sloppy formatting and/or a general lack of 'polishing', this will suggest that the same lack of care has been taken with the numbers and modelling, and that isn't going to help the confidence of the decision makers! There's nothing worse than seeing a messy Excel workbook/model, so ensuring it's tidy and refined should be a key focus.
Take the time to format a model correctly, use consistent neutral fonts & colours and a limited colour scheme that remains consistent throughout the model. Turn off gridlines, create a title/summary page or index where needed. Generally, just ensure that the end presented view represents the effort you have taken when building the model.
It's also vital that equal care is taken with the workings/calculations within the model. They may not be the main focus for most users, but for some they will be and you want to project the same confidence in all areas of your modelling work.
Keep Version Control
It is important to keep a document detailing each version of the model and what changes were made (and when). There are numerous reasons why you might need to use a previous model, or even revert to one before certain changes were made, so it's prudent to keep a version control document and an archive of associated versions of the model that this document refers to.
Protect Your Model
It is important to keep your model working the way it was intended and to protect the integrity of the data.
To do this look at Excel's protection options, which allow you to apply protection such as restricting user input to specific cells, or preventing the user from deleting rows or columns etc.
There is also the option of password protecting these measures, although if this is done then a process must be in place to ensure that the password can be recovered by another user if the need were to arise and that you are not the only person with the password. Losing/forgetting a password can cause serious headaches!
Some users prefer to not take these measures at all and simply put instructions/notes warning the users of the model not to take certain actions (such as amending formulas etc.), although if the protection functionality exists in Excel then why not utilise it?
Document Scenarios Modelled
If you need to use scenarios, then as these are ultimately variable inputs, take the same approach and ensure that they are clearly labelled and each scenario is defined.
Whether you use Excel's Scenario Manager or not is a personal choice, but it does provide you with a built in way of being able to store details of what each scenario means, as well as obviously storing the variable inputs which then enables you to easily toggle between them.
However, Excel's Scenario Manager is by no means necessary, and many users prefer to take an alternative approach via data validation drop-downs and lookup methods, which gives a less restrictive result.
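As a sketch of that alternative approach (names and references are illustrative): store the variable inputs for each scenario in columns, put a data validation drop-down of scenario names in, say, B1, and pull the active scenario's values through with a lookup such as:

=INDEX(C5:E5, MATCH($B$1, $C$3:$E$3, 0))

where row 3 holds the scenario names and rows 5 onwards hold the corresponding inputs. Changing the drop-down then toggles the whole model between scenarios.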
Check for Errors (and then Check Again!)
Whatever type of Excel modelling you are undertaking, the end result will be the same when it comes to the importance of error checking and stress testing. The output of a model will often be relied upon to make strategic decisions, and it is therefore paramount that the data can be relied upon to support those decisions.
You may find it useful to build a separate error check page, which cross references figures from across your model and ensures that everything is consistent (you could also extend this to include conditional formatting that flags to the user that an error has been detected and needs to be resolved before using the data).
You should always have a centralised place which will immediately give you visibility of all of the checks throughout the model, saving the need to check everything individually (and the risk of missing something!)
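A typical check (references illustrative) compares two figures that should always agree, allowing a small tolerance for rounding:

=IF(ABS(Outputs!B10 - SUM(Calcs!B2:B9)) < 0.01, "OK", "ERROR")

A master cell can then summarise all checks at once, e.g. =COUNTIF(B2:B20, "ERROR"), with conditional formatting flagging if the count is anything other than zero.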
Also, it is worth stress testing your model and entering all manner of assumptions and inputs (much smaller/larger than expected, negative or positive reversal etc.). By doing so you can then identify if these trigger errors which you would need to resolve and at the same time follow the inputs and test them through the calculation stages through to output to ensure the end result is as expected.
Unfortunately, errors happen, but in a well structured model the risk of this is reduced significantly and with error checks in place you can at least trap them when they do occur, before they cause any damage!
Whatever type of Excel modelling you are undertaking, using a consistent, structured approach makes sense from all angles and will provide benefits to all users and consumers of those models.
You can also subscribe to be notified of future content from Insightful Dynamics or visit us at www.insightfuldynamics.co.uk