
Technical report | Logic Assumptions and Risks Framework Applied to Defence Campaign Planning and Evaluation


This paper describes the Logic, Assumptions and Risks Framework (LARF), developed by DSTO to improve the analysis, transparency and decision paths that underpin planning and evaluation. The challenge was to develop an approach grounded in established theory but tailored to Defence-specific requirements, particularly campaign planning and assessment activities. The framework has been validated through a range of trials on critical campaign functions and has proven effective. It is therefore assessed as well-suited to the current and future Defence decision-making environment, and is likely to continue to be applied as part of DSTO's support in these areas.

Executive Summary

Evaluating the progress of ADF operations has recently come into focus in the Afghanistan, East Timor and Solomon Islands campaigns. In response, the Operational Planning and Evaluation Team within the DSTO Operations Support Centre (DOSC) began in mid-2010 to design and implement a repeatable process for conducting campaign-level monitoring and evaluation in support of planning. The result, developed in collaboration with the Plans Branch of Headquarters Joint Operations Command (HQJOC), was the Campaign Assessment (CA) process, which adapts techniques from methodologies in other domains to address the challenges of evaluation in complex conflict or post-crisis contexts.

The problem for analysts in Defence is that it is difficult to assess the success of a campaign if its plan does not adequately articulate the intent, scope and expected outcomes of operational activity. While all plans state objectives and goals at various levels, it is often unclear why a series of objectives is expected to lead to an overall positive outcome, or to what degree and in exactly what form a goal must be achieved to constitute 'success'. DSTO evaluation analysts consistently find that these explanations of the linkage between intent and expected outcome are missing from campaign plans.

To address these problems, the research has three aims: first, to identify priority information requirements by combining insights from practical experience in CA cycles with a literature review of planning and evaluation theories; second, to investigate whether a framework can be developed and applied to the CA process to draw out the information components identified as priorities; and third, to consider whether such a framework can be applied to the drafting of new plans and the review of existing ones, so that priority information components are clearly articulated in future plans.

The research identifies logic, assumptions and risks as the priority information requirements for enhanced campaign-level planning and evaluation. This finding is supported by a review of several dominant theories in evaluation science, cognitive science, organisational psychology and decision support engineering, namely Theory of Change, Program Evaluation and Cognitive Mapping. Each provides guidance for considering a problem scientifically, but not the techniques for applying that problem structure throughout the phases of planning and evaluation. The theoretical insights therefore require a framework for practical application, and although other agencies have developed a large variety of frameworks and techniques to apply these theories, none entirely meets the specific needs of Defence. Consequently, a bespoke framework blending the advantages of a variety of approaches is required, and the Logic Assumptions and Risks Framework (LARF) has been developed to fill this methodological gap. The LARF provides a mechanism for systematically eliciting the logic, assumptions and risks in campaign planning and evaluation, with some assurance of comprehensive consideration across all components of a plan. It then enables aggregation of those assumptions and risks, giving visibility of any primary risks common across multiple Effects so that they can be highlighted and prioritised for monitoring or prevention.
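The aggregation step described above can be sketched in code. The following is an illustrative example only, not the report's actual matrix format: the Effect names, assumption and risk wordings, and the simple count-based prioritisation are all assumptions made for the sketch.

```python
from collections import defaultdict

# Hypothetical LARF-style data: each plan Effect carries its elicited
# assumptions and risks. All names and entries below are illustrative.
effects = {
    "E1 Secure environment": {
        "assumptions": ["Host-nation forces remain cooperative"],
        "risks": ["Funding shortfall", "Loss of local consent"],
    },
    "E2 Governance capacity": {
        "assumptions": ["Ministries can absorb training"],
        "risks": ["Funding shortfall", "Key-leader turnover"],
    },
    "E3 Economic recovery": {
        "risks": ["Funding shortfall"],
    },
}

def common_risks(effects, threshold=2):
    """Return risks appearing in at least `threshold` Effects, most widespread first."""
    occurrences = defaultdict(list)
    for name, component in effects.items():
        for risk in component.get("risks", []):
            occurrences[risk].append(name)
    shared = {r: es for r, es in occurrences.items() if len(es) >= threshold}
    return sorted(shared.items(), key=lambda kv: len(kv[1]), reverse=True)

# Risks common to multiple Effects are candidates for priority monitoring.
for risk, affected in common_risks(effects):
    print(f"{risk}: affects {len(affected)} Effects -> {affected}")
```

In this sketch, "Funding shortfall" would surface as a primary risk because it appears under all three Effects; in practice the prioritisation would weigh likelihood and consequence, not just frequency.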

Another key characteristic of the LARF is the scalability of its structure, depending on what users are aiming for and the extent to which a plan or evaluation framework can be altered. Both Theory of Change and Program Theory allow either micro- or macro-theories to be drawn out and analysed. Because the LARF synthesises both, it can be used to gain insight either into how the activities and component parts of a plan are expected to work in detail (i.e. the micro-theory), or into how the sum of those components is assumed to culminate in the achievement of some form of change (i.e. the macro-theory).

The LARF was assessed through trials on four critical campaign functions in three different Defence operations, and proved to be an effective tool in each instance, with a number of unexpected advantages. To date, the LARF has been used as:

- an initial planning tool, to test the viability of objectives, draft their underlying detail, and outline how their achievement will be assessed;

- a group learning exercise for building situational awareness: the basis of a forum for exchanging ideas and knowledge among experienced and inexperienced staff, which also served as an effective team-building exercise that revealed individuals' areas of expertise, strengths and weaknesses;

- a structured format for revising an existing plan and evaluation framework;

- a tool for generating a set of risks and assumptions to be added as indicators to an existing set of measures of effectiveness.

These trials demonstrate that the LARF helps planners and evaluators better identify the relationships between actions, results, risks and opportunities, and enables planners to be more responsive and adaptive to changes in the operating environment. By making the campaign's priorities transparent, it facilitates review of the efficient use of resources and increases the likelihood of detecting and responding to obstacles. It has been found to distinguish whether design or implementation flaws explain a lack of progress towards defined objectives (or explain progress), and to assist in identifying indicators and metrics for measuring progress. Its adaptability and suitability for rapid implementation, without a significant resource liability, make the LARF well-suited to current Defence planning and evaluation requirements, and it is simple and practical enough to be adapted to a wide variety of future functions.

There remains significant scope for further development, validation and application of the LARF, particularly in the use of data elicited in the LARF matrix as a reference for tracking whether planning logic and assumptions were correct as part of post-operational evaluations. However, research to date has produced a practical and immediately applicable framework.


Key information

Authors
Alison Hickman and Rebecca Karlsson

Publication number

Publication type
Technical report

Publish Date
May 2013

Classification
Unclassified - public release

Keywords
Evaluation, Campaign Assessment, Operational Progress, Monitoring