Digital Transformation

Global spending on digital transformation is forecast to reach $4.1 trillion in 2026. An estimated 65% of those programmes fail to deliver their expected returns. The failure rate is not a technology problem. The technology works. The failure is consistently in five other areas: the strategy was defined in terms of technology adoption rather than business outcomes; the architecture was not assessed for readiness before the programme started; data governance was treated as an afterthought rather than a foundation; change management was underfunded and underprioritised; and return on investment was never defined precisely enough to be measured. Organisations that avoid these five failures succeed at rates the industry average does not predict.

Digital transformation is not a technology programme. It is a business change programme that uses technology as its primary mechanism. The distinction matters in every decision the programme makes: scope, sequencing, investment, measurement, and the allocation of accountability. An organisation that commissions a digital transformation programme and delegates it entirely to the IT function has already made the decision that is most likely to put it in the 65%.

This service produces a transformation strategy, architecture readiness assessment, data governance framework, ROI measurement framework, and change management programme — not the implementation of any technology. Technology implementation is executed by your team, existing vendors, or specialist implementation partners from the strategy and specifications we produce. The separation keeps the strategy analytically independent of any vendor’s commercial interest in a specific technology outcome.

Price Range
£28,000 – £200,000+
Strategy, architecture readiness, data governance, ROI framework, change management design. Technology implementation is separate and additional.
Duration
10 weeks – 9 months
Strategy phase only. Programme execution — typically 18 months to 4 years — is outside the scope of this engagement.
Scope boundary
Strategy, frameworks, architecture specifications, and change management design. We do not implement technology, manage vendors, or run the programme. We design the programme your organisation runs.
Prerequisite
Active executive sponsorship at C-suite level. Without named leadership accountability, a transformation strategy produces documents, not outcomes.
Contract
Fixed-price strategy phase. 50% on signing, 50% on delivery acceptance. Execution oversight (if required) priced separately as day-rate advisory.
The strategy is not the transformation
This engagement produces the strategy, architecture design, and frameworks. The transformation itself — changing processes, implementing technology, retraining staff, retiring legacy systems — takes years and requires sustained executive commitment, operational resources, and change management discipline that no consulting engagement can substitute for. Organisations that commission a transformation strategy expecting the strategy document to produce the transformation have misunderstood what they are buying.

Five specific failure modes. Each one documented in published research. None of them technology failures.

McKinsey, Gartner, BCG, and Harvard Business Review have each independently documented the same five causes of digital transformation failure. The consistency across sources is striking — and the consistent finding that the failures are not technology failures is the reason that technology-focused approaches to digital transformation strategy consistently under-deliver. Understanding each failure mode specifically is the prerequisite for avoiding it. General awareness that “transformation is hard” is not.

01
Strategy defined as technology adoption, not business outcome
The programme is defined as “migrate to cloud,” “implement ERP,” “deploy AI,” or “become data-driven.” These are means, not ends. A strategy that defines success as the adoption of a technology has no mechanism for evaluating whether the technology adoption produced business value — because business value was never defined. The programme completes, the technology is deployed, and no one can say whether it succeeded, because no one agreed on what success was before it started.
What the research shows
McKinsey: organisations that define transformation objectives in terms of specific business outcomes — revenue growth, cost reduction, customer retention, operational efficiency — deliver 3× the return of organisations that define them in terms of technology deployment. The measurement difference is the only structural difference between the two groups; the technology deployed is often identical.
How to prevent it
Every transformation initiative defined as: the specific business metric it is intended to move, the baseline value of that metric, the target value, the timeline for achieving it, and the mechanism by which the technology investment produces the metric improvement. If this cannot be stated before the initiative begins, it should not begin. This is the ROI framework component of the strategy — and it is the component most frequently absent from transformation strategies we are asked to review.
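The admissibility test above can be sketched as a data structure: an initiative that cannot populate every field should not be approved. All names and figures below are illustrative, not a prescribed tool.

```python
from dataclasses import dataclass

@dataclass
class InitiativeROI:
    """One transformation initiative, defined as a business outcome.

    An initiative missing any of these fields cannot be evaluated
    and, per the framework above, should not begin.
    """
    name: str
    metric: str              # the specific business metric it moves
    baseline: float          # measured value before any investment
    target: float            # committed target value
    timeline_months: int     # horizon for achieving the target
    mechanism: str           # how the technology produces the improvement
    investment: float        # total cost of the initiative
    annual_return: float     # expected annual benefit once delivered

    def payback_period_years(self) -> float:
        """Simple payback: investment divided by annual benefit."""
        return self.investment / self.annual_return

    def is_admissible(self) -> bool:
        """Reject initiatives with no metric, no stated mechanism,
        or a target identical to the baseline."""
        return bool(self.metric and self.mechanism) and self.target != self.baseline


# Hypothetical initiative, for illustration only
erp = InitiativeROI(
    name="Order-to-cash automation",
    metric="order processing cost per order (GBP)",
    baseline=14.20, target=6.50, timeline_months=18,
    mechanism="automated order capture removes manual re-keying",
    investment=450_000, annual_return=300_000,
)
print(erp.payback_period_years())  # 1.5
print(erp.is_admissible())         # True
```

The point of the sketch is the shape of the record, not the arithmetic: a board that insists every initiative arrives in this form has, by construction, defined success before the programme starts.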
02
Architecture readiness not assessed before programme starts
The transformation programme begins before the organisation’s current architecture is understood well enough to know what the transformation requires. Systems that must be replaced before other systems can be modernised are identified mid-programme, adding scope and timeline that were not in the business case. Data quality problems that block the programme’s analytics ambitions are discovered after the analytics platform is procured, not before. Integration complexity is underestimated because the integration architecture was never mapped. The programme is in discovery mode throughout its first year.
What the research shows
BCG: programmes that conduct a structured architecture readiness assessment before commencing transformation have 40% lower cost overruns and 35% higher on-time delivery. The cost of the architecture assessment is recovered in the first year through avoided scope surprises alone. The most expensive architecture surprises — legacy system interdependencies, data quality gaps, integration constraints — are all knowable before the programme starts if the right questions are asked systematically.
How to prevent it
Architecture readiness assessment conducted as Phase 1 of any transformation programme: current-state estate inventory, system interdependency mapping, data quality assessment, integration architecture documentation, and identification of blocking dependencies that must be resolved before the transformation can proceed. This assessment takes 4–8 weeks. It saves months of mid-programme discovery and the budget overruns that accompany it.
03
Data governance treated as an afterthought
Digital transformation produces data at scale. The transformation’s analytics, AI, automation, and decision-support objectives all depend on data that is accurate, consistent, accessible, governed, and trusted by the people who use it. In organisations without a data governance foundation, none of these properties hold. Data is duplicated across systems with different values for the same field. No one owns the definition of “customer” or “revenue” or “incident” consistently across the organisation. Data quality is a persistent blocker rather than a solved problem. The transformation platform is built; the data it depends on is not fit for purpose.
What the research shows
Gartner: poor data quality costs organisations an average of $12.9 million per year in direct costs. Data quality problems are the most frequently cited cause of AI and analytics project failure — cited ahead of technology limitations, skills gaps, and budget constraints. Organisations that invest in data governance before deploying analytics or AI platforms have 2.5× higher return on those platform investments, measured at 24 months post-deployment.
How to prevent it
Data governance framework designed and implemented before the transformation platform is deployed — not in parallel, not after. The framework covers: data ownership (named individuals accountable for data quality per domain), data definitions (agreed definitions for all key business entities), data quality standards (measurable thresholds for accuracy, completeness, consistency), and data lineage (documented path from source to consumption for all key data products). The governance framework is a prerequisite for the platform, not a feature of it.
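The "measurable thresholds" point above is worth making concrete: a data quality standard is only a standard if a machine can test it. A minimal sketch, with illustrative thresholds and a hypothetical customer domain:

```python
# Agreed, measurable thresholds per quality dimension (illustrative values).
thresholds = {"accuracy": 0.98, "completeness": 0.95, "consistency": 0.97}

def quality_gate(measured: dict[str, float]) -> list[str]:
    """Return the dimensions that fail their threshold.

    An empty list means the domain meets its agreed standard;
    a missing measurement counts as a failure, not a pass.
    """
    return [dim for dim, floor in thresholds.items()
            if measured.get(dim, 0.0) < floor]

# Hypothetical measured scores for the customer domain
customer_domain = {"accuracy": 0.991, "completeness": 0.93, "consistency": 0.98}
print(quality_gate(customer_domain))  # ['completeness']
```

A gate like this, run on a schedule per domain with a named steward accountable for each failure, is the difference between a data quality "standard" and a data quality aspiration.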
04
Change management underfunded and underprioritised
Technology adoption requires people to change how they work. This is the hardest part of any transformation, and it is consistently the most underfunded part of the programme budget. The typical budget allocation: 80–90% to technology, 10–20% to change management. The typical cause of failure: technology deployed successfully, adoption at 20% of target users after 12 months. The technology works; the organisation does not use it. The value case assumed 80% adoption within 6 months. The actual adoption curve, without a structured change management investment, is slower, lower, and more resistant to intervention the longer it is left without one.
What the research shows
Prosci research across 2,000+ transformation programmes: programmes with excellent change management are 6× more likely to meet their objectives than those with poor change management. The cost of poor change management — rework, workaround processes, parallel system operation, retraining, loss of productivity during transition — consistently exceeds the cost of the change management investment that would have prevented it. Budget allocation that reflects this should be closer to 70/30 (technology/change) for most transformation programmes.
How to prevent it
Change management programme designed before the technology programme begins: stakeholder analysis (who is affected, how, what their concerns are), communication strategy (what is communicated, when, by whom, in which channels), training programme (skills required per role, training design, delivery timeline, adoption measurement), resistance management plan (anticipated resistance points and response protocols), and adoption metrics defined before deployment so they can be measured from day one. The change management programme is a parallel workstream, not a phase that begins when the technology is ready to deploy.
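The adoption-metrics point above can be sketched as a target curve with a tolerance band: adoption is measured against targets defined before deployment, and a breach triggers a programme response rather than a retrospective post-mortem. Figures are illustrative.

```python
# Monthly adoption targets, agreed before go-live (months 1-6, illustrative).
target_curve = [0.10, 0.25, 0.40, 0.55, 0.70, 0.80]
measured = [0.09, 0.22, 0.28]  # actuals for months 1-3
TOLERANCE = 0.05               # shortfall beyond this triggers a response

def adoption_triggers(measured, targets, tolerance=TOLERANCE):
    """Return (month, shortfall) pairs where actual adoption falls
    short of target by more than the agreed tolerance."""
    return [(month + 1, round(target - actual, 2))
            for month, (actual, target) in enumerate(zip(measured, targets))
            if target - actual > tolerance]

print(adoption_triggers(measured, target_curve))  # [(3, 0.12)]
```

The value is not in the arithmetic but in the timing: the month-3 shortfall is visible at month 3, while the change budget still exists, rather than at month 12, when it does not.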

One for each failure mode. Each one addresses a specific structural cause of transformation failure.

The five programme components below correspond directly to the five failure modes above. Every transformation strategy engagement produces all five components — not because a comprehensive document suite is the objective, but because missing any one of the five produces a transformation programme with a structural gap that the research consistently links to failure. Clients who want only a subset of the components should read the failure mode documentation for the components they want to exclude before making that decision.

Component 1 — Addresses Failure Mode 01
Transformation Strategy & ROI Framework
Strategy defined in terms of business outcomes, not technology adoption. Each initiative in the strategy maps to a specific metric, a baseline value, a target value, a timeline, and an attribution methodology. The board sponsor signs off on the ROI framework before any implementation investment is approved. The strategy includes explicit decision gates — points at which the programme’s progress against its ROI commitments is reviewed and continuation is evaluated, not assumed.
What this component delivers
  • Business outcome taxonomy: every transformation initiative categorised by the specific business metric it is designed to improve
  • Baseline measurement report: current values for all target metrics, measured before any transformation investment begins
  • ROI model per initiative: quantified expected return, investment required, payback period, confidence interval
  • Decision gate framework: specific milestones, measurement dates, acceptable ranges, and decision authority for continuation, modification, or cessation
  • Board sign-off pack: ROI framework, portfolio prioritisation rationale, investment approval request in business terms
What makes this hard
Defining ROI for transformation initiatives requires quantifying things that organisations habitually leave qualitative — “customer experience,” “operational efficiency,” “staff productivity.” This quantification is resisted because it creates accountability. An initiative with a defined ROI target can fail against it. An initiative described as “improving the customer experience” cannot be measured and therefore cannot be held accountable. The ROI framework creates accountability that some stakeholders will prefer to avoid.
Component 2 — Addresses Failure Mode 02
Architecture Readiness Assessment
Current-state estate mapped to identify what the transformation requires of the architecture, what the architecture can currently support, and what the gaps are. Blocking dependencies identified and sequenced: the systems that must be modernised or replaced before the transformation’s primary initiatives can proceed. Integration architecture documented. Technical debt quantified in terms of its transformation impact — not as a general statement that technical debt is a problem, but as a specific list of the systems, integrations, and data quality issues that will create programme delays and cost overruns if not addressed in the correct sequence.
What this component delivers
  • Current-state architecture inventory: all systems, their age, their support status, and their transformation relevance
  • System interdependency map: which systems depend on which others, and which are blocking dependencies for transformation initiatives
  • Data quality assessment: current data quality scores per domain, gaps that will block analytics or AI initiatives, remediation requirements
  • Integration architecture documentation: current integrations, planned integration requirements, gaps between them
  • Technical debt quantification: specific systems and the cost/timeline impact of their transformation constraints
  • Sequencing recommendation: the order in which blocking dependencies must be addressed before transformation initiatives can proceed
What makes this hard
Architecture readiness assessments consistently reveal that the current estate is less ready than anyone believed. Legacy systems are older, more interdependent, and carrying more technical debt than the IT function has reported to the board. Data quality is worse than the business teams have acknowledged. The findings create discomfort because they extend timelines and increase programme costs. They are, however, knowable before the programme starts — and they are more manageable before the programme starts than after £5M has been committed to initiatives that the architecture cannot support.
Component 3 — Addresses Failure Mode 03
Data Governance Framework
The data governance foundation that every transformation’s analytics, AI, and automation components depend on. Designed as a prerequisite for the transformation platform, not a concurrent or subsequent workstream. The framework covers four domains: data ownership (named accountable individuals per data domain), data definitions (agreed definitions for all key business entities — customer, product, transaction, incident), data quality standards (measurable thresholds with monitoring), and data lineage (documented and tooling-supported tracking of data from source to consumption). Without this framework in place before the platform is deployed, the platform’s outputs will not be trusted by the business users it is designed to serve.
What this component delivers
  • Data domain inventory: all key data domains, their current owners (formal and informal), and their quality status
  • Data ownership model: named data stewards per domain, decision rights, escalation paths, accountability structure
  • Business glossary: agreed definitions for all key business entities, approved by the data stewards for each domain
  • Data quality framework: measurable standards per domain, measurement methodology, acceptable thresholds, monitoring tooling specification
  • Data lineage design: tooling requirements and implementation specification for lineage tracking
  • Data governance operating model: meeting cadence, decision-making processes, issue resolution, policy review cycle
  • UK GDPR data governance alignment: where the data governance framework intersects with data protection obligations, the design satisfies both
What makes this hard
Data governance requires naming specific people as accountable for data quality in domains they do not currently own formally. Business unit leaders who do not currently feel responsible for data quality will resist the assignment of ownership. The governance framework requires resolution of definitional disputes — “what is a customer?” — that have existed for years without resolution because no one wanted to resolve them. These disputes are political as much as technical. The governance framework forces resolution, which is why it is resisted.
Component 4 — Addresses Failure Mode 04
Change Management Programme Design
Change management designed as a programme component equal in priority to the technology programme, with its own budget, timeline, resources, and accountability. Not a communications plan attached to the technology rollout. Not a training programme delivered in the final month before go-live. A structured programme that runs in parallel with the technology programme from strategy phase through to adoption measurement. It covers the full scope of human change required: the people who will use the new technology, the managers who will lead their teams through the transition, and the organisation’s capacity to absorb the rate of change the programme intends to impose.
What this component delivers
  • Stakeholder analysis: all affected stakeholder groups mapped by impact, influence, current attitude, and change required
  • Change readiness assessment: organisation’s current capacity to absorb change — staff, management, culture, and system constraints
  • Communication strategy: what is communicated, when, by whom, in which channels, and how understanding is verified
  • Training programme design: skills required per role, training approach (classroom, e-learning, on-the-job), delivery timeline, who delivers, adoption measurement approach
  • Resistance management plan: anticipated resistance points, designated response protocols, escalation paths
  • Adoption metrics framework: how adoption is measured, at what frequency, against what targets, and what triggers a programme response
  • Change management budget specification: recommended investment in change resources as a proportion of total programme investment, with justification
What makes this hard
Change management budget is the first item cut when programme budgets are under pressure, because its ROI is indirect and its failure mode is slow — the technology works, adoption just never reaches target. By the time the adoption failure is visible, the change management budget has been redirected, the people who should have been running the change programme are not there, and the cost of achieving adoption retrospectively is higher than proactive investment would have been. We will include a change management budget recommendation in our strategy. We will also document the adoption risk if that recommendation is not followed.

Three tiers. All five components in every tier. The tiers differ in depth and scope.

All three tiers produce all five programme components. The difference between tiers is the depth of the analysis, the size of the organisation in scope, the number of transformation initiatives assessed, and the degree of engagement with the organisation’s leadership during the strategy process. Smaller organisations require less scope but the same structural rigour. A 50-person organisation with one poorly defined transformation initiative and no data governance has the same five structural failure risks as a 5,000-person organisation. The tier determines the scale, not the integrity, of the analysis.

Focused Strategy
Focused Transformation Strategy
For organisations up to 300 staff with a focused transformation scope: up to 5 transformation initiatives, a single primary business unit, and one primary technology domain (e.g. one ERP modernisation, or one analytics capability, or one process automation programme). Typical: SMEs, single-service NHS providers, independent schools implementing a specific digital capability, small charities undertaking a specific digital change. Not appropriate for organisation-wide transformation spanning multiple business units and multiple technology domains.
£28,000
Fixed · VAT excl.
10 weeks
Assumes full access and engagement as specified. Timeline extends if leadership workshops cannot be scheduled within the programme window.
Strategy & ROI
Up to 5 transformation initiatives assessed and defined as business outcomes
Baseline metrics measured for all target outcomes before strategy is finalised
ROI model per initiative: investment, expected return, payback period
Decision gate framework: 2 review points with defined continuation criteria
Board sign-off pack with investment approval request
Portfolio prioritisation across multiple business units
Architecture, Data & Change
Architecture readiness assessment: current estate, blocking dependencies, data quality per domain for up to 3 key data domains
Data governance framework for the primary data domains required by the transformation initiatives
Data ownership assignment and business glossary for assessed domains
Stakeholder analysis and change readiness assessment for the primary affected group
Communication strategy and training programme design for the primary transformation initiative
Adoption metrics framework for all 5 initiatives
Roadmap & Programme
Sequenced transformation roadmap for all 5 initiatives across up to 3 phases
Resource model: people, budget, and vendor requirements per phase
Risk register with mitigation design
Programme governance design for a focused programme
Vendor selection framework for the primary technology domain
30-day post-delivery advisory support (email)
Multi-unit programme governance (Professional tier)
Industry 4.0 or OT-specific strategy (Enterprise tier)
Timeline — 10 Weeks
Wk 1–2
Current State Assessment
Estate inventory, data quality baseline, architecture dependencies. Current initiative definitions reviewed.
Current initiatives may not have defined ROI. Documenting the absence of ROI definition is part of the output — not a reason to delay the assessment.
Wk 3–4
Initiative Definition & ROI
Each initiative defined as a business outcome. Baseline metrics measured. ROI model per initiative.
Baseline metric measurement requires data access that is sometimes resisted. The baseline is only useful if it is accurate — estimated baselines produce unreliable ROI assessments.
Wk 5–6
Data Governance & Architecture
Data governance framework for key domains. Architecture readiness gaps documented. Blocking dependencies sequenced.
Data ownership assignment requires business unit engagement. If relevant business unit heads are not available in this window, data ownership design is incomplete.
Wk 7–8
Change Management Design
Stakeholder analysis, communication strategy, training programme design, adoption metrics.
Change management design requires HR and operations involvement. If these functions are not engaged in this window, the change programme will not reflect operational constraints.
Wk 9–10
Roadmap & Board Pack
Sequenced roadmap. Resource model. Risk register. Board sign-off pack. Decision gate framework.
Board sign-off requires leadership availability. If the board meeting cycle is monthly, plan the delivery timing to allow the board pack to be submitted to a scheduled meeting rather than requiring a special session.
What Your Team Must Provide
Executive sponsor: 6–8 hours across the programme for initiative definition workshops, ROI review, and board pack review
Business unit leads for affected areas: 3–4 hours each for stakeholder analysis and change readiness input (weeks 7–8)
IT lead: 8–10 hours across weeks 1–2 for current state assessment
Finance: access to cost and revenue data needed to establish baseline metrics and ROI modelling inputs
Data access: the systems and data quality reports needed to assess data readiness for the target initiatives
HR or operations: 2–3 hours in weeks 7–8 for change management design input
What Is Not in This Engagement
Technology implementation — all technology selected and implemented separately from this strategy
Vendor selection execution — framework provided; RFP process and vendor negotiations are client-led
More than 5 transformation initiatives: scope addition at £3,500 per additional initiative
More than 3 data governance domains: scope addition at £2,500 per additional domain
Execution oversight and advisory support beyond 30 days: available at £1,400/day
AI systems engineering for specific AI initiatives: separate engagement (see AI Systems Engineering)
Professional Strategy
Professional Transformation Strategy
For organisations of 300–3,000 staff undertaking a multi-unit transformation spanning multiple technology domains, multiple business functions, and requiring a portfolio-level prioritisation and governance framework. Typical: NHS organisations undertaking digital transformation programmes, financial services firms in operational transformation, manufacturing organisations implementing Industry 4.0 programmes, universities undertaking student experience or research infrastructure transformation, local authorities in service digitalisation. This tier produces a full programme architecture, not a strategy document.
£85,000
Fixed · VAT excl.
20 weeks baseline
Multi-unit stakeholder engagement and board approval cycles commonly extend this to 24–28 weeks in large organisations.
Strategy & ROI
Up to 20 transformation initiatives assessed across all business units
Portfolio prioritisation: initiatives ranked by ROI, strategic alignment, dependency, and risk
Baseline metrics measured for all target outcomes across all business units
ROI model per initiative with 3-year projection and sensitivity analysis
Decision gate framework: quarterly review cycle with defined continuation criteria per initiative
Board and executive team sign-off pack with investment portfolio request
Total programme investment model: capex, opex, people cost, and change cost over 3 years
Architecture, Data & Change
Architecture readiness assessment across all business units and all technology domains in scope
Data governance framework for all key data domains: typically 8–15 domains in a mid-market organisation
Data quality assessment and remediation roadmap per domain
Data platform architecture specification for the transformation’s analytics and AI requirements
Full stakeholder analysis across all affected business units
Organisation change capacity assessment: how much change the organisation can absorb per quarter, and the implications for sequencing
Change management programme design for all transformation initiatives
Training needs analysis across all affected roles and functions
Roadmap & Programme
Sequenced transformation roadmap: up to 5 phases across 3 years, accounting for all dependencies and capacity constraints
Resource model: FTE requirements, budget allocation, and vendor/partner sourcing plan per phase
Programme governance design: steering committee, initiative owners, reporting, escalation, decision rights
Programme office design: what a transformation PMO needs to operate this programme
Vendor selection framework for all technology domains in scope
Risk register at programme and initiative level
60-day post-delivery advisory support: email plus 3 × scheduled video calls
Year-1 strategy review included (at month 12)
Timeline — 20 Weeks Baseline (expect 24–28 weeks in large organisations)
Wk 1–3
Current State Assessment
Estate inventory across all business units. Data quality baseline for all domains. Architecture mapping. Current initiative inventory and status.
Multi-unit current state assessment requires coordinated access across all units. Designate a single internal coordinator before week 1 — without one, access provisioning delays add 2–3 weeks.
Wk 4–7
Initiative Definition & Portfolio ROI
All 20 initiatives defined as business outcomes. Baseline metrics measured. ROI models. Portfolio prioritisation workshop with leadership.
Portfolio prioritisation workshops are the most contentious phase. Business unit heads competing for initiative priority require a neutral facilitation approach and executive authority to make final prioritisation decisions. Without a named executive with authority to resolve prioritisation disputes, this phase extends indefinitely.
Wk 8–11
Data Governance & Architecture
Data governance framework for all domains. Architecture readiness gap analysis. Data platform specification.
Data ownership assignment across 8–15 domains requires engagement from all business unit heads. This is the phase most likely to be delayed by scheduling difficulties — expect 1–2 weeks of buffer beyond the baseline.
Wk 12–15
Change Management Programme
Full stakeholder analysis. Change capacity assessment. Change management programme design for all initiatives. Training needs analysis.
Change capacity assessment requires honest input about the organisation’s current change load — other programmes running, staff turnover, operational pressure. This input is sometimes withheld because it implies the programme should be slower than desired.
Wk 16–19
Roadmap & Programme Architecture
Full 3-year sequenced roadmap. Resource model. Programme governance design. PMO design. Vendor selection frameworks.
Roadmap sequencing may produce a timeline longer than the organisation expected or wanted. A realistic roadmap is more valuable than an optimistic one — the cost of mid-programme scope extension and timeline renegotiation is consistently higher than the cost of a slower but achievable initial plan.
Wk 20
Board Approval & Handover
Board sign-off pack. Investment approval request. Handover to programme leadership. Execution advisory begins.
Board approval cycle: in organisations with monthly board meetings, time the final delivery so the board pack is ready for the next scheduled meeting. Do not plan for a special board session — they are harder to schedule than anticipated.
What Your Team Must Provide
Named executive sponsor with budget authority and cross-unit authority: this person chairs portfolio prioritisation workshops and resolves cross-unit prioritisation disputes
Dedicated internal programme coordinator: not required to be technical, required to have scheduling authority across all units and functions
All business unit heads: available for current state interviews (2 hours each, weeks 1–3) and portfolio prioritisation workshops (half day, weeks 6–7)
Finance: cost and revenue data for baseline metric measurement and ROI modelling (weeks 4–7)
HR and operations: change capacity assessment and training needs input (weeks 12–15)
IT: full current state access across all systems and domains (weeks 1–3 and 8–11)
What Is Not in This Engagement
Technology implementation: all technology separately procured and implemented from our vendor selection framework and specifications
Vendor selection execution: RFP management, contract negotiation, and vendor management are client responsibilities
More than 20 initiatives: scope addition at £3,500 per initiative above ceiling
AI systems engineering for specific AI initiatives: separate engagement (see AI Systems Engineering)
Cloud migration strategy: separate engagement (see Cloud Migration Strategy)
Execution oversight beyond 60 days: £1,400/day advisory retainer
Year-2+ strategy review: £12,000/year
Enterprise Strategy
Enterprise Transformation Strategy
For organisations above 3,000 staff, multi-jurisdiction operations, Industry 4.0 or OT transformation scope, post-merger integration transformation, or organisations undertaking whole-enterprise transformation with £50M+ programme investment. Also appropriate for organisations with a stalled or failed transformation that requires strategic reset before recommencing. All enterprise engagements individually scoped. The price below is a starting point — actual scope determines actual price. Minimum engagement: £140,000.
From £140,000
Individually scoped · fixed · VAT excl.
From 6 months
Enterprise transformation strategy programmes with multi-jurisdiction scope or failed programme rescue commonly run 8–12 months before implementation can recommence.
What Enterprise Adds
No ceiling on initiatives, data domains, or business units in scope
Industry 4.0 strategy: OT/IT convergence, smart manufacturing, predictive maintenance, digital twin — with the architecture constraints of industrial environments as a first-class design input, not an afterthought
Post-merger integration transformation: where the transformation programme encompasses the integration of an acquired organisation’s data, systems, processes, and people
Multi-jurisdiction strategy: different regulatory environments, data residency requirements, and operating model constraints across geographies
Failed programme rescue: assessment of why the existing programme failed, what must change before it can succeed, and whether any in-flight work can be salvaged
£50M+ programme business case: investment committee-grade financial modelling with scenario analysis
Why Enterprise Takes Longer
Portfolio prioritisation across 50+ initiatives with competing C-suite sponsors requires facilitated decision-making across multiple sessions — typically 3–4 months of stakeholder engagement before consensus is achievable
Data governance across 20+ domains in a large organisation requires a sustained engagement with data owners who are distributed across the organisation and who have competing priorities
Industry 4.0 strategy requires OT assessment which must use passive-only methods — active assessment in operational environments risks disrupting production
Failed programme rescue requires forensic analysis of what went wrong before any forward-looking strategy can be designed — this cannot be rushed without repeating the mistakes that caused the failure
Investment committee approval processes at enterprise scale involve legal, finance, and risk functions that move on different timescales from the strategy development
Enterprise Requirements
Named CDO, CTO, or equivalent as primary sponsor — digital transformation at enterprise scale requires digital leadership authority, not just IT authority
CEO or COO engagement for portfolio prioritisation: competing C-suite priorities cannot be resolved below this level in an enterprise context
Dedicated transformation programme office: enterprise strategy requires internal programme management capability to coordinate across all participating units
OT engineering team involvement for Industry 4.0 scope: no OT assessment begins without OT engineering sign-off on method and scope
Legal team availability for multi-jurisdiction regulatory assessment throughout the strategy programme

What both parties commit to. What follows when either fails.

Client Obligations
Executive sponsor must be genuinely engaged — not nominally named
A transformation strategy without an engaged executive sponsor produces a strategy document that no one has the authority or motivation to execute. The sponsor must attend the portfolio prioritisation workshops, resolve cross-unit disputes, and sign off the board pack with genuine understanding of what they are approving. A nominally-named sponsor who delegates all engagement to a programme manager does not satisfy this requirement — the programme manager does not have the authority to make the decisions the strategy requires.
If the sponsor is disengaged
We will flag this formally in writing after the first instance. If the pattern continues after the second instance, we will invoke the delay protocol. A strategy built without genuine executive engagement produces a document the organisation will not execute.
Accurate disclosure of current transformation status — including failed initiatives
Organisations sometimes present their current transformation status more positively in the discovery phase than it actually is — because the discovery phase feels like an assessment of the current leadership team’s performance as much as the organisation’s technology state. Initiatives that have stalled, failed, or been cancelled without announcement affect the transformation strategy. Failed initiatives reveal failure modes that the strategy must address. Their existence is relevant and their status must be disclosed accurately.
If failed initiatives are discovered during the engagement
They are incorporated into the assessment — failed initiatives are valuable inputs to the strategy because they identify specific failure modes. If the failure was concealed from the organisation’s own leadership, the strategy will address the failure mode rather than the concealment.
Change management investment must be funded at the level the strategy specifies
Our strategy will include a change management budget recommendation. That recommendation is based on the research linking change management investment to programme ROI. If the organisation funds the change management programme at substantially less than the recommended level, the adoption risk we document will materialise. We will include this risk in the strategy document. If the programme subsequently fails to achieve adoption targets and the change management underfunding is a contributing cause, this was documented as a risk before the programme began.
If the change management budget is reduced from our recommendation
We document the reduction, the adoption risk we assessed, and the client’s decision to proceed at the reduced level. This documentation is part of the final deliverables and cannot be removed.
RJV Obligations
ROI models that include realistic assumptions, not assumptions that support a predetermined conclusion
ROI modelling for transformation programmes involves assumptions: adoption rate, productivity improvement, cost reduction realisation, timeline to benefit. These assumptions are sometimes manipulated to produce the ROI that the business case needs rather than the ROI the programme can realistically deliver. We build ROI models from evidence: comparable programme benchmarks, conservative adoption assumptions, realistic timelines, and explicit sensitivity analysis showing what happens when assumptions are wrong. If the realistic ROI for a specific initiative does not justify the investment, the model will say so.
If you believe our ROI assumptions are too conservative
Raise this within 10 business days of receiving the model. We will review the specific assumption and either defend it with the evidence base or revise it if you present evidence that the more optimistic assumption is better supported. We will not revise assumptions in response to commercial pressure.
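The sensitivity analysis described above can be illustrated with a minimal sketch. All figures below are hypothetical, and the `initiative_roi` function is a simplification for illustration — the actual models we deliver use comparable-programme benchmarks and multi-variable scenarios, not a single adoption parameter.

```python
# Illustrative ROI sensitivity sketch (hypothetical numbers, not client data).
# Shows how one initiative's ROI moves as the adoption-rate assumption varies —
# the kind of explicit sensitivity analysis the strategy's ROI models include.

def initiative_roi(investment, annual_benefit_at_full_adoption, adoption_rate, years):
    """Simple ROI: benefit realised over the period versus up-front investment."""
    realised_benefit = annual_benefit_at_full_adoption * adoption_rate * years
    return (realised_benefit - investment) / investment

investment = 500_000      # hypothetical implementation cost (£)
full_benefit = 300_000    # hypothetical annual benefit at 100% adoption (£/yr)
years = 3

for adoption in (0.5, 0.7, 0.9):
    roi = initiative_roi(investment, full_benefit, adoption, years)
    print(f"adoption {adoption:.0%}: 3-year ROI {roi:+.0%}")
```

Even this toy model makes the point: the same initiative swings from a negative to a strongly positive return purely on the adoption assumption, which is why adoption assumptions must be evidence-based rather than chosen to make the business case work.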
Roadmap sequencing that reflects what is achievable, not what is desired
Transformation roadmaps are frequently presented with more compressed timelines than the organisation’s architecture, data quality, and change absorption capacity can support — because the board wants to see ambitious timelines and programme teams are reluctant to present realistic ones. We will present a realistic roadmap. If the realistic sequencing produces a timeline that is longer than the organisation expected, we will explain specifically why — with reference to architecture dependencies, data quality gaps, and change capacity constraints that are documented in the preceding assessment work.
If the realistic roadmap is significantly longer than expected
We will present alternative scenarios showing what would need to change to accelerate the timeline — additional resources, resolved dependencies, adjusted scope — with the cost and risk implications of each. You choose which scenario to approve. We do not present an optimistic timeline and then manage the consequences of it being unreachable.
Vendor and technology recommendations independent of commercial relationships
Transformation strategy includes vendor selection frameworks and technology recommendations. These are based on the requirements derived from the strategy — the capabilities needed, the integration requirements, the organisational constraints — not on commercial relationships with technology vendors. We do not receive referral fees, implementation commissions, or any other commercial consideration from technology vendors. Where we recommend a specific product or platform, the recommendation is based on documented technical requirements matching documented vendor capabilities. Vendor relationships are declared in writing before the engagement begins.
If we have or develop a commercial relationship with a vendor we have recommended
We disclose this immediately in writing. You may request a revised recommendation from a consultant without the relationship. If you have evidence of an undisclosed relationship, raise it — we will investigate and, if confirmed, remediate the affected recommendations at no cost.

Questions that reveal whether an organisation is ready to commission a transformation strategy

Can you tell us how long our transformation will take and what it will cost?
Not before the strategy is complete. The transformation timeline and total cost depend on the architecture readiness findings, the data quality gaps, the change management constraints, and the portfolio prioritisation decisions — none of which we know before the strategy engagement. Anyone who gives you a transformation timeline before conducting the architecture readiness and data quality assessment is estimating, not calculating. Our strategy produces the timeline and total cost model as outputs, not inputs. The discovery session gives us enough information to estimate the scale of the strategy engagement needed; the strategy engagement produces the programme timeline and cost model.
We have already selected our technology platform. Do we still need a transformation strategy?
Yes, unless the platform selection was preceded by a business outcome definition, architecture readiness assessment, and data governance design — in which case much of the strategy work is already done. If the platform was selected first — before the business outcomes were defined, before the architecture constraints were assessed, before the data quality was evaluated — then the strategy is needed to establish whether the selected platform is the right choice for the outcomes you are trying to achieve and the architectural constraints you are working within. Platform-first transformation is the second most common cause of transformation failure after outcome-free strategy. The strategy also defines what the platform needs to be configured to do, which is often incompletely understood at the time of platform selection.
Our previous transformation strategy didn’t get implemented. Why would this one be different?
This is the most important question to ask before commissioning a new strategy. A strategy that was not implemented failed for a reason. The most common reasons: the executive sponsor changed before the programme started; the ROI was not defined specifically enough to create accountability; the architecture constraints were not addressed before implementation began; the change management programme was not funded; the technology vendor delivered something that did not match the specification. We will ask specifically about the previous strategy’s failure in the discovery session. If the conditions that caused the previous failure have not changed, a new strategy will not produce a different outcome. We will say so if that is what the assessment reveals.
What is the realistic total cost of a transformation programme including execution?
For the Focused tier (5 initiatives, SME): strategy £28,000 + technology implementation typically £200,000–£800,000 + change management programme typically £30,000–£100,000 + first-year operational cost of new platforms typically £20,000–£80,000/year. For the Professional tier (20 initiatives, mid-market): strategy £85,000 + technology programme typically £2M–£15M over 3 years + change management typically £200,000–£600,000 + ongoing operational costs. These are illustrative ranges — the strategy produces the specific model for your programme. The consistent research finding is that organisations that underestimate transformation cost by a factor of 2× are the norm, not the exception. The architecture readiness and data quality findings are typically where the budget surprise originates.
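Summing the Focused-tier ranges quoted above gives a first-year envelope well above the strategy fee alone — a quick sketch of that arithmetic (using only the illustrative ranges from this page; the strategy engagement produces the specific model):

```python
# Illustrative first-year cost envelope for the Focused tier, built from the
# published ranges above. These are page-level illustrations, not a quote.

focused_tier = {
    "strategy": (28_000, 28_000),
    "technology_implementation": (200_000, 800_000),
    "change_management": (30_000, 100_000),
    "year1_platform_operations": (20_000, 80_000),
}

low = sum(lo for lo, hi in focused_tier.values())
high = sum(hi for lo, hi in focused_tier.values())
print(f"Focused tier first-year envelope: £{low:,} – £{high:,}")
```

The spread between the low and high totals is itself the message: the strategy fee is a small, fixed fraction of a programme cost that is dominated by implementation and change management, and which cannot be pinned down before the architecture readiness and data quality findings are in.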
How is this different from what a management consulting firm would produce?
Three structural differences. First: we produce a fixed-price, defined-scope strategy — not an open-ended advisory engagement billed by the day. Second: we do not have a technology implementation practice with a commercial interest in the platform recommendations — our vendor selection frameworks are not influenced by which vendor will generate the most implementation revenue. Third: the strategy is grounded in the actual architecture of your organisation, assessed technically, rather than derived from a generic industry transformation playbook adapted to your name and logo. A management consulting firm’s strength is its pre-built transformation framework; we have no such framework to apply. We start from the structural analysis of your specific situation — the organisation’s constraint set — and build up. This is a slower and more expensive approach to the strategy, but it produces a programme that the specific organisation can actually execute.
Our board is sceptical about the ROI of digital transformation given the 65% failure rate. How do we make the case?
The 65% failure rate is the aggregate rate across all programmes — including those without defined ROI, without architecture readiness assessment, without data governance, and without adequate change management. The rate for programmes that avoid these failures is substantially better. The board’s scepticism is appropriate given the aggregate statistics; the response is not to assert that “our transformation will be different” but to demonstrate specifically why — by presenting the baseline metrics, the precise ROI model per initiative, the architecture readiness evidence, the data governance programme, the change management budget, and the decision gate framework that will evaluate and, if necessary, pause non-performing initiatives before they consume further budget. This is what the strategy produces. A board presented with this evidence is in a materially different position from a board presented with a presentation that says “digital transformation will make us more competitive.”
What are your payment terms?
50% on contract signature, 50% on written acceptance of the final deliverables. No milestone payments during execution. Scope additions are invoiced as agreed in writing before execution — never retrospectively. The final payment is contingent on written acceptance. If a deliverable does not meet the agreed specification, we remediate before raising the final invoice. Execution advisory beyond the included post-delivery support window is billed monthly in arrears for days actually worked at the agreed day rate. Year-1 strategy review (included in Professional tier) is included in the programme fee — no additional invoice.
How does this service relate to the other RJV services?
Digital transformation frequently requires other RJV services as components. Where the transformation includes cloud migration, we produce a cloud transformation strategy component that integrates with the Cloud Migration Strategy service. Where the transformation includes AI deployment, the AI governance design integrates with the AI Systems Engineering service. Where the transformation requires new security architecture, the security requirements feed into the Cybersecurity & Resilience service. Where multiple services are engaged together, we design them as an integrated programme rather than parallel workstreams — which reduces duplication, resolves dependencies between the workstreams, and produces a single coherent architecture rather than multiple separately-designed components that must later be reconciled.

Start with a transformation assessment. Bring your current transformation initiatives — including the ones that have stalled.

A 90-minute session in which we review your current transformation status: what initiatives are active, what their stated objectives are, whether those objectives are defined as business outcomes or technology adoptions, and where the visible constraints and risks are. We also ask about initiatives that have stalled or failed — these are among the most useful inputs to a transformation strategy because they identify the specific failure modes in your specific organisational context.

At the end of the session, you have an honest assessment of what the principal risks in your current transformation approach are, and whether the five structural failure modes are present. This assessment is useful regardless of whether you subsequently engage us for the strategy programme — knowing which failure modes are present is the necessary precondition for addressing them.

Format
Video call or in-person in London. 90 minutes.
Cost
Free. No commitment.
Lead time
Within 5 business days of contact.
Bring
Current transformation initiatives and their stated objectives. Any existing transformation strategy or roadmap. Status of initiatives that have stalled or failed. Current technology estate overview. Executive sponsor’s stated priorities and timeline expectations.
Attendees
Executive sponsor or programme director. Optionally, the IT or digital lead. From RJV: a senior transformation strategist. Not a salesperson.
After
Written summary of session findings within 2 business days. Written scope and fixed price within 5 business days if you want to proceed.