After working through hundreds of enterprise application portfolios, we've refined our rationalization methodology down to six core steps. Each step builds on the last, and the full sequence — executed properly — produces a prioritized roadmap that a capable IT team can begin executing within 90 days of the first data pull.
What follows is that six-step framework in full detail: what you do in each step, what questions you're answering, and what the output looks like.
Why the Framework Matters More Than the Tools
Many organizations approach application rationalization by purchasing a Software Asset Management tool and expecting it to produce results. SAM tools are useful for inventory and license tracking, but they don't make decisions — and it's the decision logic that determines whether a rationalization initiative generates real savings or simply produces a prettier spreadsheet.
The APM Guru framework is a decision framework, not a software implementation. It can be executed with nothing more than Excel, though it scales effectively into SAM tools and ITSM platforms when the portfolio is large enough to justify them.
Step 1: Discover and Inventory
The first step is building a complete, accurate application inventory. This means cataloging every application your organization pays for, uses, or depends on — including shadow IT that has been adopted outside official IT channels.
Discovery uses multiple data sources in combination: SSO platform logs (Okta, Azure AD, Google Workspace), accounts payable data, network traffic analysis, Software Asset Management systems, cloud billing consoles, and department-level surveys. No single source is complete on its own.
Output: A master application registry with one row per application, capturing: application name, vendor, business owner, primary function, annual cost, licensed seat count, contract renewal date, hosting model, and known integration dependencies.
Success criteria: The inventory is complete when three successive discovery passes (across different data sources) add fewer than 5% new applications to the list.
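The convergence check above is easy to automate. The sketch below assumes each discovery pass yields a set of normalized application names (from SSO logs, accounts payable exports, and so on); the pass contents and the `merge_discovery_pass` helper are illustrative, not part of the framework itself.

```python
# Merge successive discovery passes into one registry and track how much
# each pass grows it. Inventory is "complete" when the three most recent
# passes each add fewer than 5% new applications.

def merge_discovery_pass(registry: set, new_source: set) -> float:
    """Fold one discovery pass into the registry; return the fraction of
    newly discovered applications relative to the pre-pass registry size."""
    before = len(registry)
    registry |= new_source
    added = len(registry) - before
    return added / before if before else 1.0

# Hypothetical passes: SSO logs, accounts-payable data, network scan.
registry: set = set()
passes = [
    {"Slack", "Zoom", "Salesforce", "Jira"},   # pass 1: SSO platform logs
    {"Slack", "Zoom", "QuickBooks", "Figma"},  # pass 2: accounts payable
    {"Zoom", "Jira", "Figma"},                 # pass 3: network traffic
]
growth = [merge_discovery_pass(registry, p) for p in passes]
complete = all(g < 0.05 for g in growth[-3:])
print(len(registry), growth, complete)  # 6 [1.0, 0.5, 0.0] False
```

Here the third pass adds nothing new, but the second still grew the registry by 50%, so discovery would continue until three consecutive passes fall under the 5% threshold.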
Step 2: Gather Usage and Cost Intelligence
Inventory tells you what you have. Usage data tells you whether it's being used. This step focuses on pulling actual utilization metrics for every application in the registry.
Usage Metrics to Collect
- Monthly Active Users (MAU) vs. provisioned/licensed seats — the core utilization ratio
- Last login date — applications with no logins in 90+ days are immediate retirement candidates
- Feature utilization (where available from vendor portals) — are users engaging with the core capabilities, or only a surface-level subset?
- Login frequency distribution — what percentage of users log in daily, weekly, monthly, or never?
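The last-login and utilization-ratio checks combine naturally into an automatic flag for immediate retirement candidates. A minimal sketch, assuming per-app records with `mau`, `seats`, and `last_login` fields (field names and the 90-day threshold follow the text; the rule itself is a simplification):

```python
from datetime import date

def retirement_candidates(apps, today, stale_days=90):
    """Flag apps with no logins in `stale_days`+ days, or zero active users."""
    flagged = []
    for app in apps:
        utilization = app["mau"] / app["seats"] if app["seats"] else 0.0
        stale = (today - app["last_login"]).days >= stale_days
        if stale or utilization == 0.0:
            flagged.append(app["name"])
    return flagged

# Illustrative records, not real portfolio data.
today = date(2024, 6, 1)
apps = [
    {"name": "LegacyCRM", "mau": 0, "seats": 50, "last_login": date(2024, 1, 10)},
    {"name": "Zoom", "mau": 480, "seats": 500, "last_login": date(2024, 5, 31)},
]
print(retirement_candidates(apps, today))  # → ['LegacyCRM']
```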
Cost Intelligence
Move beyond contract cost to full Total Cost of Ownership (TCO). For each application, estimate: vendor license cost, internal IT support cost (hours × loaded rate), integration maintenance cost, hosting/infrastructure cost if applicable, and training cost for onboarding new users.
Compute cost per monthly active user. This is the single most powerful metric in the audit: it translates raw cost data into something directly comparable across applications of different sizes and types.
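The TCO roll-up and cost-per-active-user calculation can be sketched as follows. The cost components mirror the list above; the dollar figures and hourly rate are illustrative assumptions.

```python
def total_cost_of_ownership(license_cost, support_hours, loaded_rate,
                            integration_cost=0.0, hosting_cost=0.0,
                            training_cost=0.0):
    """Annual TCO: vendor license plus internal support, integration,
    hosting, and training costs."""
    return (license_cost + support_hours * loaded_rate
            + integration_cost + hosting_cost + training_cost)

def cost_per_active_user(annual_tco, monthly_active_users):
    """Annual TCO per MAU; None when the app has no active users."""
    return annual_tco / monthly_active_users if monthly_active_users else None

# Hypothetical application: $120K contract, 200 IT support hours/year.
tco = total_cost_of_ownership(
    license_cost=120_000,
    support_hours=200,
    loaded_rate=85,        # $/hour, fully loaded
    integration_cost=15_000,
    training_cost=5_000,
)
print(tco)                              # 157000
print(cost_per_active_user(tco, 400))   # 392.5
```

Note that cost per *active* user, not per licensed seat, is what makes applications of different sizes directly comparable: a cheap contract with ten active users can be far more expensive per user than a large one at full utilization.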
Output: An enriched application registry with utilization rates, cost-per-active-user calculations, and TCO estimates for all significant applications (those above a materiality threshold, typically $10K/year or more).
Step 3: Score the Portfolio
With inventory, usage, and cost data collected, every application is scored across four dimensions on a 1–5 scale:
- Business Value: Strategic alignment, revenue impact, compliance requirements, competitive necessity.
- Technical Health: Vendor support status, security posture, architecture modernity, integration complexity.
- Utilization: Monthly active user rate as a percentage of licensed seats. Below 40% = 1. Above 80% = 5.
- Cost Efficiency: Cost per active user relative to peer benchmarks and alternative solutions. Benchmarks vary by category — we maintain a database of per-seat cost benchmarks across 150+ software categories.
Scores are combined into a composite using weights that reflect organizational priorities. A cost-focused initiative might weight Utilization and Cost Efficiency more heavily. A security-focused program might weight Technical Health. For most general-purpose rationalization efforts, equal weighting across all four dimensions works well.
APM Guru Benchmark: Across the portfolios we've audited, retain-tier applications average composite scores of 3.8 or higher, consolidation candidates cluster between 2.5 and 3.2, and retirement candidates average below 2.2. If your portfolio average is below 3.0, you have significant rationalization opportunity.
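The scoring step reduces to a weighted sum plus a tier lookup. The sketch below uses the equal weighting suggested for general-purpose efforts; the tier cut-offs are assumptions loosely derived from the benchmark figures above, not exact framework thresholds.

```python
# Equal weighting across the four dimensions (adjust to reflect priorities).
WEIGHTS = {"business_value": 0.25, "technical_health": 0.25,
           "utilization": 0.25, "cost_efficiency": 0.25}

def composite_score(scores: dict) -> float:
    """Weighted composite of the four 1-5 dimension scores."""
    return sum(scores[dim] * w for dim, w in WEIGHTS.items())

def tier(composite: float) -> str:
    """Illustrative tier cut-offs based on the benchmark averages."""
    if composite >= 3.8:
        return "retain"
    if composite >= 2.5:
        return "consolidate"
    return "retire"

# Hypothetical application scores on the 1-5 scale.
app = {"business_value": 4, "technical_health": 3,
       "utilization": 5, "cost_efficiency": 4}
print(composite_score(app), tier(composite_score(app)))  # 4.0 retain
```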
Step 4: Map Functional Overlaps
Scoring identifies weak applications. Overlap mapping identifies redundant ones — which is where the largest consolidation savings typically live.
Group all applications by function (Communication, Collaboration, Project Management, HR & People, CRM, Analytics, Finance, Security, Industry-Specific, Infrastructure). Within each functional group, count how many applications serve the same or overlapping capability.
For each overlap, evaluate:
- Which application has the stronger scores across the four dimensions?
- Which has the larger, more active user base?
- What would it cost to migrate users from the weaker system to the stronger one?
- Are there integration dependencies that make migration complex?
The goal is to identify "consolidation pairs" — cases where one application can absorb the function of another, making the second redundant. Each consolidation pair is a retirement opportunity.
Output: A functional overlap matrix showing all redundant application pairs and estimated consolidation savings for each.
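A minimal sketch of the overlap pass: group scored applications by functional category, pair every application in a group with every other, and treat the higher-scoring application as the survivor, with the weaker app's annual cost as a rough savings estimate. The records and the survivor rule are simplified assumptions; real pairing would also weigh migration cost and integration dependencies.

```python
from itertools import combinations
from collections import defaultdict

def consolidation_pairs(apps):
    """Yield (survivor, retire_candidate, estimated_savings) triples
    within each functional group."""
    by_function = defaultdict(list)
    for app in apps:
        by_function[app["function"]].append(app)
    pairs = []
    for group in by_function.values():
        for a, b in combinations(group, 2):
            survivor, candidate = (a, b) if a["score"] >= b["score"] else (b, a)
            # Rough savings estimate: the weaker app's annual cost.
            pairs.append((survivor["name"], candidate["name"],
                          candidate["annual_cost"]))
    return pairs

# Illustrative scored registry rows.
apps = [
    {"name": "Zoom",  "function": "Communication", "score": 4.2, "annual_cost": 90_000},
    {"name": "WebEx", "function": "Communication", "score": 2.6, "annual_cost": 60_000},
    {"name": "Jira",  "function": "Project Management", "score": 4.0, "annual_cost": 45_000},
]
print(consolidation_pairs(apps))  # [('Zoom', 'WebEx', 60000)]
```

Jira produces no pair because it is alone in its functional group, which is the point: savings live in the overlaps, not in the portfolio's size.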
Step 5: Validate with Stakeholders
Data tells most of the story. Business owners tell the rest. This step involves structured conversations with the business owner of every application flagged for retirement, consolidation, or migration.
Each conversation follows a structured agenda:
- Present the data: Share utilization rates, cost figures, and composite score. Let the data lead.
- Listen for context: Is there seasonal use you missed? A project dependency? A planned expansion?
- Discuss alternatives: Can its functions be served by another existing system? Is the business owner aware of the overlap?
- Establish timeline: If a retirement or migration is agreed, what's a realistic transition timeline?
"The stakeholder conversations almost always validate the data. The ones that don't — those are the conversations where you learn something that actually matters."
In roughly 15% of cases, stakeholder conversations reveal context that changes a recommendation — a planned expansion, a compliance requirement, an integration that's more critical than expected. That 15% is why this step matters: it catches the decisions where the data alone would have been wrong.
Step 6: Build and Execute the Roadmap
The final step is converting validated recommendations into an execution roadmap — a time-phased plan for retiring, consolidating, migrating, and renegotiating across the portfolio.
Roadmap Structure
- Wave 1 (Days 1–90): Immediate retirements (applications with near-zero usage, no dependencies, expiring contracts) and quick renegotiations. Target: 30–50% of total savings opportunity.
- Wave 2 (Days 90–180): Simple consolidations that require user migration, including training, change management, and offboarding of the retired system. Target: additional 20–30% of savings.
- Wave 3 (Days 180–365): Complex migrations, legacy platform replacements, and high-integration consolidations. Target: final 20–30% of savings, plus a governance process to prevent future sprawl.
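Wave assignment from validated dispositions can be expressed as a simple rule set. The sketch below follows the Wave 1–3 criteria above, but the specific fields (`disposition`, `dependencies`, `integration_count`) and thresholds are simplifying assumptions.

```python
def assign_wave(app):
    """Map a validated disposition to an execution wave (1-3)."""
    if app["disposition"] == "retire" and not app["dependencies"]:
        return 1  # immediate retirement: near-zero usage, no dependencies
    if app["disposition"] in ("consolidate", "retire") and app["integration_count"] <= 2:
        return 2  # simple consolidation with light migration work
    return 3      # complex migration or legacy platform replacement

# Illustrative dispositions from the validation step.
apps = [
    {"name": "LegacyCRM", "disposition": "retire",      "dependencies": False, "integration_count": 0},
    {"name": "WebEx",     "disposition": "consolidate", "dependencies": True,  "integration_count": 1},
    {"name": "OldERP",    "disposition": "replace",     "dependencies": True,  "integration_count": 8},
]
print([(a["name"], assign_wave(a)) for a in apps])
# [('LegacyCRM', 1), ('WebEx', 2), ('OldERP', 3)]
```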
Governance Mechanism
The roadmap isn't the end of the journey — it's the beginning of a practice. Every rationalization initiative should conclude with a governance mechanism in place: an application intake process that requires IT review before new software is adopted, and an annual portfolio review cycle that applies the same framework to the updated portfolio.
Output: A fully documented rationalization plan: application dispositions for every item in the portfolio, Wave 1–3 execution sequence with owners and timelines, projected savings by wave, and a governance framework for ongoing portfolio management.
How Long Does the Process Take?
For a mid-size enterprise (500–2,000 employees, 150–400 applications), the full six-step process typically takes 8–12 weeks from kick-off to final roadmap delivery. Larger organizations with more complex portfolios typically need 12–16 weeks. The timeline is driven primarily by data availability and stakeholder scheduling — the analytical work itself is faster when data is accessible.
What the Framework Produces
At the conclusion of a full six-step engagement, an organization has:
- A complete, scored, and validated application registry
- A documented rationalization roadmap with specific application dispositions
- Projected savings by wave (typically $1M–$10M+ annually for mid-to-large enterprises)
- An application governance framework to prevent future sprawl
- Business-owner alignment across all major recommendations
Ready to run the APM Guru framework on your portfolio? Book a free initial portfolio assessment →
