What Is SAP AI Core Actually?
SAP AI Core is often marketed as "AI for enterprise," but the reality is more limited. It's a containerized runtime environment hosted on SAP's Business Technology Platform (BTP) that allows enterprises to deploy and run machine learning models—specifically models trained in Python (scikit-learn, TensorFlow, PyTorch) and stored in SAP's proprietary MLflow-compatible format.
This matters because SAP AI Core is not a machine learning platform itself. It doesn't include data preparation, model training, or automated machine learning (AutoML) capabilities. You bring pre-trained models; SAP AI Core runs them. Think of it as a managed container orchestration layer built on Kubernetes, integrated into BTP and designed to work with SAP's data ecosystem.
How SAP AI Core Fits into SAP's AI Stack
SAP is positioning AI across multiple products:
- SAP AI Core: Runtime for custom ML models (inference and batch scoring)
- SAP Joule: Generative AI copilot for SAP applications (ChatGPT-like assistant for Finance, Supply Chain, HR)
- SAP Analytics Cloud with Predictive Planning: Time series forecasting and planning models
- SAP Data Intelligence: Data pipelines and ETL (required to prep data for AI Core)
- SAP AI Launchpad: Management UI for monitoring, versioning, and deploying AI Core models
The critical insight: none of these works in isolation. To deploy a model in AI Core, you need Data Intelligence to manage data, HANA Cloud for fast inference queries, and Integration Suite to connect everything. This bundling is intentional; SAP's pricing strategy relies on customers adopting the full stack.
What SAP AI Launchpad Provides (and Doesn't)
SAP AI Launchpad is the operational interface for AI Core. It provides:
- Model registry and versioning (tracking deployed models and their configurations)
- Deployment management (promoting models from dev to production)
- Execution monitoring (job logs, resource utilization, performance metrics)
- Integration with GitHub or other CI/CD pipelines for automated model updates
- Basic alerting and audit trails
What it doesn't provide:
- Model training (you train elsewhere, then register models)
- Automated retraining workflows (you manage this via pipelines)
- Advanced experiment tracking (you use MLflow directly or a third-party tool)
- Data governance or lineage (Data Intelligence handles this)
- Performance optimization or tuning (you optimize models before deployment)
Launchpad is essentially a lightweight deployment and operations layer. Calling it a "complete AI platform" is misleading marketing. It's a wrapper around SAP's container infrastructure.
How Consumption Metering Actually Works: BTP Credits
This is where SAP's licensing model becomes opaque. AI Core and Launchpad consumption is metered in BTP credits, not named users or fixed subscriptions. BTP credits are SAP's cloud currency, comparable to the prepaid credit schemes other cloud vendors offer, but with far less transparency about what you're actually paying for.
BTP Credit Consumption Drivers
You're charged credits based on:
- Compute (CPU/Memory): How much CPU and memory your running models consume. Measured in "virtual machine seconds" or "container hours."
- Storage: Model artifacts and execution data stored in BTP's S3-compatible object storage.
- API calls: Inference requests to deployed models (pay-per-call pricing, similar to AWS Lambda).
- Execution hours: Batch jobs that run models against datasets, priced per hour of compute.
- HANA Cloud queries: If you're using AI Core with HANA Cloud for fast scoring, HANA consumption is separate.
- Data Intelligence pipelines: ETL jobs that feed data into AI Core models.
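Under stated assumptions, the drivers above can be combined into a rough monthly cost model. The per-unit rates below are hypothetical placeholders (SAP does not publish them); substitute your negotiated schedule. HANA Cloud and Data Intelligence are metered separately and are not included here:

```python
# Hypothetical per-unit credit rates -- illustrative placeholders only,
# to be replaced with the rate card from your actual SAP agreement.
RATES = {
    "container_hours": 2.0,     # credits per container-hour of running models
    "storage_gb_month": 0.5,    # credits per GB-month of model artifacts
    "inference_calls": 0.001,   # credits per inference API call
    "batch_exec_hours": 4.0,    # credits per hour of batch execution
}

def estimate_monthly_credits(usage):
    """Sum credits across the metered drivers for one month of usage."""
    return sum(usage.get(driver, 0) * rate for driver, rate in RATES.items())

usage = {
    "container_hours": 720,      # one model container running all month
    "storage_gb_month": 50,
    "inference_calls": 1_000_000,
    "batch_exec_hours": 40,
}
print(estimate_monthly_credits(usage))  # 2625.0
```

Even a toy model like this is useful in negotiations: it forces SAP to either confirm or correct your assumed per-unit rates, which is exactly the transparency discussed below.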
The problem: SAP doesn't publish transparent per-credit pricing. You receive quotes like "100,000 credits/month for AI Core deployment" without visibility into what those credits translate to in actual infrastructure. This is fundamentally different from hyperscaler models, where you see exact costs per vCPU-hour or per API call.
The Hidden Cost: Total BTP Spend
Enterprises typically underestimate AI Core costs because they forget dependencies:
- HANA Cloud subscription: 10-30% of overall BTP cost
- Data Intelligence: 15-25% (required for data pipelines)
- Integration Suite: 10-20% (connecting external data sources)
- AI Core/Launchpad: 20-40% of the total stack cost
A $50,000/month AI Core "subscription" often means $150,000-$200,000 in total BTP commitments because you're forced to bundle supporting services.
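The back-of-envelope math behind that range, assuming AI Core's share of total stack cost falls in the 25-33% band implied by the percentages above:

```python
def implied_stack_total(ai_core_monthly, ai_core_share):
    """Back out total BTP spend when AI Core is a known fraction of the stack."""
    return ai_core_monthly / ai_core_share

# A $50,000/month AI Core line item at a 25-33% share of the stack
low = implied_stack_total(50_000, 1 / 3)   # AI Core at a third of total spend
high = implied_stack_total(50_000, 0.25)   # AI Core at a quarter of total spend
print(round(low), round(high))  # 150000 200000
```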
Hidden Dependencies and Lock-In
SAP's bundling strategy creates deliberate lock-in through technical and commercial dependencies.
HANA Cloud Dependency
AI Core is optimized for inference against HANA Cloud databases. If you want real-time scoring of enterprise data (e.g., "is this customer high-risk?"), you're expected to license HANA Cloud. SAP's pitch: "HANA Cloud enables millisecond latency for AI scoring." Truth: HANA Cloud is expensive, and SAP uses AI as a reason to mandate it during renewals.
Data Intelligence Requirement
Data preprocessing and pipeline creation require SAP Data Intelligence. You cannot simply drop a CSV file into AI Core and score it. You need:
- Data Intelligence Modeler for building pipelines
- Metadata explorer for cataloging datasets
- Connection management for pulling data from ERP, cloud sources, third-party systems
Data Intelligence pricing is per-user-month or per-pipeline, typically $5,000-$15,000/month for production deployments. SAP bundles this as "required infrastructure" for AI Core projects.
Integration Suite: The Middleware Tax
To connect external data sources, third-party APIs, or non-SAP systems to AI Core, you need SAP Integration Suite (formerly SAP Cloud Platform Integration Suite). This includes:
- API Management for exposing your models
- Cloud Integration for data ingestion pipelines
- Event Mesh for real-time data feeds
Integration Suite adds another $8,000-$20,000/month to the stack. SAP sells it as "enterprise-grade connectivity," but in practice, enterprises could use cheaper alternatives (MuleSoft, Apache Kafka, Zapier) if not locked into BTP.
The AI Lock-In Problem
SAP AI Core creates switching costs that favor long-term dependency on SAP's ecosystem.
Proprietary Model Format
Models deployed in AI Core are stored in SAP's MLflow wrapper format. While MLflow is open-source, SAP's implementation includes proprietary extensions for versioning, deployment manifests, and BTP integration. Migrating a production model to AWS SageMaker or Azure ML means:
- Exporting model artifacts and re-registering them in the hyperscaler platform
- Rebuilding deployment configurations (different format, different requirements)
- Re-integrating with your data pipelines (no longer using Data Intelligence)
- Revalidating performance in the new environment
It's technically possible, but operationally expensive—often $100,000+ for 10-20 production models. This "migration tax" means enterprises stay with SAP even when costs exceed alternatives.
Joule Dependency
SAP is aggressively positioning Joule (its generative AI assistant) as the standard way users interact with SAP applications. Joule runs on top of AI Core infrastructure. If you're committed to Joule, you're locked into SAP's AI stack—no equivalent exists in AWS or Azure.
Why SAP Is Pushing AI During Renewals
SAP's contract renewal strategy for 2025-2026 increasingly includes mandatory AI add-ons. Here's why:
AI Justifies Price Increases
SAP positions AI capability as justification for 10-20% contract increases during renewal. "You get new AI-powered features in S/4HANA, you get Joule access, and you get the option to deploy custom models in AI Core. All this costs more."
Enterprises face pressure: either accept the AI rider and higher pricing, or go through a competitive procurement process (which SAP discourages via long renewal timelines).
AI Drives BTP Adoption
SAP's long-term strategy is to move customers from on-premise or RISE (cloud ERP with limited services) to deep BTP consumption. AI is the Trojan horse. Once an enterprise commits to AI Core, they're forced to adopt the full BTP stack (HANA, Data Intelligence, Integration Suite), which creates recurring revenue and customer lock-in.
Competitive Positioning Against Hyperscalers
AWS, Azure, and GCP offer superior ML platforms (SageMaker, Azure ML, Vertex AI) at lower costs. SAP's response is to bundle AI into SAP ERP licensing, making it appear "included" rather than optional. This obscures the true cost comparison.
The AI Licensing Model: Clarity You're Not Getting
SAP's official pricing for AI Core is intentionally vague:
- Stated: "BTP credits based on actual consumption"
- Reality: Minimum monthly commitments ($20,000-$100,000 depending on enterprise size) plus usage overages
- Stated: "Launchpad is a free tool on top of Core"
- Reality: Launchpad requires minimum BTP compute commitments; you can't use Launchpad without Core
- Stated: "Optional add-on to your existing BTP deployment"
- Reality: SAP forces bundling of HANA, Data Intelligence, and Integration Suite to make AI workable
This vagueness is deliberate. SAP avoids price transparency to prevent cost comparisons with hyperscalers. You're expected to sign a contract, commit to credits, and discover true costs during implementation.
What CTOs and Cloud Architects Should Evaluate
Before signing an AI Core rider, demand answers to these questions:
1. Total Cost of Ownership (TCO) vs. Hyperscalers
- Get a binding price quote for: AI Core, Launchpad, HANA Cloud, Data Intelligence, Integration Suite (in credits and currency, not hand-wavy "approximately")
- Request a 3-year cost projection with escalation rates
- Compare line-by-line against AWS SageMaker (for training + inference), Azure ML (for model management), and Databricks (for data + ML)
- Include migration costs if you switch from SAP AI Core to a hyperscaler
2. Performance and Latency Requirements
- Test AI Core inference latency against your SLAs. If you need sub-100ms scoring, does AI Core meet it? (HANA Cloud helps, but at cost.)
- Benchmark against AWS Lambda (for lightweight inference) or SageMaker Endpoints (for production models)
- Quantify whether HANA Cloud is truly required or if you can run inference with cheaper alternatives
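A minimal, stdlib-only harness for the latency check above, assuming `predict_fn` is swapped for a real call to your deployed endpoint (the `stub_predict` here just does local arithmetic, so it trivially passes; a networked call would not):

```python
import statistics
import time

def benchmark_latency(predict_fn, payloads, sla_ms=100.0):
    """Time each scoring call and compare the p95 latency to an SLA in ms."""
    latencies_ms = []
    for payload in payloads:
        start = time.perf_counter()
        predict_fn(payload)
        latencies_ms.append((time.perf_counter() - start) * 1000)
    latencies_ms.sort()
    p95 = latencies_ms[int(len(latencies_ms) * 0.95) - 1]
    return {
        "p50_ms": statistics.median(latencies_ms),
        "p95_ms": p95,
        "meets_sla": p95 <= sla_ms,
    }

# Stub standing in for a real call to an AI Core or SageMaker endpoint.
def stub_predict(payload):
    return sum(payload)

report = benchmark_latency(stub_predict, [[1.0, 2.0, 3.0]] * 200)
print(report["meets_sla"])
```

Judging SLAs on p95 (not averages) matters: a mean of 40ms can hide a tail of 300ms calls that breach a sub-100ms requirement.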
3. Data Source and Pipeline Complexity
- How much data are you moving through Data Intelligence monthly? (This drives consumption costs.)
- Can you achieve the same data integration with MuleSoft or Apache Kafka instead of SAP Integration Suite?
- Get a detailed breakdown of Data Intelligence and Integration Suite costs as percentage of total BTP spend
4. Model Portability and Exit Costs
- Request a technical assessment: how long to migrate 10 production models from AI Core to Azure ML or AWS SageMaker?
- What's the cost to re-host inference (BTP vs. hyperscaler)?
- Does your license agreement allow you to export model artifacts and training data if you leave SAP?
5. Team Capacity and Training
- How many data scientists and ML engineers will SAP AI Core require? BTP has a steep learning curve compared to Azure ML or SageMaker.
- Is your team more comfortable with Python + open-source tools (which hyperscalers support) or SAP-specific tooling (which locks you in)?
- Budget for retraining if you switch platforms later
6. Joule Dependency
- Is your requirement truly AI Core, or is it Joule (the conversational AI assistant)? Joule is becoming mandatory during renewals, but you may not need Core to benefit from it.
- If Joule is the main value, negotiate separately—don't bundle Core unless you have a specific use case
7. Escalation and Renewal Terms
- Get written commitment on BTP credit costs for 3 years (with escalation caps, e.g., max 5% annually)
- Push back on "use-it-or-lose-it" clauses: unused credits should roll over, and one-off consumption spikes shouldn't ratchet up your baseline commitment
- Define what happens at renewal if SAP restructures BTP pricing (this has happened)
SAP AI Core vs. Hyperscaler Native AI: A Comparison
| Dimension | SAP AI Core | AWS SageMaker | Azure ML | GCP Vertex AI |
|---|---|---|---|---|
| Model Training | Not included; bring your own | Full training + AutoML | Full training + AutoML | Full training + AutoML |
| Inference Pricing | Per-call + compute; no transparency | Per-vCPU-hour; clear pricing | Per-vCPU-hour; clear pricing | Per-vCPU-hour; clear pricing |
| Integration with SAP ERP | Native; via SAP Integration Suite | Via API; custom work required | Via API; custom work required | Via API; custom work required |
| Data Pipeline (ETL) | Requires Data Intelligence | SageMaker Data Wrangler + AWS Glue | Azure Data Factory | Dataflow + Vertex AI Data |
| Portability | Low (SAP format lock-in) | High (standard container formats) | High (standard container formats) | High (standard container formats) |
| Pricing Transparency | Low; opaque BTP credits | High; detailed cost calculator | High; detailed cost calculator | High; detailed cost calculator |
| Typical Monthly Cost (10 models in production) | $30,000–$80,000 (all-in BTP) | $5,000–$15,000 (inference only) | $5,000–$15,000 (inference only) | $5,000–$15,000 (inference only) |
The verdict: For pure AI/ML workloads, hyperscalers are 40-60% cheaper and far more flexible. You only choose SAP AI Core if you're deeply committed to the SAP ecosystem and need tight integration with SAP ERP or Joule.
Negotiation Strategy: What to Demand in Contracts
If you're locked into SAP and must negotiate AI licensing, use these tactics:
1. Demand Transparency on Per-Credit Costs
Negotiate for a schedule that shows:
- Price per AI Core credit (in USD or local currency)
- Price per Data Intelligence transaction
- Price per Integration Suite API call
- Minimum monthly commitment and how unused credits roll over
If SAP refuses, they're hiding margin. Walk.
2. Cap Annual Cost Escalation
Lock in maximum 3-5% annual increases on BTP credits. SAP often tries 8-10% per year, which compounds to roughly 40% over a 5-year term. Negotiate aggressively here.
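To see why the cap matters, compare cumulative uplift under an 8% escalator versus a 5% cap, assuming four annual increases over a five-year term:

```python
def cumulative_uplift(annual_rate, escalations):
    """Total price increase after a number of compounding annual escalations."""
    return (1 + annual_rate) ** escalations - 1

uncapped = cumulative_uplift(0.08, 4)  # four 8% uplifts over a 5-year term
capped = cumulative_uplift(0.05, 4)    # the same term under a 5% cap
print(round(uncapped * 100, 1), round(capped * 100, 1))  # 36.0 21.6
```

The gap between 36% and 22% cumulative uplift is real money on a six-figure monthly commitment, which is why the cap belongs in the contract, not in a side letter.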
3. Create a "Hyperscaler Comparison Clause"
Insert language like: "If customer demonstrates that an equivalent AI/ML workload on AWS SageMaker costs less than 50% of SAP BTP pricing, SAP will match or improve pricing by X%." This creates accountability.
4. Negotiate Portability Rights
Demand that you retain rights to export:
- All model artifacts in standard formats (ONNX, SavedModel, pickle)
- All training data and metadata
- All pipeline definitions (in standard formats, not proprietary)
If you ever leave SAP, you shouldn't be held hostage by data lock-in.
5. Separate AI Core from Overall ERP Renewal
SAP bundles AI into base ERP pricing to obscure costs. Demand separate line items for:
- ERP subscription
- AI Core services
- BTP infrastructure
This transparency forces SAP to justify each component and makes it easier to swap pieces (e.g., use hyperscaler AI instead of Core).
Real-World Impact: What This Means for Your Business
Let's ground this in a realistic scenario:
Scenario: Manufacturing Company with 5 AI Models
A mid-market manufacturer wants to deploy 5 predictive models: demand forecasting, equipment maintenance prediction, quality scoring, supply chain risk, and customer churn.
Option 1: SAP AI Core (locked into BTP)
- AI Core: $25,000/month
- HANA Cloud: $20,000/month (required for scoring latency)
- Data Intelligence: $15,000/month (for data pipelines)
- Integration Suite: $10,000/month (for external data)
- Total: $70,000/month = $840,000/year
Option 2: AWS SageMaker (hyperscaler)
- SageMaker training + inference: $8,000/month
- Lambda/API Gateway for API exposure: $2,000/month
- AWS Glue for ETL: $3,000/month
- RDS for database (non-SAP): $4,000/month
- Total: $17,000/month = $204,000/year
Annual savings with AWS: $636,000. Over 5 years: $3.2 million.
Even factoring in $200,000 in migration effort, the ROI is massive. But SAP's bundling strategy banks on customers not doing this math—or not having the leverage to negotiate.
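The scenario math above, including the one-time migration estimate, works out as follows:

```python
# Monthly line items from the two options above
sap_monthly = 25_000 + 20_000 + 15_000 + 10_000  # AI Core + HANA Cloud + Data Intelligence + Integration Suite
aws_monthly = 8_000 + 2_000 + 3_000 + 4_000      # SageMaker + Lambda/API Gateway + Glue + RDS
migration_one_time = 200_000                     # one-off migration effort

annual_savings = (sap_monthly - aws_monthly) * 12
five_year_net = annual_savings * 5 - migration_one_time
print(annual_savings, five_year_net)  # 636000 2980000
```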
Is Your AI Licensing Locked In with SAP?
Our license optimization team has helped enterprises unlock $2M+ in cumulative BTP savings by renegotiating AI agreements and identifying alternatives.
See Real Results: How We Reduced a Customer's BTP Spend by 45%
A Fortune 500 financial services company negotiated out of SAP AI Core lock-in and migrated to Azure ML. Read how we structured the exit and achieved $4.8M in 3-year savings.