Practical answers to common setup, security and configuration questions for SAP’s AI Adapter
Sensitive data should not be sent to publicly exposed AI providers. The recommended approach is to use your own grounded API or a private OpenAI license within your organization.
Prompt design is highly flexible and should be adapted to your specific requirements. Prompts are free-text fields, which means you can create simple instructions or more advanced versions depending on your use case. For example, a basic prompt may ask the AI to resolve a technical exception, while a more complex prompt can help generate better results in specialized scenarios.
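As a hedged illustration, a basic and a more advanced prompt might look like the two texts below. Both are examples only, not required formats, and the `${exception.message}` placeholder is assumed here as a Camel simple expression for the current error message:

```python
# Illustrative prompts only -- the prompt field is free text, so any wording works.
# "${exception.message}" is assumed as a Camel simple-expression placeholder.

# Basic: ask the model to resolve a technical exception.
basic_prompt = "Explain the following integration error and suggest a fix: ${exception.message}"

# Advanced: fix the role, scope, and output format to get better results.
advanced_prompt = (
    "You are an SAP Cloud Integration support expert. "
    "Analyse the error below, state the most likely root cause, "
    "and give the resolution as numbered steps.\n"
    "Error: ${exception.message}"
)
```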
The AI adapter supports multiple LLM providers, and Gemini models are included in SAP AI Core. With the correct SAP license, Gemini can be activated and tested. Support for additional LLM providers is planned for future updates.
To configure OAuth credentials, you first activate the AI Core service in your BTP account and create a service instance with service keys. From the service key, you obtain the client ID and client secret. In Cloud Integration, you then create an OAuth client credential security artifact using these details, and this artifact is applied in the AI adapter to enable communication.
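The token exchange behind these steps can be sketched as follows. The helper function, field names, and URLs are placeholders for illustration only; the security artifact performs the token request for you. The sketch simply shows which service-key fields map where:

```python
import base64

def build_token_request(token_url, client_id, client_secret):
    """Assemble an OAuth2 client-credentials token request, mirroring what the
    OAuth client credential artifact does internally (hypothetical helper)."""
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "url": f"{token_url}/oauth/token",
        "headers": {
            "Authorization": f"Basic {basic}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        "body": "grant_type=client_credentials",
    }

# Placeholder values shaped like an AI Core service key
# ("url", "clientid", "clientsecret" fields).
req = build_token_request(
    "https://<subaccount>.authentication.eu10.hana.ondemand.com",
    "sb-example-client", "example-secret",
)
```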
For basic configuration, you do not need to know OpenAI’s API specification or provide an API key, since the AI adapter manages the communication automatically. For advanced configuration, if you are using a custom agent exposed as an API, you will need the endpoint URL and the corresponding API key from OpenAI; the endpoint is then entered as a relative URL in the adapter’s advanced configuration.
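A minimal sketch of how a relative URL resolves against a provider base URL; both values below are placeholders, and the adapter performs this resolution itself:

```python
from urllib.parse import urljoin

# Placeholder values: the connection settings hold the base URL, and the
# advanced configuration supplies the endpoint's relative URL.
base_url = "https://api.openai.com/"
relative_url = "v1/chat/completions"

full_endpoint = urljoin(base_url, relative_url)
# -> "https://api.openai.com/v1/chat/completions"
```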
The following link provides detailed information on configuring the service instance: https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/initial-setup
The prompt field in the adapter is a free-text input that accepts any custom prompt; no specific format is required. The adapter does not alter or influence the provided prompt; it forwards it directly to the LLM in the format the LLM expects.
The response from the AI depends on the prompt you provide, the model you use, and the data the model was trained or grounded on.
There are multiple other use cases that can be implemented beyond the error-handling scenario.
Model responses are based on the training data the model has received. For specific data analysis, there are different ways to ground your AI and train the model, enabling it to support additional use cases.
The AI adapter is Camel 3.x compatible. There should be no impact from BTP/IS runtime upgrades.
The iFlow needs to be configured in order to get a response from the AI. This can be done by adding a Request Reply step (e.g., in an Exception Handling Subprocess) where, using the AI adapter, you provide the right prompt to get the respective error resolution from the AI model.
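As a sketch, the prompt handed to the AI adapter in such a subprocess could be assembled like this. The function name and payload shape are assumptions for illustration, not the adapter’s actual schema:

```python
def build_error_resolution_payload(exception_message):
    """Wrap an exception message in a resolution-request prompt (hypothetical)."""
    prompt = (
        "An integration flow failed with the error below. "
        "Suggest the most likely cause and a step-by-step resolution.\n"
        f"Error: {exception_message}"
    )
    return {"prompt": prompt}

payload = build_error_resolution_payload("HTTP 401 Unauthorized from receiver system")
```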
For more details, you can access our AI Adapter on-demand webinar here.
If you have further questions or would like to find out how the AI adapter can help streamline processes and create value for your organization, please don’t hesitate to reach out to us for support and guidance on implementation.