SAP HANA Developer Guide For SAP HANA Web Workbench
This guide explains how to build applications using SAP HANA, including how to model data, how to write
procedures, and how to build application logic in SAP HANA Extended Application Services, classic model.
The SAP HANA Developer Guide for SAP HANA Web Workbench explains the steps required to develop, build,
and deploy applications that run in the SAP HANA XS classic model run-time environment using the tools
provided with the browser-based SAP HANA Web-based Workbench. It also describes the technical structure
of applications that can be deployed to the XS classic run-time platform.
SAP HANA Extended Application Services (SAP HANA XS) provides the SAP HANA Web-based Development
Workbench that you can use to build and test development artifacts in the SAP HANA environment.
The SAP HANA Web-based Development Workbench allows you to develop entire applications in a Web
browser without having to install any development tools and is therefore a quick and easy alternative to using
the SAP HANA studio for developing native applications for SAP HANA XS. It provides an intuitive user
interface and simplifies development by providing many convenient functions. For example, it includes a wizard
for creating applications and automatically generates the application descriptors (the .xsapp and .xsaccess
files) that SAP HANA applications are required to have.
The SAP HANA Web-based Development Workbench is available on the SAP HANA XS Web server at the
following URL:
http://<WebServerHost>:80<SAPHANAinstance>/sap/hana/ide
Note
The SAP HANA Web-based Development Workbench supports Microsoft Internet Explorer (10+), Mozilla
Firefox, and Google Chrome Web browsers.
Required Roles
Before you start using the SAP HANA Web-based Development Workbench, the SAP HANA administrator must
set up a user account for you in the database and assign the developer roles you require for the specific tools:
Editor: Inspect, create, change, delete, and activate SAP HANA repository objects (role: sap.hana.ide.roles::EditorDeveloper)
Catalog: Create, edit, execute, and manage SQL catalog artifacts in the SAP HANA database (role: sap.hana.ide.roles::CatalogDeveloper)
Traces: View and download SAP HANA trace files and set trace levels (role: sap.hana.ide.roles::TraceViewer)
Note
The parent role sap.hana.ide.roles::Developer allows you to use all tools included in the SAP HANA Web-based Development Workbench. To use the debugging features, however, you also need to be assigned the sap.hana.xs.debugger::Debugger role.
Related Information
SAP HANA Web-based Development Workbench includes an all-purpose editor tool that enables you to
maintain and run design-time objects in the SAP HANA Repository.
The Editor component of the SAP HANA Web-based Development Workbench provides a browser-based
environment for developing SAP HANA Extended Application Services (SAP HANA XS) repository artifacts. To use the
Editor you must have the privileges granted by the role sap.hana.ide.roles::EditorDeveloper or the parent role
sap.hana.ide.roles::Developer.
Note
The Web-based Editor tool is available on the SAP HANA XS Web server at the following URL: http://
<WebServerHost>:80<SAPHANAinstance>/sap/hana/ide/editor
Feature Overview
In addition to the basic functions of creating, editing, and executing repository objects, the Editor provides
additional features, which are described briefly in the table below.
Feature Description
Multi-file drop zone: You can upload multiple files at once to a repository package by dragging and dropping the selected files (from your desktop, for example) into the area marked as the multi-file drop zone. The multi-file drop zone is visible when a package is selected in the package tree.
Multiple editors: You can open multiple editor tabs or you can open an editor as its own browser tab.
File history: The tabs (files) you opened during your last session reopen when you log in to a new session.
Version history: You can view a list of file versions, compare one version of a file with another one, and revert a file to a previous version.
Templates: You can use templates for standard SAP HANA XS applications, including SAPUI5 applications and SAP Fiori applications. You can also use code snippet templates for individual files, such as .hdbtable, .hdbschema, .xsjs, .xsodata, and .hdbprocedure files.
Automatic syntax highlighting and code completion: Syntax highlighting of source code, code completion options, and checks for syntax errors are available for most artifact types.
ESLint code checks: The JavaScript editor includes the ESLint validation tool, which highlights any code that does not conform to ESLint standards. You can configure or disable the check in the editor settings.
Inactive save and object execution: You can save and execute an inactive version of your artifact before activating it to make it available for others to use. You can enable this functionality in the editor settings.
Direct testing: You can test HTML pages and XSJS services directly from the editor in the browser. The application preview allows you to test HTML pages with various form factors.
Tip
The direct-testing feature adds a timestamp as a URL parameter to bypass caching. However, internal Ajax calls in your application might be cached by the browser. Try clearing the browser-specific cache if you do not immediately see your changes reflected on execution.
Debugging: You can use the integrated debugging features to debug your application. Note that to be able to use the debugging features, you need to be assigned the sap.hana.xs.debugger::Debugger role.
Immediate feedback: This option lets you inspect the execution of an XSJS function, including execution times for SQL queries.
Function flow: A code flow visualizer that shows you which JavaScript functions are called in a file and allows you to navigate between them.
JSDoc: You can generate JSDoc documentation for XSJS, XSJSLIB, and JS files.
Related Information
Set your preferences for working with the Editor tool of the SAP HANA Web-based Development Workbench.
You can open the Editor Settings dialog box by choosing (Settings) in the toolbar.
Option Description
Editor font size Use the slider to adjust the font size.
Editor background dark Select this checkbox to change the background color of the text editor to black.
Code Check
Configure the ESLint code check, which is included in the JavaScript editor, on the General tab.
Option Description
Code check level: Select a new code check level or disable the code check entirely by selecting Disable.
Disable code check on change: Select this checkbox so the code check is not triggered every time you make a change to your code.
Inactive Save and Object Execution
Enable these features on the General tab to be able to save and execute inactive versions of your artifacts before activating them to make them available for others to use.
Option Description
Enable inactive object execution: Lets you run inactive versions of an object (available as a toolbar button).
Note
In the version history of any file that has an inactive version, the inactive version is marked with [l] and is
listed above the active versions. You can compare the inactive version with other versions.
Select a debug session when you want to debug JavaScript (XSJS and XSJSLIB) code from a different session
(for example, a different application or browser instance). The debugger will be connected to the selected
session.
Option Description
Session to debug The session you want to use for debugging JavaScript (XSJS and XSJSLIB) files.
Auto-Save
On the Auto-Save tab, you can enable the auto-save feature to automatically save the editor state to the local
browser cache at preset intervals. Use this option to avoid losing work due to server session timeout, browser
crash, or accidental closure.
Option Description
Enable client-side auto-save: Select this checkbox to save changes in all open files automatically at regular intervals. Changes in all open files are saved automatically at the chosen interval and whenever you close a file or switch to another tab.
Caution
● All cached file contents are stored unencrypted in the browser's local storage and can therefore be
viewed by anyone else using the computer.
● The cache will be cleared as soon as you disable the auto-save option.
Note
In the version history of any file that has a locally cached version, the locally cached version is marked with
[L] and is listed above the inactive version and active versions. You can compare the locally cached version
with other versions and restore it.
SQL Result
On the SQL Execution tab, use the following options to determine how SQL query results are displayed in the
result table on the Result tab of the SQL console.
Limit for LOB columns (bytes): Enter the limit in bytes for LOB columns displayed in the result table. Values exceeding this limit will be truncated.
Representation of null values: Change the character used to represent NULL values in the result table. The default value is "?".
Maximum result size: Enter the maximum number of records to be fetched from the database and displayed in the result table.
Related Information
In the editor, you can use common shortcut key codes to make quick edits in open files. Whether a feature is
supported depends on the artifact type.
Add multi-cursor below: CTRL + ALT + DOWN ARROW (Windows), CTRL + Option + DOWN ARROW (macOS)
Add next occurrence to multi-selection: CTRL + ALT + RIGHT ARROW (Windows), CTRL + Option + RIGHT ARROW (macOS)
Add previous occurrence to multi-selection: CTRL + ALT + LEFT ARROW (Windows), CTRL + Option + LEFT ARROW (macOS)
Copy lines down: ALT + SHIFT + DOWN ARROW (Windows), Command + Option + DOWN ARROW (macOS)
Move multicursor from current line to the line above: CTRL + ALT + SHIFT + UP ARROW (Windows), CTRL + Option + SHIFT + UP ARROW (macOS)
Move multicursor from current line to the line below: CTRL + ALT + SHIFT + DOWN ARROW (Windows), CTRL + Option + SHIFT + DOWN ARROW (macOS)
Remove current occurrence from multi-selection and move to next: CTRL + ALT + SHIFT + RIGHT ARROW (Windows), CTRL + Option + SHIFT + RIGHT ARROW (macOS)
Remove current occurrence from multi-selection and move to previous: CTRL + ALT + SHIFT + LEFT ARROW (Windows), CTRL + Option + SHIFT + LEFT ARROW (macOS)
Select to line end: ALT + SHIFT + RIGHT ARROW (Windows), Command + SHIFT + RIGHT ARROW (macOS)
Select to line start: ALT + SHIFT + LEFT ARROW (Windows), Command + SHIFT + LEFT ARROW (macOS)
Select word left: CTRL + SHIFT + LEFT ARROW (Windows), Option + SHIFT + LEFT ARROW (macOS)
Select word right: CTRL + SHIFT + RIGHT ARROW (Windows), Option + SHIFT + RIGHT ARROW (macOS)
The SAP HANA Web-based Development Workbench Editor provides quick access to a variety of useful
developer tools.
Toolbar
Settings: Opens the dialog box for maintaining editor settings, such as inactive save and object execution, auto-save, and the ESLint code check.
Navigation Links: Provides direct navigation to other tools: Catalog, Security, Trace, Lifecycle Management.
Menu: Provides a dropdown menu with file, edit, search, view, and tools menu options.
Find File: Searches within the Content tree for the specified file. The search includes both inactive and activated artifacts. The file name is case-sensitive.
Insert Snippet: Inserts the appropriate code snippets and templates for the selected artifact type.
Run: Runs HTML pages and XSJS services directly in the browser, and allows you to test HTML pages with various form factors in an application preview.
Assign Execution Authorization: Assigns schema privileges to your user; by default, EXECUTE, SELECT, INSERT, UPDATE, and DELETE.
Maintain Credentials/Details: Goes directly to the related runtime configuration in the SAP HANA XS Administration Tool, such as security and authentication, XS jobs, and HTTP destinations.
Show Function Flow: Shows an outline view of all functions in the selected file.
Show Immediate Feedback: Opens a panel for inspecting the execution of an XSJS function, including execution times for SQL queries.
Show/Hide SQL Result: Shows or hides the result table displayed for SQL queries.
Context Menu
Some additional tools and features that are available in the context-sensitive menu for the different artifact
types are listed in the table below.
Packages
Import -> File: Import a file from your local file system to the selected package.
Import -> Archive: Import an archive (ZIP) from your local file system to the selected package.
Export Package: Download the package as a ZIP file to your local file system.
Delivery Unit -> Assign: Assign the selected delivery unit to the package and sub-packages (optional).
Delivery Unit -> Unassign: Unassign the delivery unit from the selected package and sub-packages.
Change Package Attributes: Display and edit details of the selected package, for example, the package description and the person responsible for the package's creation and maintenance.
Create Application: Create an SAP HANA XS application using a standard template (use the empty template if you just want to generate the application descriptors).
Synchronize with GitHub: Synchronize changes made to local file versions with the versions of the files on GitHub.
Activate All: Generate catalog object versions for all inactive artifacts in the currently selected package.
Force Delete: Use force delete when normal deletion is prevented because the package contains inactive objects from a workspace other than the standard workspace used by the SAP HANA Web-based Development Workbench. The files in the other workspaces are deleted as well.
Search Text: Search for specific text within files contained in your package structure and optionally replace it with other text. The search results are displayed in a separate panel, where each row represents a file in which the search string was found. Click a row multiple times to cycle through the occurrences in a file. To replace text, use the Previous hit, Next hit, Replace, and Replace all buttons.
Generate JSDoc: Generate JSDoc documentation for XSJS, XSJSLIB, and JS files contained in the selected package.
Copy Shortcut: Open the shortcut dialog box to copy the file shortcut.
Cut, Copy, Paste: Cut and paste or copy and paste the selected artifact at the location you require in the package structure. Manually correct any inconsistencies resulting from this action.
Rename: Rename the selected package or file. Manually correct any inconsistencies resulting from the renaming.
Files
Open With -> Code Editor: Open the selected file in the appropriate editor.
Versions: Display the complete list of file versions available for the selected item. Right-click an entry in the history list to compare two versions of the file or restore a file version.
Activate: Generate a catalog object version for the currently selected inactive artifact. When you activate an artifact, the XS repository also tries to activate dependent objects.
Activate without Cascade: Activate the selected artifact only. This is useful for artifacts that have dependencies on others, in particular where there are complex dependency chains.
Revert: Restore the latest active version of the file. In cut-and-paste cases, the file does not yet have an active version and is therefore simply deleted.
Related Information
The text search functionality requires the full-text index queues to be active; they are active by default. As an administrator, you can check the status of the queues and restart them, if necessary, using the commands listed below.
Use the M_FULLTEXT_QUEUES view to check the full-text index queue status and SYS.FULLTEXT_INDEXES to
view the configuration of a full-text index.
Command
Use the ALTER FULLTEXT INDEX statement to suspend or reactivate the queue.
Command
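As an illustrative sketch of these commands, run from the SQL console (the index name "MY_SCHEMA"."MY_FT_INDEX" is a placeholder, not an object from this guide):

```sql
-- Check the current status of the full-text index queues:
SELECT * FROM SYS.M_FULLTEXT_QUEUES;

-- View the configuration of existing full-text indexes:
SELECT * FROM SYS.FULLTEXT_INDEXES;

-- Suspend the queue of a specific full-text index:
ALTER FULLTEXT INDEX "MY_SCHEMA"."MY_FT_INDEX" SUSPEND QUEUE;

-- Reactivate the queue so that text search works again:
ALTER FULLTEXT INDEX "MY_SCHEMA"."MY_FT_INDEX" ACTIVATE QUEUE;
```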
Related Information
M_FULLTEXT_QUEUES
FULLTEXT_INDEXES
SAP HANA Web-based Development Workbench includes a catalog tool that enables you to develop and
maintain SQL catalog objects in the SAP HANA database.
The Catalog component of the SAP HANA Web-based Development Workbench contains the database objects
that have been activated, for example, from design-time objects or from SQL DDL statements. The objects are
divided into schemas, which is a way to organize activated database objects. To use the Catalog you must have
the privileges granted by the role sap.hana.ide.roles::CatalogDeveloper or the parent role
sap.hana.ide.roles::Developer.
The Web-based Catalog tool is available on the SAP HANA XS Web server at the following URL: http://
<WebServerHost>:80<SAPHANAinstance>/sap/hana/ide/catalog
Feature Overview
In addition to the basic tools for creating, editing, and executing database objects, the Catalog provides
additional features, which are described briefly in the table below.
Feature Description
Catalog objects: The catalog browser allows you to import and export catalog objects, filter on the catalog tree, search for specific catalog objects, and display a where-used list.
Favorites: You can save your frequently used objects for easier access.
SQL console: You can execute SQL statements (such as CREATE, SELECT, INSERT, UPDATE, DELETE, GRANT). You can also generate SQL statements, for example, to create, select, or insert entries in database tables or invoke procedures or functions.
Code completion: In the SQL editor you can use the semantic code completion feature, a context-based search tool that lists suggested catalog objects and local variables.
Definition view: Displays the definition of runtime objects, for example, stored procedures, functions, tables, or views.
Content view: Displays the content of tables or views. You have additional options, such as copying rows or cells, changing layouts, and exporting data as CSV files.
Debugging: The SQL debugger allows you to set breakpoints in the runtime object and call it from the SQL console.
Performance analysis: The performance analysis option allows you to acquire performance measurement data while executing a SQL statement to assess whether the statement is problematic.
Related Information
Set your general preferences for working with the Catalog tool of the SAP HANA Web-based Development
Workbench.
You can open the Settings dialog box by choosing (Settings) in the toolbar.
Option Description
Editor font size Use the slider to adjust the font size.
Available editor themes include SAP Morlock (a dark theme, but more colorful and vivid than SAP Basement).
Auto-Save
Enable the auto-save feature to automatically save SQL console contents to the local browser cache.
Option Description
Auto-save SQL console to local storage: Select this checkbox to automatically save SQL console contents to the local storage.
Caution
● All cached file contents are stored unencrypted in the browser's local storage and can therefore be viewed by anyone else using the computer.
● The cache will be cleared as soon as you disable the auto-save option.
Use the following options to determine how SQL query results are displayed in the result table on the Result tab
of the SQL console and in the data preview table.
Option Description
Limit for LOB Columns (Bytes): Enter the limit in bytes for LOB columns displayed in the result table and data preview table. Values exceeding this limit will be truncated.
Representation of Null Value: Change the character used to represent NULL values in the result table and data preview table. The default value is "?".
Maximum Result Size: Enter the maximum number of records to be fetched from the database and displayed in the result table and data preview table.
Browser
Use the following option to retrieve a limited number of objects in the catalog browser.
Option Description
Limit the number of objects to retrieve: Enter the maximum number of objects to be displayed by the browser. For example, if you enter 10, a maximum of 10 schemas will be displayed and under each schema a maximum of 10 tables, 10 procedures, 10 table types, and so on. If the number of available objects exceeds the number specified here, the message Object limit <number> reached appears.
The SQL console provides an SQL editor that allows you to work with SQL statements. You can enter, execute,
and analyze SQL statements in the SQL console.
Each SQL console works with its own schema and parameters. You can see with which schema the SQL
console is currently associated in the now editing section on the screen, as shown in the example below:
The current schema is used when SQL statements use database object names, for example, table names, that
are not prefixed with a schema name.
Tip
You can change the current schema by dragging and dropping a schema from the catalog tree into the now
editing section.
● You can write SQL syntax elements in upper case or lower case.
● You can add any number of spaces and line breaks.
● To force the system to distinguish between upper-case and lower-case letters in database object names
(such as table names), enter the name between double quotation marks, for example, "My_Table".
● Separate multiple SQL statements with a semicolon (;).
● To comment out a line, use -- (double hyphens) at the start of the line.
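The conventions above can be illustrated with a short SQL console sketch (MY_SCHEMA, MYTABLE, and "My_Table" are placeholder names, not objects from this guide):

```sql
-- Unquoted object names are case-insensitive; without a schema prefix,
-- the name is resolved against the current schema of the SQL console:
select * from mytable;

-- A schema prefix makes the statement independent of the current schema:
SELECT * FROM MY_SCHEMA.MYTABLE;

-- Double quotation marks force case-sensitive matching of the name:
SELECT * FROM MY_SCHEMA."My_Table";
```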
The code completion feature provides a list of syntactic and semantic proposals based on the given context
and textual input. Code completion proposals include the following:
● Local variables, such as input and output parameters, declared scalar variables
● Catalog objects, such as schemas, views, table functions, procedures, scalar functions, synonyms
● Code snippets
● Keywords
To open a list of proposals, press the key combination CTRL + SPACEBAR at the appropriate point in your code
and then use the arrow keys to scroll through the displayed list. Press ENTER on a highlighted entry to apply it
to your code.
Press CTRL and hover over a table, view, or procedure in the SQL console to display the definition of the object.
Execute SQL statements by choosing (Run) in the toolbar. If you have entered several statements, you can
execute them individually by highlighting the statement and choosing (Run). If you do not highlight an
individual statement, all statements are executed.
The Result tab appears with the statement's results. Multiple Result tabs may open depending on the number
of statements executed.
You can open, save, and download SQL console contents as SQL files.
Note that you can enable auto-save for SQL console contents in the catalog settings.
The results of a SQL select statement or procedure call are shown in a result table on the Result tab. The
number of tabs shown depends on the number of statements you have executed.
The content view lets you preview the data contained in a table. It also shows the SQL query that was used to
retrieve the data.
To open the data preview for a table or view, locate the object in the catalog structure and, in the context menu,
choose Open Content.
Table Actions
Copy selected rows: Opens a dialog box that allows you to copy the selected rows. If you want, you can change the delimiter and remove the header.
Change layout: Opens a dialog box for selecting the columns to be displayed.
Export: Exports the table data as a CSV file. You have the option of changing the delimiter.
Setting table properties: Opens a dialog box for configuring table properties.
Maintain table data (data preview only): Lets you insert, edit, and delete table data.
Add filter (data preview only): Provides an advanced filter option with a list of filter operators.
Configuration Settings
SAP HANA Web-based Development Workbench includes a trace tool that enables you to view the SAP HANA
trace files.
The Trace component of the SAP HANA Web-based Development Workbench provides a browser-based
environment for viewing trace files and configuring traces. To use the Trace component, you must have the
privileges granted by the role sap.hana.ide.roles::TraceViewer or the parent role sap.hana.ide.roles::Developer.
Note
The Web-based Trace tool is available on the SAP HANA XS Web server at the following URL: http://
<WebServerHost>:80<SAPHANAinstance>/sap/hana/ide/trace.
● XS application trace
Provides tracing functionality for SAP HANA XS applications. Application-specific trace messages are
written into a trace file according to the trace level you specify, for example, "Info", "Error", "Debug".
● SQL trace
Collects information about all SQL statements executed on the XS Engine and saves it in a trace file for
further analysis.
● Database Trace
Records information about activity in the components of the SAP HANA database. You can use this
information to analyze performance and to diagnose and debug errors.
● Plan trace
Allows you to collect SQL queries and their execution plans, executed in a given time frame for a particular
application session.
The trace tool allows you to manually delete trace files you no longer require. You have the following options:
In the toolbar, choose (Delete Trace Files) to open the Delete Trace Files dialog box.
Note
Trace files of running services or an active SQL trace cannot generally be deleted.
Related Information
Set your preferences for working with the Trace tool of the SAP HANA Web-based Development Workbench.
You can open the Settings dialog box by choosing (Trace Settings) in the toolbar.
Option Description
Editor font size Use the slider to adjust the font size.
File Display
Use this option to determine which part of the trace file should be displayed by default.
Option Description
Display Part Specify either the start of the file, end of the file (default), or the entire file.
Display Lines The number of lines to be displayed from either the start of the file or the end of the file.
Browser
Use this option to retrieve a limited number of objects in the trace browser.
Option Description
Limit the number of SQL objects to retrieve Enter the maximum number of objects to be retrieved.
Text Highlight
Option Description
Error Color Text Enter the text you want to be highlighted in the error color.
Warning Color Text Enter the text you want to be highlighted in the warning color.
Preview Shows how the text will be highlighted. Press ENTER to update the preview.
The database trace records information about activity in the components of the SAP HANA database. You can
use this information to analyze performance and to diagnose and debug errors.
Each service of the SAP HANA database writes to its own trace file. The file names follow the default naming
convention:
<service>_<host>.<port_number>.<3_digit_file_counter>.trc.
Example
indexserver_veadm009.34203.000.trc
Alert trace files follow the naming convention <service>_alert_<host>.trc.
You can access database trace files in the Trace Files tree on the left of the Trace screen.
You configure the database trace in the Database Trace section on the Trace Configuration tab.
Database tracing is always active. Information about error situations is always recorded.
If a trace component is available in all services, it can be configured for all services at once. You can also
configure the trace level of a component individually for a specific service. The trace level of a component
configured at service level overrides the trace level configured at the ALL SERVICES level. Some components
are only available in a particular service and cannot be changed at the ALL SERVICES level.
In the Database Trace Configuration dialog box, a trace level that has been inherited from the ALL SERVICES
configuration (either the default or system configuration) is shown in brackets.
Not all trace components are visible by default in the Database Trace Configuration dialog box. To view all
additional components, select Show All Components.
Example
You change the trace level of the memory component to ERROR for all services and, for the indexserver service, to WARNING. This means that the memory component of the indexserver service will trace up to level WARNING, while the memory component of all other services will trace up to level ERROR.
● NONE (0)
● FATAL (1)
● ERROR (2)
● WARNING (3)
● INFO (4)
● DEBUG (5)
The higher the trace level, the more detailed the information recorded by the trace.
Note
Even if you select NONE, information about error situations is still recorded.
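Trace levels can also be set with SQL instead of the Database Trace Configuration dialog box. A sketch for the example above, assuming the memory component of the indexserver service (the component and level shown are examples):

```sql
-- Set the trace level of the memory component for the indexserver service:
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  SET ('trace', 'memory') = 'warning' WITH RECONFIGURE;

-- Remove the service-specific setting again so that the level
-- configured at the ALL SERVICES level applies once more:
ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
  UNSET ('trace', 'memory') WITH RECONFIGURE;
```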
The SQL trace collects information about all SQL statements executed on the XS Engine and saves it in a trace
file for further analysis. It is inactive by default.
Information collected by the SQL trace includes overall execution time of each statement, the number of
records affected, potential errors (for example, unique constraint violations) that were reported, the database
connection being used, and so on. So the SQL trace is a good starting point for understanding executed
statements and their potential effect on the overall application and system performance, as well as for
identifying potential performance bottlenecks at statement level.
You can view SQL trace files under the SQL Trace node in the Trace Files tree on the left of the Trace screen.
You activate and configure the SQL trace in the SQL Trace section on the Trace Configuration tab. Since the SAP
HANA Web-based Development Workbench runs on the XS Engine, you need to enable the SQL trace
specifically for the XS Engine:
Note
Writing SQL trace files can impact database performance significantly. They also consume storage space
on the disk. Therefore, it is not recommended that you leave the SQL trace enabled all the time.
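Enabling and disabling the SQL trace for the XS Engine can also be done with SQL. A minimal sketch, assuming system-wide configuration (the 'SYSTEM' layer):

```sql
-- Switch the SQL trace on for the XS Engine:
ALTER SYSTEM ALTER CONFIGURATION ('xsengine.ini', 'SYSTEM')
  SET ('sqltrace', 'trace') = 'on' WITH RECONFIGURE;

-- Switch it off again when the analysis is finished:
ALTER SYSTEM ALTER CONFIGURATION ('xsengine.ini', 'SYSTEM')
  SET ('sqltrace', 'trace') = 'off' WITH RECONFIGURE;
```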
The plan trace enables you to collect SQL queries and their execution plans, executed in a given time frame for
a particular application session. For each SQL query that has been traced, you can drill down the specific
execution plan in order to analyze its performance.
Note
As of SPS 10, only 'SELECT' statements are traced with the plan trace.
Prerequisites
1. Open the Trace Configuration tab, if it is not already open, by choosing (Configuration) in the toolbar.
2. Switch to the catalog and execute some SELECT queries in the SQL console.
3. Deactivate the trace tool. Repeat the procedure described above for activating the trace, but make sure that the Inactive radio button is selected.
Drilldown
To drill down into further details, you have the following options:
● Save the traced plan list by downloading it. A trace_log.zip file will be downloaded to your PC, which
you can investigate further using the SAP HANA studio.
● Show detailed plan information. To do this, select a plan and choose (Show Plan). The Plan Analysis
page opens.
To develop applications using SAP HANA Extended Application Services (SAP HANA XS), you use a
hierarchical package structure to organize the design-time artifacts that make up your applications. Within this
structure you apply application descriptors to define which application content is to be exposed and to control
access to it.
1. To set up a package hierarchy, you create a root package for your application-development activities, and
within this package you create additional subpackages to organize the applications and the application
content. All artifacts are stored in the SAP HANA repository.
To avoid conflicts with applications from SAP or other providers, we recommend that you use your
company’s dedicated name space (DNS) as the name of your root application development folder, for
example, com.acme.
2. To expose and control access to application content, you create application descriptors. These are files
that define the following:
○ The root point in the package hierarchy from which content can be served to client requests
○ Whether the application is permitted to expose data to client requests and what kind of access to the
data is allowed
3. To secure the application, you decide how to grant access to the applications you develop. For example,
you specify which authentication method, if any, is to be used to grant access to content exposed by an
application, and what content is visible.
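The application descriptors mentioned in step 2 are plain files in the package: .xsapp (an empty file that marks the package as the application root) and .xsaccess. As a minimal sketch of an .xsaccess file (the settings shown are illustrative examples, not a complete reference):

```json
{
    "exposed": true,
    "authentication": {
        "method": "Form"
    }
}
```

Here "exposed": true permits the package content to be served to client requests, and the authentication block selects form-based logon for access to the exposed content.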
Related Information
In this tutorial, you use the SAP HANA Web-based Development Workbench Editor to develop an SAPUI5
application that displays a button in a browser window which fades out when clicked.
Prerequisites
You have the privileges granted by the role sap.hana.ide.roles::EditorDeveloper; this role is included in the
parent role sap.hana.ide.roles::Developer.
You use the SAPUI5 Hello World template to create an application that consists of the following files: index.html, .xsapp, and .xsaccess.
Procedure
The system creates the index.html, .xsaccess, and .xsapp files, and automatically opens the
index.html file.
Related Information
In this tutorial, you use the SAP HANA Web-based Development Workbench Editor to create and debug a
server-side JavaScript application. The application reads two values that you enter as URL parameters and
displays the result immediately in the browser window.
Prerequisites
● You have the privileges granted by the role sap.hana.ide.roles::EditorDeveloper; this role is included in the
parent role sap.hana.ide.roles::Developer.
● You have been assigned the user role sap.hana.xs.debugger::Debugger.
● Your SAP HANA administrator has enabled debugging in the SAP HANA system.
Procedure
b. Close the dialog box and choose (Resume (F10)) to finish the debugger session.
c. Switch to the application window to confirm that the change you made is reflected in the displayed
result.
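The server-side logic that this tutorial produces can be pictured with a short sketch. The parameter names (a and b) and the minimal mock of the XSJS $ API are illustrative assumptions (the real $ object is provided by the XS engine); the mock only makes the logic runnable outside SAP HANA:

```javascript
// Minimal stand-in for the XSJS $ API so the logic can run outside SAP HANA.
// In a real .xsjs file, the XS engine provides $ and the parameter values
// come from the request URL, e.g. .../calc.xsjs?a=3&b=4 (names are hypothetical).
const $ = {
  request: { parameters: { get: name => ({ a: "3", b: "4" }[name]) } },
  response: { contentType: "", setBody(body) { this.body = body; } }
};

// Read the two URL parameters, compute the result, and write the response.
const a = parseFloat($.request.parameters.get("a"));
const b = parseFloat($.request.parameters.get("b"));
$.response.contentType = "text/plain";
$.response.setBody("Result: " + (a + b));

console.log($.response.body); // Result: 7
```

Changing a parameter value in the mock (or, on SAP HANA, in the URL) changes the displayed result immediately, which is what you observe in the debugging steps above.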
Related Information
All content delivered as part of the application you develop for SAP HANA is stored in packages in the SAP
HANA repository. The packages are arranged in a hierarchy that you define to help make the process of
maintaining the packages transparent and logical.
Context
To perform the high-level tasks that typically occur during the process of maintaining repository packages, you
need to be familiar with the concepts of packages and package hierarchies. Packages enable you to group
together the artifacts you create and maintain for your applications. You must also be aware of the privileges
the application developers require to access (and perform operations on) the packages.
Procedure
Related Information
You can create a package hierarchy, for example, by establishing a parent-child type relationship between
packages. The assignment of packages to delivery units is independent of the package hierarchy; packages in a
hierarchy can belong to different delivery units. The package hierarchy for a new project typically includes
sub-packages, for example, to isolate the data model from the business logic. Although there are no package
interfaces to enforce visibility of objects across packages, this separation of logical layers of development is
still a recommended best practice.
The following simple example shows a package structure containing tutorials for the use of a new application:
sap
   hana
      app1
         code
         demos
         docs
            tutorials
            manuals
            help
● Package hierarchy
Each vendor uses a dedicated namespace, for example, com.acme.
● Package type
Some packages contain content; other packages contain only other (sub)packages. Packages can also
contain both objects and (sub)packages.
● Package naming conventions
There are recommendations and restrictions regarding package names.
All content delivered by SAP should be in a sub-package of "sap". Partners and customers should choose their
own root package to reflect their own name (for example, the domain name associated with the company) and
must not create packages or objects under the "sap" root structural package. This rule ensures that customer-
or partner-created content will not be overwritten by an SAP update or patch.
Note
SAP reserves the right to deliver, without notification, changes in packages and models below the "sap" root
structural package.
There are no system mechanisms for enforcing the package hierarchy. The "sap" root structural package is not
automatically protected. However, by default you cannot change the content of packages that did not originate
in the system. In addition, an authorization concept exists, which enables you to control who can change what
inside packages.
In the SAP HANA repository, you can set package authorizations for a specific user or for a role.
Context
Authorizations that are assigned to a repository package are implicitly assigned to all sub-packages, too. You
can also specify if the assigned user authorizations can be passed on to other users.
Procedure
5. Choose the (Add) button and enter all or part of the package name to locate the repository package.
6. Select the relevant package from the list of matching items and choose OK.
The selected package is added to the table.
7. In the table, select the package you just added and in the privileges list select the required privileges, for
example:
a. REPO.READ
Read access to the selected package and design-time objects (both native and imported)
b. REPO.EDIT_NATIVE_OBJECTS
Authorization to modify design-time objects in packages originating in the system the user is working
in
c. REPO.ACTIVATE_NATIVE_OBJECTS
Authorization to activate/reactivate design-time objects in packages originating in the system the user
is working in
d. REPO.MAINTAIN_NATIVE_PACKAGES
Authorization to update or delete native packages, or create sub-packages of packages originating in
the system in which the user is working
8. Save your changes.
Privileges granted on a repository package are implicitly assigned to the design-time objects in the package, as
well as to all sub-packages. Users are only allowed to maintain objects in a repository package if they have the
necessary privileges for the package in which they want to perform an operation, for example to read or write
to an object in that package. To be able to perform operations in all packages, a user must have privileges on
the root package .REPO_PACKAGE_ROOT.
If the user authorization check establishes that a user does not have the necessary privileges to perform the
requested operation in a specific package, the authorization check is repeated on the parent package and
recursively up the package hierarchy to the root level of the repository. If the user does not have the necessary
privileges for any of the packages in the hierarchy chain, the authorization check fails and the user is not
permitted to perform the requested operation.
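The recursive authorization check described above can be sketched as follows. The grants table and helper function are illustrative, not the repository's actual implementation:

```javascript
// Hypothetical grants: privilege -> packages on which it is granted.
const grants = {
  "REPO.READ": ["com.acme"],
  "REPO.EDIT_NATIVE_OBJECTS": ["com.acme.app1.code"]
};

// Check the package itself, then each parent in turn, up to the root.
function hasPrivilege(privilege, pkg) {
  const granted = grants[privilege] || [];
  for (let p = pkg; p; p = p.includes(".") ? p.slice(0, p.lastIndexOf(".")) : "") {
    if (granted.includes(p)) return true;
  }
  return false; // no package in the hierarchy chain grants the privilege
}

console.log(hasPrivilege("REPO.READ", "com.acme.app1.docs"));           // true (inherited from com.acme)
console.log(hasPrivilege("REPO.EDIT_NATIVE_OBJECTS", "com.acme.app1")); // false (granted only further down)
```

Note that privileges propagate down the hierarchy, never up: a grant on com.acme.app1.code does not authorize operations on com.acme.app1.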
In the context of repository package authorizations, there is a distinction between native packages and
imported packages.
● Native package
A package that is created in the current system and expected to be edited in the current system. Changes
to packages or to objects the packages contain must be performed in the original development system
where they were created and transported into subsequent systems. The content of native packages is
regularly edited by developers.
● Imported package
A package that is created in a remote system and imported into the current system. Imported packages
should not usually be modified, except when replaced by new imports during an update. Otherwise,
imported packages or their contents should only be modified in exceptional cases, for example, to carry
out emergency repairs.
Note
The SAP HANA administrator can grant the following package privileges to an SAP HANA user: edit,
activate, and maintain.
Related Information
Note
To be able to perform operations in all packages in the SAP HANA repository, a user must have privileges on
the root package .REPO_PACKAGE_ROOT.
A native repository package is created in the current SAP HANA system and expected to be edited in the
current system. To perform application-development tasks on native packages in the SAP HANA repository,
developers typically need the privileges listed in the following table:
An imported repository package is created in a remote SAP HANA system and imported into the current
system. To perform application-development tasks on imported packages in the SAP HANA repository,
developers need the privileges listed in the following table:
Note
It is not recommended to work on imported packages. Imported packages should only be modified in
exceptional cases, for example, to carry out emergency repairs.
Related Information
In SAP HANA, a package contains a selection of repository objects. You assemble a collection of packages into
a delivery unit, which you can use to transport the repository objects between SAP HANA systems.
Context
You can use repository packages to manage the various elements of your application development project in
the SAP HANA repository.
Procedure
3. From the context menu of the Content folder (or a folder of your choice), choose New > Package.
The Create Package dialog box appears.
4. Maintain the package details.
a. Enter a name for the new package.
The package name is mandatory. Note that guidelines and conventions apply to package names.
b. Fill in the other optional information as required:
○ Enter a package description.
○ Assign responsibility for the package to a specific user.
Note
If omitted, the user responsible for a new package is, by default, the user who created it.
Next Steps
To specify the delivery unit that a package is to be assigned to, select the package and, from the context menu,
choose Delivery Unit > Assign. Then enter the name of the delivery unit and choose Assign.
Related Information
In SAP HANA, a package typically consists of a collection of repository objects, which can be transported
between systems. Multiple packages can be combined in a delivery unit (DU).
An SAP HANA package specifies a namespace in which the repository objects exist. Every repository object is
assigned to a package, and each package must be assigned to a specific delivery unit. In the repository, each
object is uniquely identified by a combination of the following information:
● Package name
● Object name
● Object type
Note
Multiple objects of the same type can have the same object name if they belong to different packages.
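The unique identification described above is commonly written with the :: notation used elsewhere in this guide (package path, two colons, object base name). A small sketch, with a hypothetical helper name:

```javascript
// Build the repository address of an object from its package path and base name.
function objectId(packagePath, objectName) {
  return packagePath + "::" + objectName;
}

console.log(objectId("sap.test.hana.db", "myObject")); // sap.test.hana.db::myObject
```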
Before you start the package development process, consider the following important points:
● Package hierarchy
Each vendor uses a dedicated namespace, and the package hierarchy you create enables you to store the
various elements of an application in a logical order that is easy to navigate.
● Package type
Packages can be structural or non-structural; some packages contain content; other packages contain
only other (sub)packages.
● Package naming conventions
There are recommendations and restrictions regarding package names, for example, the name's maximum
length and which characters must not be used.
● Permitted characters
Lowercase and uppercase letters (a-z, A-Z), digits (0-9), hyphens (-), underscores (_), and dots (.) are permitted in
package names. Dots in a package name define a logical hierarchy. For example, "a.b.c" specifies a package
"a" that contains sub-package "b", which in turn contains sub-package "c".
● Forbidden characters
A package name must not start with either a dot (.) or a hyphen (-) and cannot contain two or more
consecutive dots (..).
● Package name length
The name of the complete package namespace hierarchy (for example, “aa.bb.cc.zz” including dots) must
not be more than 190 characters long. In addition, on object activation, the maximum permitted length of a
generated catalog name (which includes the package path, the separating dots, and the object base name)
is restricted to 127 characters.
○ hdbtable, hdbview, hdbsequence, hdbstructure, and hdbprocedure objects
sap.test.hana.db::myObject
○ CDS objects
sap.test.hana.db::myContext.myEntity
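As an illustration, the naming rules listed above can be checked with a short script. The function name is hypothetical, and the sketch covers only the rules stated here (not the 127-character limit on generated catalog names, which also depends on the object base name):

```javascript
// Validate a package name against the documented rules.
function isValidPackageName(name) {
  if (name.length > 190) return false;      // full hierarchy, including dots
  if (/^[.-]/.test(name)) return false;     // must not start with a dot or hyphen
  if (name.includes("..")) return false;    // no consecutive dots
  return /^[A-Za-z0-9_.-]+$/.test(name);    // permitted characters only
}

console.log(isValidPackageName("com.acme.app1")); // true
console.log(isValidPackageName(".hidden"));       // false (starts with a dot)
console.log(isValidPackageName("a..b"));          // false (consecutive dots)
```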
SAP HANA enables the use of various types of package, which are intended for use in particular scenarios.
SAP HANA Extended Application Services (SAP HANA XS) provides or allows the following package types:
● Structural
Package only contains sub-packages; it cannot contain repository objects.
● Non-Structural
Package contains repository objects and can also contain (sub)packages.
● sap
Transportable package reserved for content delivered by SAP. Partners and customers must not use the
sap package; they must create and use their own root package to avoid conflicts with software delivered by
SAP, for example when SAP updates or overwrites the sap package structure during an update or patch
process.
● system-local
Non-transportable, structural packages (and subpackages). Content in this package (and any
subpackages) is considered system local and cannot be transported. This is similar to the concept of the
$tmp development class in SAP NetWeaver ABAP.
● system-local.generated
Non-transportable, structural packages for generated content, that is, content not created by manual user
interaction.
In SAP HANA development, repository packages are used to manage various elements of your application
development project. Sometimes you need to delete a package that contains repository objects from other
developers.
Prerequisites
Procedure
Related Information
The application descriptors describe the framework in which an SAP HANA XS application runs. The
framework defined by the application descriptors includes the root point in the package hierarchy where
content is to be served to client requests, and who has access to the content.
Prerequisites
● You must be familiar with the concept of the application descriptor file (.xsapp), the application-access
file (.xsaccess), and if required, the application-privileges file (.xsprivileges).
Context
When you develop and deploy applications in the context of SAP HANA Extended Application Services (SAP
HANA XS), you must define the application descriptors. Maintaining the application descriptors involves the
following tasks:
Procedure
Each application that you want to develop and deploy on SAP HANA Extended Application Services (SAP HANA
XS) must have an application-descriptor (.xsapp) file.
Prerequisites
Context
This file is the core file that you use to indicate an application's availability within SAP HANA XS. It marks the
point in the package hierarchy at which an application's content is available, that is, the package that contains
this file becomes the root path of the resources exposed by the application you develop.
Note that the .xsapp file has no content except the set of curly brackets {} and no name; it only has the file
extension .xsapp.
Tip
When creating applications, remember that it is not necessary to manually create the application
descriptors. Choose an appropriate application template or the empty application template and these files
will be created automatically.
Procedure
2. Select the package where you want to create the new file and, from the context menu, choose New > File.
3. Enter the file name .xsapp and choose Create.
Related Information
Each application that you want to develop and deploy on SAP HANA Extended Application Services (SAP HANA
XS) must have an application descriptor file. The application descriptor is the core file that you use to describe
an application's framework within SAP HANA XS.
The package that contains the application descriptor file becomes the root path of the resources exposed to
client requests by the application you develop.
Note
The application-descriptor file has no name and no content; it only has the file extension “xsapp”, for
example, .xsapp. For backward compatibility, content is allowed in the .xsapp file but ignored.
The application root is determined by the package containing the .xsapp file. For example, if the package
sap.test contains the file .xsapp, the application will be available under the URL http://<host>:<port>/
sap.test/. Application content is available to requests from users.
Caution
Make sure that the folder containing the .xsapp application descriptor file also contains an .xsaccess
file, which controls access to the application.
The contents of the package where the .xsapp file resides (and any subfolders) are exposed to user requests
and, as a result, potentially reachable by attackers. You can protect this content with the appropriate
authentication settings in the corresponding application-access (.xsaccess) file, which resides in the same
package. Bear in mind that by exposing Web content, you run the risk of leaking information; the leaked
information can be used in the following ways:
● Directly
Data files such as .csv files used for the initial database load can contain confidential information.
● Indirectly
File descriptors can give details about the internal coding of the application, and files that contain the
names of developers are useful; they can be used by an attacker in combination with social-engineering
techniques.
To help protect your application from security-related issues, place the application descriptor (.xsapp) as
deep as possible in the package hierarchy. In addition, include only the index page in this package; all other
application data should be placed in sub-folders that are protected with individual application-access files.
Keep the application package hierarchy clean. Do not place any unnecessary content, for example, files which
are not required for the application to work, in the same package as the .xsapp file (or in any sub-package).
Related Information
The application-access (.xsaccess) file enables you to specify who or what is authorized to access the
content exposed by the application package and what content they are allowed to see.
Prerequisites
Context
You can use a set of keywords in the application-access file to specify if authentication is required to enable
access to package content, which data is exposed, and if rewrite rules are in place to hide target and source
URLs, for example, from users and search engines. You can also specify what, if any, level of authorization is
required for the package and whether SSL is mandatory for client connections.
The application-access file does not have a name before the dot (.); it only has the file extension .xsaccess.
The contents of the .xsaccess file must be formatted according to JavaScript Object Notation (JSON) rules.
Procedure
{
"exposed" : true
}
{
"authentication" : { "method" : "Form"}
}
Note
Provided the Public security option is disabled (default setting) in the SAP HANA XS Administration
Tool, both the form-based authentication and basic authentication options are automatically enabled.
You can use the SAP HANA XS Administration Tool to configure applications to use additional
authentication methods, for example, logon tickets, or Single Sign On (SSO) providers such as SAML2
and X509.
{
"authorization":
["<package.path>::Execute",
"<package.path>::Admin"
]
}
7. Save the file in the package with which you want to associate the rules you have defined.
Related Information
SAP HANA XS enables you to define access to each individual application package that you want to develop
and deploy.
The application-access file enables you to specify who or what is authorized to access the content exposed by
a SAP HANA XS application package and what content they are allowed to see. For example, you use the
application-access file to specify if authentication is to be used to check access to package content and if
rewrite rules are in place that hide or expose target and source URLs.
The application-access file does not have a name; it only has the file extension .xsaccess. The content of
the .xsaccess file is formatted according to JSON rules, and the settings specified in an .xsaccess file apply
not only to the package the .xsaccess file belongs to but also any subpackages lower in the package
hierarchy. Multiple .xsaccess files are allowed, but only at different levels in the package hierarchy. You
cannot place two .xsaccess files in the same package.
Note
The settings specified in an .xsaccess file in a subpackage take precedence over any settings specified in
a .xsaccess file higher up the package hierarchy; the subpackage settings are also inherited by any
packages further down the package hierarchy. Any settings not modified by the .xsaccess in the
subpackage remain unchanged, that is: as defined in the parent package or, where applicable, the default
settings.
Using multiple .xsaccess files enables you to specify different application-access rules for individual
subpackages in the package hierarchy. Following the inheritance rule, any applications below the application
package containing the modified access settings inherit the new, modified settings.
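The inheritance and precedence rules described above can be sketched as a simple merge from the top of the package hierarchy down. The package names, settings, and merge helper are illustrative assumptions:

```javascript
// Defaults apply where no .xsaccess file defines a setting.
const defaults = { exposed: false, force_ssl: false };

// Hypothetical .xsaccess files at two levels of the hierarchy.
const xsaccessFiles = {
  "sap.test":      { exposed: true, force_ssl: true },
  "sap.test.docs": { exposed: false } // force_ssl is not redefined, so it is inherited
};

function effectiveSettings(pkg) {
  const settings = { ...defaults };
  const parts = pkg.split(".");
  // Apply each .xsaccess file from the root of the hierarchy down to pkg;
  // later (deeper) files override earlier ones.
  for (let i = 1; i <= parts.length; i++) {
    const prefix = parts.slice(0, i).join(".");
    Object.assign(settings, xsaccessFiles[prefix] || {});
  }
  return settings;
}

console.log(effectiveSettings("sap.test.docs.tutorials")); // exposed: false, force_ssl: true
```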
The following example shows the composition and structure of the SAP HANA XS application access
(.xsaccess) file, which comprises a list of key-value pairs that specify how the application service responds to
client requests. For example, in this file, "exposed" : true indicates that data is available to client requests;
"force_ssl" : true specifies that standard HTTP requests are not allowed by the Web browser.
Note
Some elements can also be specified in the application's runtime configuration, for example, using the SAP
HANA XS Administration Tool. For example, you can configure applications to refuse insecure HTTP
connections, allow the use of e-tags, or enable additional authentication methods such as Single Sign On
(SSO) providers SAML2 and X509.
Example:
The Application-Access (.xsaccess) File
{
"exposed" : true, // Expose data via http
"authentication" :
{
"method": "Form"
},
Related Information
The application-access (.xsaccess) file enables you to specify whether or not to expose package content,
which authentication method is used to grant access, and what content is visible.
Example:
The Application Access (.xsaccess) File
Note
This example of the .xsaccess file is not a working model; it is used to illustrate the syntax for all possible
options.
{
"exposed" : false,
"authentication" :
{
"method": "Form"
},
"authorization":
[
"sap.xse.test::Execute",
"sap.xse.test::Admin"
],
"anonymous_connection" : "sap.hana.sqlcon::AnonConn",
"default_connection" : "sap.hana.sqlcon::sqlcc",
"cache_control" : "no-store",
"cors" :
{
"enabled" : false
},
"default_file" : "index_1.html",
"enable_etags" : false,
"force_ssl" : true,
"mime_mapping" :
[
{
"extension":"jpg", "mimetype":"image/jpeg"
}
],
"prevent_xsrf" : true,
"rewrite_rules" :
[{
"source" : "...",
"target" : "..."
}],
"headers":
{
"enabled": true,
"customHeaders": [ {"name":"X-Frame-Options","value":"<VALUE>"} ]
}
}
{
"exposed" : false,
}
The exposed keyword enables you to define if content in a package (and its subpackages) is to be made available
by HTTP to client requests. Values are Boolean true or false. If no value is set for exposed, the default setting
(false) applies.
Tip
Only expose content that is absolutely necessary to enable the application to run.
Consider whether it is necessary to expose data via HTTP/S. Not exposing data via HTTP enables you to keep
your files accessible to other programs but prevent direct access to the data via URL. Since the application's
index.html page must normally remain reachable, consider storing the index.html file separately with a
dedicated .xsaccess file that enables access (“exposed”: true). You can keep all other content hidden, for
example, in a separate package to which access is denied (“exposed”: false).
Packages without a dedicated .xsaccess file inherit the application-access settings defined in the parent
folder. If an .xsaccess file exists but the exposed keyword is not defined, the default setting false applies.
anonymous_connection
{
"anonymous_connection" : "sap.hana.sqlcon::AnonConn",
}
The anonymous_connection keyword enables you to define the name of the .xssqlcc file that will be used
for SQL access when no user credentials are provided. SAP HANA XS enables you to define the configuration
for individual SQL connections. Each connection configuration has a unique name, for example, Registration,
AnonConn, or AdminConn, which is generated from the name of the corresponding connection-configuration
file (Registration.xssqlcc, AnonConn.xssqlcc, or AdminConn.xssqlcc) on activation in the repository.
If no value is set, the default setting is “null”.
Tip
If it is necessary to provide anonymous access to an application, design your application in such a way that all
files requiring anonymous access are placed together in the same package, which you can then protect with
the permissions defined in a dedicated .xsaccess file. Remember that the behavior of the anonymous
connection depends on the details specified in the corresponding SQL configuration file (.xssqlcc).
{
"default_connection" : "sap.hana.sqlcon::sqlcc",
}
If the default_connection is set in the .xsaccess file, the specified SQL connection configuration (for
example, defined in sqlcc) is used for all SQL executions in this package, whether or not the requesting user is
authenticated in SAP HANA. The difference between the default_connection and the
anonymous_connection is that the anonymous SQL connection configuration is only used if the requesting
user is not authenticated. Like any other property of the xsaccess file, the default_connection is inherited
down the package hierarchy, for example, from package to subpackage. The default_connection can also
be overwritten, for example, by locating an xsaccess file with a different default_connection in one or
more subpackages.
Tip
If the requesting user is authenticated, the user name will be available in the connection as the
APPLICATIONUSER session variable.
The credentials to use for an SQL execution are determined according to the following order of priority:
1. The SQL connection configuration (SQLCC) specified in $.db.getConnection(sqlcc); this applies only
in XS JavaScript (not OData, for example)
2. The value specified in default_connection (if set)
3. An authenticated user
4. The value specified in anonymous_connection (if set)
The default_connection is intended for use with anonymous parts of the application that require the same
privileges for all users. If the anonymous part of an application is designed to behave according to the privileges
granted to authenticated users, the anonymous_connection should be used. This is particularly important if
analytic privileges are involved, for example, to restrict the amount of returned rows (not overall access to the
table). In most cases, the default_connection should be used.
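The order of priority above can be expressed as a short sketch. The context structure and helper are illustrative; the actual resolution is performed internally by the XS engine:

```javascript
// Resolve which credentials an SQL execution uses, in documented priority order.
function resolveCredentials(ctx) {
  if (ctx.sqlcc) return { kind: "sqlcc", name: ctx.sqlcc };  // 1. $.db.getConnection(sqlcc), XS JavaScript only
  if (ctx.defaultConnection) return { kind: "default_connection", name: ctx.defaultConnection }; // 2.
  if (ctx.authenticatedUser) return { kind: "user", name: ctx.authenticatedUser };               // 3.
  if (ctx.anonymousConnection) return { kind: "anonymous_connection", name: ctx.anonymousConnection }; // 4.
  return null; // no credentials available
}

// An authenticated user takes precedence over the anonymous connection.
console.log(resolveCredentials({
  authenticatedUser: "ANNA",
  anonymousConnection: "sap.hana.sqlcon::AnonConn"
}).kind); // user
```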
authentication
{
"authentication" :
{
"method": "Form"
},
}
The authentication keyword is required in the .xsaccess file and must be set to the value "Form", for
example, "method" : "Form", to ensure that form-based logon works when you enable it using the SAP
HANA XS Administration Tool.
Use the SAP HANA XS Administration Tool to configure applications to use additional authentication
methods, for example, basic, logon tickets, or Single Sign On (SSO) providers such as SAML2 and X509.
You must also enable the Form-based authentication checkbox if you want your application (or
applications) to use form-based logon as the authentication method. Any other keywords in the
authentication section of the .xsaccess file are ignored.
● Form-based authentication
Redirect the logon request to a form to fill in, for example, a Web page.
To ensure that, during the authentication process, the password is transmitted in encrypted form, it is
strongly recommended to enable SSL/HTTPS for all application connections to the XS engine, for
example, using the force_ssl keyword. If you set the force_ssl option, you must ensure that the SAP Web
Dispatcher is configured to accept and manage HTTPS requests.
Form-based authentication requires the libxsauthenticator library, which must not only be available
but also be specified in the list of trusted applications in the xsengine application container. The application
list is displayed in the SAP HANA studio's Administration Console perspective in the following location:
Administration > Configuration tab > xsengine.ini > application_container > application_list. If it is not
displayed, ask the SAP HANA administrator to add it.
Note
If you need to troubleshoot problems with form-based logon, you can configure the generation of
useful trace information in the XSENGINE section of the database trace component using the following
entry: xsa:sap.hana.xs.formlogin.
authorization
{
"authorization":
[
"sap.xse.test::Execute",
"sap.xse.test::Admin"
],
}
The authorization keyword in the .xsaccess file enables you to specify which authorization level is
required for access to a particular application package, for example, execute or admin on the package
sap.xse.test.
Note
The authorization levels you can choose from are defined in the .xsprivileges file for the package, for
example, "execute" for basic privileges, or "admin" for administrative privileges on the specified package. If
you do not define any authorization requirements, any user can launch the application.
If you use the authorization keyword in the .xsaccess file, for example, to require “execute” privileges for
a specific application package, you must create a .xsprivileges file for the same application package (or a
parent package higher up the hierarchy), in which you define the “execute” privilege level declared in
the .xsaccess file.
{
"authorization": null
}
Bear in mind that the “authorization”:null setting applies not only to the package in which
the .xsaccess with the null setting is located but also to any subpackages further down the package
hierarchy. You can re-enable authorization at subpackage level by creating a new .xsaccess file.
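For reference, a .xsprivileges file defining the privilege levels referred to in this section might look like the following sketch; the descriptions are illustrative:

```json
{
    "privileges" : [
        { "name" : "Execute", "description" : "Basic execution privilege" },
        { "name" : "Admin", "description" : "Administration privilege" }
    ]
}
```

The names defined here ("Execute", "Admin") are what the authorization keyword references as <package.path>::Execute and <package.path>::Admin.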
cache_control
{
"cache_control":"no-store",
}
The cache_control keyword enables you to override the cache-control header for Web content served by the
SAP HANA XS Web server. So-called cache-control directives (for example, public, private, no-store)
enable you to control the behavior of the Web browser and proxy caches, for example, whether or not to store a
page, how to store it, or where. For more information about the values you can use to set cache_control, see
the HTTP standard for cache-control directives. If no value for cache_control is set in the .xsaccess file, the
default setting is “null”.
Tip
For security reasons, it is recommended to set the cache_control keyword to “no-cache, no-store”.
However, if nothing is cached or stored, there is an obvious impact on application performance.
If application performance allows, the no-cache, no-store setting is advisable.
cors
{
"cors" :
{
"enabled" : false
},
}
The cors keyword enables you to provide support for cross-origin requests, for example, by allowing the
modification of the request header. Cross-origin resource sharing (CORS) permits Web pages from other
domains to make HTTP requests to your application domain, where normally such requests would
automatically be refused by the Web browser's security policy.
If CORS support is disabled ("enabled" : false), the following settings apply on the Web server:
To enable support for CORS, set the cors keyword to {"enabled":true}, which results in the following
default cors configuration:
{"cors":{"enabled":true,"allowMethods":
["GET","POST","HEAD","OPTIONS"],"allowOrigin":["*"],"maxAge":"3600"}}
The following table describes the options that are supported with the cors keyword:
{"cors":{"enabled":true, "allowMethods":<ALLOWED_METHODS>,
"allowOrigin":<ALLOWED_ORIGIN>,
"maxAge":<MAX_AGE>, "allowHeaders":<ALLOWED_HEADERS>,
"exposeHeaders":<EXPOSED_HEADERS>}}
ALLOWED_METHODS: A single permitted method or a comma-separated list of methods that are allowed by the
server, for example, "GET", "POST". If allowMethods is defined but no method is specified, the default
"GET", "POST", "HEAD", "OPTIONS" (all) applies. Note that matching is case-sensitive.
ALLOWED_ORIGIN: A single host name or a comma-separated list of host names that are allowed by the
server, for example: www.sap.com or *.sap.com. If allowOrigin is defined but no host is specified, the
default "*" (all) applies. Note that matching is case-sensitive.
ALLOWED_HEADERS: A single header or a comma-separated list of request headers that are allowed by the
server. If allowHeaders is defined but no header is specified as allowed, no default value is supplied.
MAX_AGE: A single value specifying how long a preflight request can be cached. If maxAge is defined but
no value is specified, the default time of "3600" (seconds) applies.
EXPOSED_HEADERS: A single header or a comma-separated list of response headers that are allowed to be
exposed. If exposeHeaders is defined but no response header is specified for exposure, no default value
is supplied.
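For example, a cors configuration that accepts only non-modifying requests from a single domain could be written as follows (the host name is illustrative):

```json
{
  "cors" :
  {
    "enabled" : true,
    "allowMethods" : ["GET", "HEAD", "OPTIONS"],
    "allowOrigin" : ["*.example.com"],
    "maxAge" : "3600"
  }
}
```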
default_file
{
"default_file" : "new_index.html",
}
The default_file keyword enables you to override the default setting for application access (index.html) when
the package is accessed without providing a file in the URI. If you use the default_file keyword but do not
specify a value, the default setting "index.html" is assumed.
Tip
It is good practice to specify a default file name manually. Changing the default from index.html to
something else can help make your application less vulnerable to automated hacker tools.
rewrite_rules
{
"rewrite_rules" :
[{
"source": "...",
"target": "..."
}],
}
The rewrite_rules keyword enables you to hide the details of internal URL paths from external users, clients,
and search engines. Any rules specified affect the local application where the .xsaccess file resides (and any
subpackage, assuming the subpackages do not have their own .xsaccess files); it is not possible to define
global rewrite rules. URL rewrite rules are specified as a source-target pair where the source is written in the
JavaScript regex syntax and the target is a simple string where references to found groups can be inserted
using $groupnumber.
Tip
In the following example, the rule illustrated hides the filename parameter and, as a result, makes it harder to
guess that the parameter provided after /go/ will be used as a filename value. Note that it is still necessary to
validate the received input.
{
  "rewrite_rules" :
  [{
    "source": "/go/(\\w+)",
    "target": "/logic/getfile.xsjs?filename=$1"
  }]
}
mime_mapping
{
"mime_mapping" :
[
{
"extension":"jpg", "mimetype":"image/jpeg"
}
],
}
The mime_mapping keyword enables you to define how to map certain file suffixes to required MIME types. For
example, you can map files with the .jpg file extension to the MIME type image/jpeg.
The list you define with the mime_mapping keyword supersedes any default mapping defined by the server;
the Web browser uses the information to decide how to process the related file types.
Caution
Make sure you do not instruct the browser to execute files that are not meant to be executed, for example,
by mapping .jpg image files with the MIME type application/javascript.
The default MIME mappings remain valid for any values you do not define with the mime_mapping keyword.
Consider restricting any explicit mappings to file types where the default behavior does not work as expected
or where no default value exists, for example, for file types specific to your application.
force_ssl
{
"force_ssl" : false,
}
The force_ssl keyword enables you to refuse Web browser requests that do not use secure HTTP (SSL/
HTTPS) for client connections. If no value is set for force_ssl, the default setting (false) applies and non-
secured connections (HTTP) are allowed.
Tip
To ensure that, during the authentication process, passwords are transmitted in encrypted form, it is
strongly recommended to enable SSL/HTTPS for all application connections to the XS engine. If you set
the force_ssl option, you must ensure that the SAP Web Dispatcher is configured to accept and manage
HTTPS requests. For more information, see the SAP HANA XS section of the SAP HANA Administration
Guide.
Caution
If a runtime configuration exists for your application, the force_ssl setting in the runtime configuration
supersedes the force_ssl setting in the .xsaccess file.
enable_etags
{
"enable_etags" : true,
}
You can allow or prevent the generation of entity tags (etags) for static Web content using the enable_etags
keyword. If no value is set, the default setting (true) applies, in which case etags are generated. Etags are used
to improve caching performance, for example, so that the same data is not resent from the server if no change
has occurred since the last time a request for the same data was made.
If etags are enabled, the browser sends with each HTTP request the etag retrieved from its cached page. If the
etag from the cached page matches the etag from the server, the server answers with the status code 304 (not
modified) and does not resend the full requested page. Although enabling etags has the positive side-effect of
helping to prevent cache-poisoning attacks, there is no direct security risk associated with disabling etags from
the developer's perspective.
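The exchange described above can be sketched as follows; respond() is a hypothetical helper (not part of the SAP HANA XS API) that decides between resending the page and answering 304:

```javascript
// Minimal sketch of the ETag exchange, from the server's point of view.
// requestEtag comes from the browser's If-None-Match header; currentEtag
// is the etag of the page as it exists on the server right now.
function respond(requestEtag, currentEtag, body) {
  if (requestEtag === currentEtag) {
    // The client's cached copy is still current: answer 304 without a body.
    return { status: 304, body: "" };
  }
  // Cached copy is stale or missing: send the full page with the new etag.
  return { status: 200, body: body, etag: currentEtag };
}
```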
prevent_xsrf
{
"prevent_xsrf" : true,
}
You can use the prevent_xsrf keyword in the .xsaccess file to protect applications from cross-site request-
forgery (XSRF) attacks. An XSRF attack attempts to trick a user into clicking a specific hyperlink that displays a
(usually well-known) Web site and performs actions on the user’s behalf, for example, in a hidden iframe.
If the targeted end user is logged on with an administrator account, the XSRF attack can compromise the
entire Web application. There is no good reason to explicitly set this keyword to false.
Note
It is recommended to enable the prevent_xsrf feature for all applications that are not read-only.
The prevent_xsrf keyword prevents XSRF attacks by ensuring that checks are performed to establish
that a valid security token is available for a given browser session. The existence of a valid security token
confirms that the request originates from the session in which the token was issued.
Note
The default setting is false, which means there is no automatic prevention of XSRF attacks. If no value is
assigned to the prevent_xsrf keyword, the default setting (false) applies.
Setting the prevent_xsrf keyword to true ensures XSRF protection only on the server side. On the client
side, to include the XSRF token in the HTTP headers, you must first fetch the token as part of a GET request, as
illustrated in the following example:
xmlHttp.setRequestHeader("X-CSRF-Token", "Fetch");
You can use the fetched XSRF token in subsequent POST requests, as illustrated in the following code example:
xmlHttp.setRequestHeader("X-CSRF-Token", xsrf_token);
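Putting the two header calls together, the client-side flow might look like the following sketch; fetchXsrfToken and postWithXsrfToken are illustrative helpers (not part of any SAP API), and the xhr argument stands for a browser XMLHttpRequest object:

```javascript
// Sketch: fetch an XSRF token with a GET request, then reuse it in a POST.
// The URLs passed in are illustrative; synchronous requests are used only
// to keep the sketch short.
function fetchXsrfToken(xhr, url) {
  // A GET request with the header "X-CSRF-Token: Fetch" asks the server
  // to return a valid token in the response header of the same name.
  xhr.open("GET", url, false);
  xhr.setRequestHeader("X-CSRF-Token", "Fetch");
  xhr.send();
  return xhr.getResponseHeader("X-CSRF-Token");
}

function postWithXsrfToken(xhr, url, token, body) {
  // The fetched token must accompany every modifying (e.g. POST) request.
  xhr.open("POST", url, false);
  xhr.setRequestHeader("X-CSRF-Token", token);
  xhr.send(body);
}
```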
headers
{
"headers":
{
"enabled": true,
"customHeaders": [ {"name":"X-Frame-Options","value":"<VALUE>"} ]
}
}
Enable support for the X-Frame-Options HTTP header field, which allows the server to instruct the client
browser whether or not to display transmitted content in frames that are part of other Web pages. You can also
enable this setting in the application's corresponding runtime configuration file, for example, using the XS
Administration Tool.
Caution
You can set the X-Frame-Options header to one of the following values:
● DENY
● SAMEORIGIN
● ALLOW-FROM <URL>
You can only specify one URL with the ALLOW-FROM option, for example: "value":"ALLOW-FROM
http://www.site.com".
Note
To allow an application to use custom headers, you must enable the headers section.
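For example, to instruct browsers to display the application's content only in frames that belong to the same origin, the template above could be filled in as follows:

```json
{
  "headers":
  {
    "enabled": true,
    "customHeaders": [ {"name": "X-Frame-Options", "value": "SAMEORIGIN"} ]
  }
}
```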
Rewriting URLs enables you to hide internal URL path details from external users, clients, and search engines.
You define URL rewrite rules in the application-access file (.xsaccess) for each application or for an
application hierarchy (an application package and its subpackages).
The rewrite rules you define in the .xsaccess file apply only to the local application to which the .xsaccess
file belongs; it is not possible to define global rules to rewrite URLs. Rules are specified as a source-target pair
where the source is written in the JavaScript regex syntax, and the target is a simple string where references
to found groups can be inserted using $groupnumber.
The following examples show how to use a simple set of rewrite rules to hide internal URLs from requesting
clients and users.
The first example illustrates the package structure that exists in the repository for a given application; the
structure includes the base package apptest, the subpackages subpackage1 and subpackage2, and several
other subpackages:
sap---apptest
|---logic
| |---users.xsjs
| |---posts.xsjs
|---posts
| |---2011...
|---subpackage1
| |---image.jpg
|---subpackage2
| |---subsubpackage
| | |---secret.txt
| |---script.xsjs
|---subpackage3
| |---internal.file
|---users
| |---123...
|---.xsapp
|---.xsaccess
|---index.html
The application-access file for the package apptest (and its subpackages) includes the following rules for
rewriting URLs used in client requests:
{
"rewrite_rules": [
{
"source": "/users/(\\d+)/",
"target": "/logic/users.xsjs?id=$1"
},
{
"source": "/posts/(\\d+)/(\\d+)/(\\d+)/",
"target": "/logic/posts.xsjs?year=$1&month=$2&day=$3"
}
Assuming we have the package structure and URL rewrite rules illustrated in the previous examples, the
following valid URLs would be exposed; bold URLs require authentication:
/sap/apptest/
/sap/apptest/index.html
/sap/apptest/logic/users.xsjs
/sap/apptest/logic/posts.xsjs
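How such a source-target pair is applied can be sketched with a small helper; applyRule() is illustrative only and not part of the XS runtime, but it mirrors the mechanics of the rules shown above:

```javascript
// Sketch: apply one rewrite rule (source regex, target with $n references)
// to a request path. Returns the rewritten path, or null if the rule does
// not match.
function applyRule(rule, path) {
  const re = new RegExp(rule.source);
  const match = re.exec(path);
  if (!match) return null;
  // Replace $1, $2, ... in the target with the captured groups.
  return rule.target.replace(/\$(\d+)/g, (_, n) => match[Number(n)]);
}
```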
The application-privileges (.xsprivileges) file allows you to define the authorization levels required to
access an application, for example, to start the application or perform administrative actions on an application.
You can assign the application privileges to the individual users who require them.
Prerequisites
Context
The .xsprivileges file must reside in the same application package that you want to define the access
privileges for.
Note
If you use the .xsprivileges file to define application-specific privileges, you must also add a
corresponding entry to the same application's .xsaccess file, for example, using the authorization
keyword.
The application-privileges file does not have a name; it only has the file extension .xsprivileges. The
contents of the .xsprivileges file must be formatted according to JavaScript Object Notation (JSON) rules.
Note
Multiple .xsprivileges files are allowed, but only at different levels in the package hierarchy; you
cannot place two .xsprivileges files in the same application package. The privileges defined in
a .xsprivileges file are bound to the package to which the file belongs and can only be applied to
this package and its subpackages.
{
  "privileges" :
  [
    { "name" : "Execute", "description" : "Basic execution privilege" },
    { "name" : "Admin", "description" : "Administration privilege" }
  ]
}
5. Specify which privileges are required to access the application or application package.
If you use the .xsprivileges file to define application-specific privileges, you must also add a
corresponding entry to the same application's .xsaccess file, for example, using the authorization
keyword.
Note
The .xsprivileges file lists the authorization levels that are available for access to an application
package; the .xsaccess file defines which authorization level is assigned to which application
package.
a. Locate and open the application access file (.xsaccess) for the application for which you want to
define application privileges.
b. Specify the privileges required to access the application or application package.
Use the authorization keyword in the .xsaccess file to specify which authorization level is required by
a user to access a particular application package.
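For example, assuming an Execute privilege defined in an .xsprivileges file in the package my.package (an illustrative package path), the corresponding .xsaccess entry could look like this:

```json
{
  "authorization" : ["my.package::Execute"]
}
```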
call "_SYS_REPO"."GRANT_APPLICATION_PRIVILEGE"('"<package.path>::Execute"','<UserName>')
To revoke the execute application privilege from a user, run the following command in the SQL editor:
call "_SYS_REPO"."REVOKE_APPLICATION_PRIVILEGE"('"<package.path>::Execute"','<UserName>')
Related Information
In SAP HANA Extended Application Services (SAP HANA XS), the application-privileges (.xsprivileges) file
can be used to create or define the authorization privileges required for access to an SAP HANA XS application,
for example, to start the application or to perform administrative actions on an application. These privileges
can be checked by an application at runtime.
The application-privileges file has only the file extension .xsprivileges; it does not have a name and is
formatted according to JSON rules. Multiple .xsprivileges files are allowed, but only at different levels in
the package hierarchy; you cannot place two .xsprivileges files in the same application package. The
package privileges defined in a .xsprivileges file are bound to the package to which the .xsprivileges
file belongs and can only be used in this package and its subpackages.
As an application privilege is created during activation of an .xsprivileges file, the only user who has the
privilege by default is the _SYS_REPO user. To grant or revoke the privilege to (or from) other users you can use
the GRANT_APPLICATION_PRIVILEGE or REVOKE_APPLICATION_PRIVILEGE procedure in the _SYS_REPO
schema.
Note
The .xsprivileges file lists the authorization levels that are available for access to an application
package; the .xsaccess file defines which authorization level is assigned to which application package.
In the example shown below, if the application-privileges file is located in the application package sap.hana.xse,
then the following privileges are created:
● sap.hana.xse::Execute
● sap.hana.xse::Admin
The privileges defined apply to the package where the .xsprivileges file is located as well as any packages
further down the package hierarchy unless an additional .xsprivileges file is present, for example, in a
subpackage. The privileges do not apply to packages that are not in the specified package path, for example,
sap.hana.app1.
Example:
The SAP HANA XS Application-Privileges File
The following example shows the composition and structure of a basic SAP HANA XS application-privileges file.
{
  "privileges" :
  [
    { "name" : "Execute", "description" : "Basic execution privilege" },
    { "name" : "Admin", "description" : "Administration privilege" }
  ]
}
If the .xsprivileges file shown in the example above is located in the package sap.hana.xse, you can
assign the Execute privilege for the package to a particular user by calling the
GRANT_APPLICATION_PRIVILEGE procedure, as illustrated in the following code:
call "_SYS_REPO"."GRANT_APPLICATION_PRIVILEGE"('"sap.hana.xse::Execute"',
'<user>')
The design-time building blocks of SAP HANA applications are called development objects (or artifacts),
and many have a mandatory file extension, for example, .hdbtable (design-time table definition), .hdbview
(design-time SQL-view definition), or .hdbrole (design-time role definition).
Some of the development objects you encounter when creating an application, such as projects and packages,
are designed to help you structure your application. Other objects such as schemas, table definitions, or
analytical and attribute views, help you organize your data. Design-time definitions of procedures and server-
side JavaScript code are the core objects of an SAP HANA application; these, too, have mandatory file
extensions, for example, .hdbprocedure or .xsjs. Other types of development objects help you control the
access to runtime objects.
When you activate an application artifact, the file extension (for example, .hdbdd, .xsjs,
or .hdbprocedure) is used to determine which runtime plug-in to call during the activation process. The plug-
in reads the repository artifact selected for activation (for example, a table definition, a complete CDS
document, or server-side JavaScript code), interprets the object description in the file, and creates the
appropriate runtime object in the designated catalog schema.
The file extensions associated with application artifacts are used in other contexts, too. For example, in SAP
HANA studio, a context-sensitive menu is displayed when you click an artifact with the alternate mouse button;
the options displayed in the menu are determined, amongst other things, according to the file extension.
Related Information
The design-time building blocks of your SAP HANA applications have a mandatory file extension, for
example, .hdbtable (design-time table definition) or .hdbview (design-time SQL-view definition).
In SAP HANA, application artifacts have a mandatory file extension, which is used to determine the Repository
tools required to parse the contents of the design-time artifact on activation. The following tables list the most
commonly used building blocks of an SAP HANA application; the information provided shows any mandatory
file extension and, if appropriate, indicates where to find more information concerning the context in which the
object can be used.
.aflpmml (Procedure): A file used by the application function modeler to store details of a procedure defined
using application functions in the Predictive Analysis Library* (PAL) or Business Function Library* (BFL).
Using the AFM also generates a .diagram and an .aflmodel file.
.analyticview (Analytic view): A file containing a design-time definition of an analytic view; the view can be
referenced in an OData service definition.
.attributeview (Attribute view): A file containing a design-time definition of an attribute view; the view can be
referenced in an OData service definition.
.hdbscalarfunction (Scalar user-defined function): A file containing a design-time definition of a scalar
user-defined function (UDF), which is a custom function that can be called in the SELECT and WHERE clauses
of an SQL statement.
.hdbtablefunction (Table user-defined function): A file containing a design-time definition of a table
user-defined function (UDF), which is a custom function that can be called in the FROM clause of an SQL
statement.
.hdbtextbundle (Resource Bundle): A file for defining translatable UI texts for an application. Used in SAP UI5
applications.
.hdbti (Table Import definition): A table-import configuration that specifies which .csv file is imported into
which table in the SAP HANA system.
.hdbview (SQL View): A design-time definition of a database view, which is a virtual table based on an SQL
query.
.proceduretemplate (Procedure template): A design-time artifact containing a base script with predefined
placeholders for objects such as tables, views, and columns.
.searchruleset (Search Rule Set*): A file that defines a set of rules for use with fuzzy searches. The rules help
decide what is a valid match in a search.
.xsaccess (Application Access File): An application-specific configuration file that defines permissions for a
native SAP HANA application, for example, to manage access to the application and running objects in the
package.
.xshttpdest (HTTP destination configuration): A file that defines details for connections to a remote
destination by HTTP (or HTTPS).
.xsjob (Scheduled XS job): A JSON-compliant file used to define recurring tasks that run in the background
(independent of any HTTP request/response process); a scheduled job can either execute a JavaScript
function or call an SQLScript procedure.
.xsjs (Server-Side JavaScript Code): A file containing JavaScript code that can run in SAP HANA Extended
Application Services and be accessed via URL.
.xsjslib (Server-Side JavaScript Library): A file containing JavaScript code that can run in SAP HANA
Extended Application Services but cannot be accessed via URL. The code can be imported into an .xsjs code
file.
.xsoauthappconfig (OAuth application configuration file): A file describing high-level details of an application
that enables logon to a service running on a remote HTTP destination using OAuth.
.xsoauthclientconfig (OAuth client configuration file): A file containing detailed information about a client
application that uses OAuth as the authentication mechanism for logon to a remote HTTP destination.
.xsoauthclientflavor (OAuth client flavor file): The corresponding OAuth flavors file for the OAuth client
configuration.
.xsodata (OData Descriptor): A design-time object that defines an OData service that exposes SAP HANA
data from a specified end point.
.xsprivileges (Application Privilege): A file that defines a privilege that can be assigned to an SAP HANA
Extended Application Services application, for example, the right to start or administer the application.
.xssecurestore (Application secure store): The design-time file that creates an application-specific secure
store; the store is used by the application to store data safely and securely in name-value form.
.xssqlcc (SQL Connection Configuration): A file that enables execution of SQL statements from inside
server-side JavaScript code with credentials that are different from those of the requesting user.
.xswidget (Widget): A file that defines a standalone SAP HANA application for the purpose of integration into
an application site.
.xsxmla (XMLA Descriptor): A design-time object that defines an XMLA service that exposes SAP HANA data.
Caution
(*) For information about the capabilities available for your license and installation scenario, refer to the
Feature Scope Description for SAP HANA.
Package: A container in the repository for development objects. Packages are represented by folders.
Attribute, Analytic, and Calculation View: A view created with modeling tools and designed to model a
business use case. Created with the Systems view.
Decision Table: A table used to model business rules, for example, to manage data validation and quality.
Analytic Privilege: A set of rules that allows users to see a subset of data in a table or view.
The version history lets you view a list of file versions, compare one version of a file with another one, and revert
a file to a previous version.
You can open the version history by choosing Versions from the context menu of a file. The version history is
shown in the Versions panel on the right.
To compare one version of a file with another one, select the radio button of one of the file versions (referred to
as the base version) and from the context menu of the other file version choose Compare with selected.
The differences in the two file versions are highlighted in a side-by-side display on the compare tab. You can
switch to a unified view by choosing Toggle diff view in the Versions panel.
In the side-by-side view, a line between the two panes shows the location of a change in the two files.
From the context menu of the relevant file version, choose Revert to version. You can restore a previous version
or a locally cached version.
Version Information
The version history provides the following information about each version:
● Version number: Available for active and inactive versions, but not for locally cached versions and GitHub
versions.
● Version marker:
○ Inactive version: [I]
○ Locally cached version: [L]
○ Deleted version: [D]
○ GitHub version: [G]
● Date and time when the version was changed.
● The name of the user who made the change. Depending on the version type, this is either the user’s GitHub
user name or their SAP HANA database user name.
Related Information
You can use GitHub to store and share your source code.
Prerequisites
The SAP HANA Web-based Development Workbench allows you to push and pull changes made to your code
between your SAP HANA repository and your repositories on GitHub. GitHub requires your user credentials
whenever you commit changes to your GitHub repositories.
Procedure
Note
You can save this information as defaults in your Editor settings. You will then be connected
automatically with the specified credentials. You will still be able to choose Edit GitHub
repository connection details to overwrite these values.
Your SAP HANA package is now connected to the selected repository, allowing you to fetch and commit
code changes from and to the selected branch.
3. Clone the GitHub repository branch to the SAP HANA package.
Under Synchronization, choose Fetch.
All files contained in the repository are pulled to your SAP HANA package. In the message console you can
see exactly which files have been pulled.
4. Push changes to the GitHub repository.
a. Make a change to a file contained in the SAP HANA package.
b. Under Synchronization, choose Commit.
c. In the Commit dialog box, enter a commit description and choose Commit.
In the message console, you can see which files have been pushed to GitHub.
5. Check which version of a file is currently contained in the GitHub repository.
Related Information
As part of the application-development process, you must decide how to provide access to the applications you
develop. Application access includes security-related matters such as authentication methods and
communication protocols.
In addition to the features and functions you can enable with keywords in the .xsaccess file, SAP HANA
Extended Application Services (SAP HANA XS) provides a dedicated SAP HANA XS administration tool that is
designed to help you configure and maintain the authentication mechanism used to control access to the
applications you develop. The SAP HANA XS Administration Tool enables you to configure the following runtime
elements for an application:
● Security
Choose the security level you want to set to provide access to the application. For example, you can expose
the application with/without requiring authentication (public/private) and force the application to accept
only requests that use SSL/HTTPS.
● Authentication
Select an authentication type to use when checking user credentials before authorizing access to an
application, for example: form-based authentication (with user name and password), SAML (SSO with
Security Assertion Markup Language), or SAP logon tickets.
Related Information
To restrict access to the applications you develop, you must configure the application to work with particular
authentication methods and communication protocols.
Prerequisites
To perform the steps in this task, you must ensure the following prerequisites are met:
Context
You must specify whether or not to expose application content, which authentication method is used to grant
access to the exposed content, and what content is visible.
Procedure
Note
In the default configuration, the URL redirects the request to a logon screen, which requires the
credentials of an authenticated SAP HANA database user to complete the logon process. To ensure
access to all necessary features, the user who logs on should have the SAP HANA XS role
sap.hana.xs.admin.roles::RuntimeConfAdministrator.
Note
Security settings are automatically inherited by applications further down the application hierarchy.
However, you can override the inherited security settings at any application level by modifying the
settings for a particular application. Applications below the application with the modified security
settings inherit the new, modified settings.
a. Use the Public (no authentication required) option to specify if applications require user authentication
to start.
Note
Enabling the Force SSL option only ensures that the selected application refuses any request
that does not use HTTPS; it does not set up the Secure Sockets Layer (SSL) protocol for you.
The SAP HANA administrator must configure the SAP Web Dispatcher to accept (and forward)
HTTPS requests in addition.
Related Information
To restrict access to the applications you develop, you must configure the application to work with particular
authentication methods and communication protocols.
Prerequisites
To perform the steps in this task, you must ensure the following prerequisites are met:
Before you define which authentication methods an application uses to grant access to the application content,
you must use the application security tools to define whether or not to expose application content and, if so,
which content to expose. SAP HANA XS enables you to define multiple authentication methods to verify the
credentials of users who request access to the exposed content; multiple authentication methods are
considered according to a specific order of priority. For example, if the first authentication method fails, SAP
HANA tries to authenticate the user with the next authentication method specified. To configure the
authentication method an application uses to verify user credentials, perform the following steps:
Procedure
Note
In the default configuration, the URL redirects the request to a logon screen, which requires the
credentials of an authenticated SAP HANA database user to complete the logon process. To ensure
access to all necessary features, the user who logs on should have the SAP HANA XS role
sap.hana.xs.admin.roles::RuntimeConfAdministrator.
Note
Security settings are automatically inherited by applications further down the application hierarchy.
However, you can override the inherited security settings at any application level by modifying the
settings for a particular application. Applications below the application with the modified security
settings inherit the new, modified settings.
a. Use the Public (no authentication required) option to specify if applications require user authentication
to start.
○ Disabled
This is the default setting. In disabled mode, the Form-based authentication and Basic authentication
options are enabled automatically in the Authentication screen area.
○ Enabled
If you enable the Public option, no authentication is required to start an application; the
Authentication screen area is hidden, and you cannot select any authentication-method options.
b. Use the Force SSL option to specify if client requests must use secure HTTP (HTTPS).
○ Disabled
This is the default setting. With Force SSL disabled, the application returns a response to all
requests (both HTTP and HTTPS).
○ Enabled
Note
Enabling the Force SSL option only ensures that the selected application refuses any request
that does not use HTTPS; it does not set up the Secure Sockets Layer (SSL) protocol for you.
The SAP HANA administrator must configure the SAP Web Dispatcher to accept (and forward)
HTTPS requests in addition.
Note
Enabling an application-security option (for example, SAML2 or X509) only ensures that the selected
application uses the enabled authentication method when required; it does not perform any setup
operation for the authentication method itself. The SAP HANA administrator must maintain the
selected authentication infrastructure (SAML2, X509, or SAP logon tickets) in an additional step.
You can choose any selection of the following application-related authentication methods; if you enable
multiple authentication methods for your application, a priority applies depending on whether the
application logon is interactive or non-interactive:
a. Enable the SAML2 option.
The SAP HANA administrator must already have configured the authentication infrastructure, for
example, to enable the creation of SAML2 assertions to permit SSO in Web browsers.
b. Enable the X509 Authentication option
The SAP HANA administrator must already have configured the appropriate authentication
infrastructure, for example, to enable users to be authenticated by client certificates signed by a
trusted Certification Authority (CA).
c. Enable the SAP logon ticket option
The SAP HANA administrator must already have configured the appropriate authentication
infrastructure, for example, to enable users to be authenticated by a logon ticket that is issued when
the same user logs on to an SAP system that is configured to create logon tickets (for example, the
SAP Web Application Server or Portal).
d. Enable the Form-based authentication option
If the Public security option is disabled, the Form-based authentication option is enabled by default.
e. Enable the Basic authentication option
If the Public security option is disabled, the Basic authentication option is enabled by default.
Related Information
An HTTP destination defines connection details for services running on specific hosts whose details you want
to define and distribute. The definition can be referenced by an application.
Context
If you want to configure an SAP HANA XS application to access data on a specific server that offers a specific
service, for example, a service that is only available outside your network, it is recommended to configure the
HTTP connection parameters in an HTTP destination file that you store locally as a design-time artifact. You
can use an HTTP destination to call an external resource directly from a server-side JavaScript application. You
can also use an HTTP destination when configuring a transport route, for example, to automate the process of
exporting a delivery unit from one system and importing it into another. To create an HTTP destination
configuration for an SAP HANA XS application, you must perform the following high-level steps.
Procedure
1. Create a package for the SAP HANA XS application that will use the HTTP destination you define.
2. Define the details of the HTTP destination.
You define the details of an HTTP destination in a configuration file, using a specific syntax. The
configuration file containing the details of the HTTP destination must have the file extension .xshttpdest
and be located in the same package as the application that uses it or one of the application's subpackages.
3. Define any extensions to the HTTP destination configuration.
You can extend a configured HTTP destination, for example, by providing additional details concerning
proxy servers and logon details. The details concerning the extensions to the HTTP destination must be
specified in a separate configuration file. Like the original HTTP destination that the extension modifies,
the configuration-file extension must have the file extension .xshttpdest and be located in the same
package as the HTTP destination configuration file it extends and the application that uses it.
4. Check the HTTP destination configuration using the SAP HANA XS Administration Tool.
The SAP HANA XS Administration Tool is available on the SAP HANA XS Web server at the following URL:
http://<WebServerHost>:80<SAPHANAinstance>/sap/hana/admin/cockpit.
Note
Access to details of HTTP destinations in the SAP HANA XS Administration Tool requires the
credentials of an authenticated database user and one of the following SAP HANA roles:
○ HTTPDestViewer
○ HTTPDestAdministrator
Create an HTTP destination defining connection details for services running on specific hosts. The definition
can be referenced by an application.
Prerequisites
● You have the privileges granted by the role sap.hana.ide.roles::EditorDeveloper; this role is included in the
parent role sap.hana.ide.roles::Developer.
● You have been assigned the HTTPDestViewer or HTTPDestAdministrator user role.
Context
An HTTP destination defines connection details for services running on specific hosts whose details you want
to define and distribute. HTTP destination configurations are defined in a text file and can be referenced by an
application. You can also provide more (or modified) connection details in additional files called “extensions”;
values specified in extensions overwrite values specified in the original HTTP destination configuration.
Procedure
Caution
You must place the HTTP destination configuration and the XSJS application that uses it in the same
application package. An application cannot reference an HTTP destination configuration that is located
in another application package.
a. From the context menu of the testApp folder, choose New File .
b. Enter a file name, for example, yahoo.xshttpdest, and choose Create.
c. Enter the following code in the new file yahoo.xshttpdest.
host = "download.finance.yahoo.com";
port = 80;
description = "my stock-price checker";
useSSL = false;
pathPrefix = "/d/quotes.csv?f=a";
authType = none;
proxyType = none;
proxyHost = "";
proxyPort = 0;
timeout = 0;
d. If necessary, set proxyType to http and enter your proxy host and port number.
e. Save the file.
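Once the destination is activated, an XSJS service can reference it by package and name. The following sketch is for illustration only: it runs only inside the SAP HANA XS classic runtime (which provides the $ API), and the package path, file names, and query string are assumptions based on the example above, not part of the original procedure.

```javascript
// Sketch only: requires the SAP HANA XS runtime ($ API).
// Read the design-time destination pkg.path.testApp:yahoo.xshttpdest
// (package path and destination name are illustrative).
var destination = $.net.http.readDestination("pkg.path.testApp", "yahoo");

// Create an HTTP client and a GET request; the request path is appended
// to the pathPrefix defined in the destination configuration.
var client = new $.net.http.Client();
var request = new $.web.WebRequest($.net.http.GET, "&s=SAP.DE");

// Send the request to the configured host/port and return the response body.
client.request(request, destination);
var response = client.getResponse();
$.response.contentType = "text/plain";
$.response.setBody(response.body ? response.body.asString() : "");
```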
4. View the activated HTTP destination.
You can use the SAP HANA XS Administration Tool to check the contents of an HTTP destination
configuration.
Note
To make changes to the HTTP Destination configuration, you must use a text editor, save the changes
and reactivate the file.
To start the SAP HANA XS Administration Tool, select the yahoo.xshttpdest file and choose
(Maintain Credentials) in the toolbar. The details of the HTTP destination are displayed.
If you are using the Web-based XS Administration Tool, you can only make limited changes to the displayed
HTTP destination configuration, as follows:
○ Save
Commit to the repository any modifications made to the HTTP destination configuration in the current
session.
○ Edit
Display details of the corresponding extension to the selected HTTP destination configuration. If no
extension exists, the Edit option is not available.
○ Extend
Enables you to create an extension to the selected XS HTTP destination and associate the extension
with another (new or existing) package.
This option is only available if the selected HTTP destination is provided as part of a delivery unit,
for example, as a destination template.
Related Information
An HTTP destination defines connection details for services running on specific hosts whose details you want
to define and distribute. Syntax rules that apply to the contents of the HTTP destination configuration
are checked when you activate the configuration in the repository.
Example:
The .xshttpdest Configuration File
The following example shows all possible keyword combinations in the SAP HANA XS HTTP destination
(.xshttpdest) file.
Note
In the form shown below, the .xshttpdest file is not a working model; it is used to illustrate the syntax for
all possible options.
host = "download.finance.yahoo.com";
port = 80;
//All of the following keywords are optional
description = "";
useSSL = false;
sslAuth = client;
sslHostCheck = true;
pathPrefix = "/d/quotes.csv?f=a";
authType = none;
samlProvider = "";
samlACS = "header";
samlAttributes = "";
samlNameId = ["email"];
proxyType = none;
proxyHost = ""; //in-line comments are allowed
proxyPort = 0;
timeout = 0;
remoteSID = "Q7E";
remoteClient = "007";
oAuthAppConfigPackage = "sap.hana.test";
When you are defining the HTTP destination, bear in mind the following important syntax rules:
● A semi-colon (;) is required at the end of each line in the HTTP destination configuration, including the last
line in the file.
● String values must be wrapped in quotes (""), for example:
host = "download.finance.yahoo.com";
Note
The host and port keywords are mandatory; all other keywords are optional.
host
host = "download.finance.yahoo.com";
The host keyword is mandatory: it enables you to specify the hostname of the HTTP destination providing the
service or data you want your SAP HANA XS application to access.
port
port = 80;
The port keyword is mandatory; it enables you to specify the port number to use for connections to the HTTP
destination hosting the service or data you want your SAP HANA XS application to access.
description
The optional keyword description enables you to provide a short description of the HTTP destination you want
to configure. If you do not want to provide a description, include the description keyword but leave the value
between the quotes empty, for example, description = "";.
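In line with the examples shown for the other keywords, a description entry looks like this (the text itself is illustrative):

```txt
description = "my stock-price checker";
```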
useSSL
The optional keyword useSSL enables you to specify whether connections to the configured destination are
secured using SSL.
Note
Setting this option does not configure SSL; if you want to use SSL to secure connections to the configured
destination, you must ensure that SAP HANA is already set up to enable secure outbound connections
using SSL.
If useSSL = true, you can set the authentication type with the keyword sslAuth. You can also use the
sslHostCheck keyword to enable a check which ensures that the certificate used for authentication is valid
(matches the host).
sslAuth
If useSSL = true, you can use the keyword sslAuth to set the authentication type. The following values are
permitted:
● client
(Default setting). You must create a TRUST store entry in the SAP HANA XS Admin Tool's Trust manager (or
use an existing one that is known to the HTTP destination configuration) and maintain the trust
relationship with the SSL server, for example, by adding a certificate to the trust store that is used for the
authentication process.
● anonymous
A built-in key is used for SSL encryption; no TRUST store is needed. No authentication via SSL is possible.
sslHostCheck
If useSSL = true, you can use the keyword sslHostCheck to enable a check which ensures that the
certificate used for authentication is valid (matches the host). The following values are permitted:
● true
(Default setting). The SSL certificate subject must match the host name. For example, if SSL server
certificate CN=server1.acme.com, then the host parameter must be server1.acme.com. If there is no
match, SSL terminates.
● false
No host check is performed. For example, if the SSL server certificate is CN=server1.acme.com and you use
“localhost” as the connection parameter (because the certificate is installed on its own server), the
connection only works with the host check deactivated (sslHostCheck=false).
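Taken together, a destination that secures its connections with SSL might combine these keywords as follows; the host name and port are illustrative, and SAP HANA must already be set up for secure outbound connections as noted above:

```txt
host = "secure.example.com";
port = 443;
useSSL = true;
sslAuth = client;
sslHostCheck = true;
```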
pathPrefix
pathPrefix = "";
The optional keyword pathPrefix enables you to specify a text element to add to the start of the URL used for
connections to the service specified in the HTTP destination configuration. For example, pathPrefix = "/d/
quotes.csv?f=a" inserts the specified path into the URL called by the connection.
authType
The optional keyword authType enables you to specify the authentication method that must be used for
connection requests for the service located at the HTTP destination specified in the configuration, for example,
“basic”, which requires users to provide a user name and password as authentication credentials. Permitted
values for the authType are “none”, “basic”, and “AssertionTicket”. If no authentication type is specified, the
default setting “none” applies.
The AssertionTicket option is for use with XSJS applications that want to enable access to HTTP services
running on remote SAP servers using single sign-on (SSO) with SAP assertion tickets. If the AssertionTicket
option is enabled, a user with administration privileges in SAP HANA must use the parameter
saplogontickettruststore to specify the location of the trust store containing the assertion tickets.
Tip
If authType = AssertionTicket is set, you must also set values for the keywords remoteSID and
remoteClient.
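For example, a destination that logs on to a remote ABAP system with SAP assertion tickets might combine these keywords as follows (the host name and port are illustrative; the SID and client values are taken from the keyword reference below):

```txt
host = "abaphost.example.com";
port = 8000;
authType = AssertionTicket;
remoteSID = "Q7E";
remoteClient = "007";
```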
For authType = SamlAssertion, you must also set the subproperties samlProvider, samlACS,
samlAttributes, and samlNameId.
samlProvider
samlProvider = "";
If you set authType = SamlAssertion, you must also set the subproperty samlProvider, which enables
you to specify the entityID of the remote SAML party.
samlACS
samlACS = "header";
If you set authType = SamlAssertion, you must also set the subproperty samlACS, which enables you to
specify the way in which SAML assertions or responses are sent.
samlAttributes
samlAttributes = "name1=<property>&name2=<property>";
If you set authType = SamlAssertion, you must also set the subproperty samlAttributes, which enables
you to specify additional attributes for the SAML Assertion.
samlNameId
If you set authType = SamlAssertion, you must also set the subproperty samlNameId, which enables you
to define a list of name-ID mappings. The following values are supported:
● email
● unspecified
For example, if you have an e-mail maintained in SAP HANA User Self Services (USS), the SAML assertion
contains your e-mail address; if you do not have an e-mail address maintained in SAP HANA USS, the mapping
is “unspecified”.
proxyType
proxyType = none;
The optional keyword proxyType enables you to specify whether a proxy server must be used to resolve the
host name specified in the HTTP destination configuration file, and if so, which type of proxy. The following
values are allowed:
● none
● http
● socks
Caution
proxyType replaces and extends the functionality previously provided with the keyword useProxy. For
backward compatibility, the useProxy keyword is still allowed but should no longer be used.
To define the proxy host and the port to connect on, use the keywords proxyHost and proxyPort
respectively.
If you want to include the proxy-related information in a separate configuration (a so-called extension to the
original HTTP destination configuration), you must set proxyType = none in the original HTTP destination
configuration. In the HTTP destination extension that references and modifies the original HTTP destination,
you can change the proxy setting to proxyType = http. You must then provide the corresponding host name
of the proxy server and a port number to use for connections.
proxyHost
proxyHost = "";
If you set proxyType = http or proxyType = socks to specify that a proxy server must be used to resolve the target
host name specified in the HTTP destination configuration, you must use the proxyHost and proxyPort
keywords to specify the fully qualified name of the host providing the proxy service (and the port number to use
for connections). The name of the proxy host must be wrapped in quotes, as illustrated in the following
example:
proxyHost = "myproxy.hostname.com";
proxyPort
proxyPort = 8080;
If you set proxyType = http or proxyType = socks to indicate that a proxy server must be used to resolve the host
name specified in the HTTP destination configuration, you must also use the proxyPort keyword (in
combination with proxyHost =) to specify the port on which the proxy server accepts connections.
timeout = -1;
The optional keyword timeout enables you to specify for how long (in milliseconds) an application tries to
connect to the remote host specified in the HTTP destination configuration, for example, timeout = 5000;
(5 seconds). By default, the timeout interval is set to -1, which means that there is no limit to the time required
to connect to the server specified in the HTTP destination configuration. In the default setting, the application
keeps trying to connect to the destination server either until the server responds, however long this takes, or
the underlying request-session timeout (300 seconds) is reached. The default setting (-1) is intended to help in
situations where the destination server is slow to respond, for example, due to high load.
remoteSID
remoteSID = "Q7E";
The optional keyword remoteSID enables you to specify the SID of a remote ABAP system. You use this
keyword in combination with the remoteClient keyword, for example, to enable an application to log on to an
ABAP system that is configured to provide SAP assertion tickets. If the XSJS application service requires
access to remote services, you can create an HTTP destination that defines the logon details required by the
remote ABAP system and specifies SSO with SAP assertion tickets as the logon authentication method.
Note
In the XS Administration Tool, the value specified in an HTTP destination configuration file with the
remoteSID keyword is displayed in the SAP SID field in the AUTHENTICATION section of the application's
runtime configuration. The SAP SID option is only available if you select SAP Assertion Ticket as the
authentication type in the application's runtime configuration.
remoteClient
remoteClient = "007";
The optional keyword remoteClient enables you to specify the client number to use when logging on to a
remote ABAP system. You use this keyword in combination with the remoteSID keyword, for example, to
enable an application to logon to an ABAP system that is configured to provide SAP assertion tickets. If the
XSJS application service requires access to remote services, you can create an HTTP destination that defines
the logon details required by the remote ABAP system and specifies SSO with SAP assertion tickets as the
logon authentication method.
Note
In the XS Administration Tool, the value specified in an HTTP destination configuration file with the
remoteClient keyword is displayed in the SAP Client field in the AUTHENTICATION section of the
application's runtime configuration.
oAuthAppConfigPackage
oAuthAppConfigPackage = "sap.hana.test";
Use the optional keyword oAuthAppConfigPackage to specify the location of the package that contains the
OAuth application configuration to be used by an HTTP destination configuration.
oAuthAppConfig
oAuthAppConfig = "abapTest";
Use the optional keyword oAuthAppConfig to specify the name of the OAuth application configuration to be
used by an HTTP destination configuration. The OAuth application configuration is a file
describing the application-specific OAuth parameters that are used to enable access to a resource running on a
remote HTTP destination. The OAuth application configuration is defined in a design-time artifact with the
mandatory file suffix .xsoauthappconfig; the configuration file must be specified using the JSON format.
modifies
modifies pkg.path.testApp:yahoo.xshttpdest;
The keyword modifies can only be used in an HTTP extension file and enables you to reference an existing
HTTP destination (or extension) whose settings you want to further extend or modify. The settings in an HTTP
destination extension overwrite any identical settings in the original HTTP destination configuration. The HTTP
destination configuration referenced by the modifies keyword must already exist.
Note
The HTTP destination extension does not have to be tied to a particular XSJS application; it can be located
in any application package or subpackage. For this reason, you must include the full package path to the
HTTP destination extension when using the modifies keyword.
Related Information
An HTTP destination defines connection details for services running on specific hosts whose details you want
to define and distribute. An extension to an HTTP destination provides additional information or modifies
values set in the original configuration.
You can use one or more extensions to an HTTP destination configuration; the extensions include additions to
the original settings or modifications to the values set in the original configuration. For example, you could
include basic configuration settings in an HTTP destination and provide details of any required proxy settings in
a separate, so-called “extension”.
You define an extension to an HTTP destination configuration in a text file that contains the details of the
modifications you want to apply to the connection details for the original HTTP destination. The HTTP
destination extension uses a mandatory syntax comprising a list of keyword=value pairs, for example, host =
"download.finance.myhoo.com";. The same syntax rules apply for the basic HTTP destination
configuration and any extensions. Both files must also have the file suffix .xshttpdest, for example,
myHTTPdestination.xshttpdest or myHTTPextension.xshttpdest. After creating and saving the HTTP
destination extension, you must activate it in the SAP HANA repository.
Note
The HTTP destination extension does not have to be tied to a particular XSJS application; it can be located
in any application package or subpackage. For this reason, you must include the full package path to the
HTTP destination extension.
The following configuration file for the HTTP destination yahooProxy.xshttpdest illustrates how to modify
the proxy settings specified in the HTTP destination yahoo.xshttpdest, located in the application package
pkg.path.testApp.
modifies pkg.path.testApp:yahoo.xshttpdest;
proxyType = http;
proxyHost = "proxy.host.name.com";
proxyPort = 8080;
Note
For backward compatibility, the keyword useProxy still works; however, it has been replaced by the
keyword proxyType, which takes the values: [none | http | socks].
After activation, you can view the details of the new HTTP destination extension using the SAP HANA XS
Administration tool.
Note
Access to details of HTTP destinations in the SAP HANA XS Administration Tool requires the credentials of
an authenticated database user and one of the following SAP HANA roles:
● HTTPDestViewer
● HTTPDestAdministrator
Create the files required to enable a service that uses OAuth to authorize access to a resource running on a
remote HTTP destination.
Prerequisites
Context
An OAuth configuration package is a collection of configuration files that define the details of how an
application uses OAuth to enable logon to a resource running on a remote HTTP destination.
An HTTP destination defines connection details for services running on specific hosts whose details you want
to define and distribute. Additional syntax rules that apply to the contents of the HTTP destination
configuration are checked when you activate the configuration in the repository.
Tip
You connect the OAuth configuration to the HTTP destination configuration in the HTTP destination's
runtime configuration. Access to the runtime configuration tools requires the permissions included in an
administrator role.
You need to create the base configuration for your OAuth application in a design-time file with the
mandatory file-extension .xsoauthappconfig. The application configuration is stored in the SAP HANA
repository and must be activated to create the corresponding catalog objects.
a. Create the design-time file that contains your OAuth application configuration, for example,
oauthDriveApp.xsoauthappconfig.
b. Define the details of the new OAuth application configuration, as follows:
{
"clientConfig" :
"sap.hana.xs.oAuth.lib.providerconfig.providermodel:abap_ac",
"mandatoryScopes" : ["OAUTH2_TEST_SCOPE1", "OAUTH2_TEST_SCOPE2"],
"description" : "ABAP Testapplication for OAuth"
}
Note
You create the client configuration for your OAuth application in a design-time file with the mandatory file-
extension .xsoauthclientconfig. You can either use an existing client configuration from the package
sap.hana.xs.oAuth.lib.providerconfig.providermodel or create your own client configuration.
The application configuration is stored in the SAP HANA repository and must be activated to create the
corresponding catalog objects.
a. Create the design-time file that contains your OAuth client configuration, for example,
ABAPv1.xsoauthclientconfig.
b. Define the details of the new OAuth client configuration, as follows:
{
"clientFlavor" :
"sap.hana.xs.oAuth.lib.providerconfig.providermodel:abap_ac",
"clientID" : "<OAuth ClientId registered at ABAP>",
"clientAuthType" : "basic",
"authorizationEndpointURL" : "/sap/bc/sec/oauth2/authorize",
"tokenEndpointURL" : "/sap/bc/sec/oauth2/token",
"revocationEndpointURL" : "/sap/bc/sec/oauth2/revoke",
"redirectURL" : "<External_XS_HOST>:<PORT>/sap/hana/xs/
oAuth/lib/runtime/tokenRequest.xsjs",
"flow" : "authCode",
"scopeReq" : "maxScopes",
"description" : "OAuth Client for SAP Application Server
ABAP - Authorization Code Flow"
}
Tip
You do not have to create the OAuth client flavor from scratch; SAP HANA provides some example
OAuth client flavors which you can use. The example OAuth client flavors are located in the following
package: sap.hana.xs.oAuth.lib.providerconfig.providermodel.
The following example shows the required format and syntax for the contents of
the .xsoauthclientflavor artifact:
{ "parameters":[
{ "flavorStep":"1Aut", "paramLocation":"uri", "paramName":"client_id",
"paramValue":"client_id", "valueType":"eval",
"paramMandatory":"true" },
{ "flavorStep":"2Gra", "paramLocation":"head", "paramName":"Authorization",
"paramValue":"Basic Authentication", "valueType":"sec",
"paramMandatory":"true" },
{ "flavorStep":"3Prc", "paramLocation":"head", "paramName":"Bearer",
"paramValue":"access_token", "valueType":"sec",
"paramMandatory":"true" },
{ "flavorStep":"4Ref", "paramLocation":"head", "paramName":"Authorization",
"paramValue":"Basic Authentication", "valueType":"sec",
"paramMandatory":"true" },
{ "flavorStep":"5Rev", "paramLocation":"para", "paramName":"token",
"paramValue":"access_token", "valueType":"sec",
"paramMandatory":"true" }
] }
Note
The example above is not complete; it is intended for illustration purposes only.
To start the SAP HANA XS Administration Tool, select the xshttpdest file and choose (Maintain
Credentials) in the toolbar. The details of the HTTP destination are displayed.
a. Choose the OAuth Details tab.
b. Choose Edit > Browse OAuth App Configs.
c. Select an OAuth application configuration from the list displayed.
The name of the application configuration you choose and the absolute path to the package where it is
located are displayed in the appropriate fields, for example:
○ OAuth App Config Package: sap.hana.test
○ OAuth App Config Name: abapTest
Note
The values displayed here must also be present in the HTTP destination configuration to which the
OAuth configuration applies.
oAuthAppConfigPackage = "sap.hana.test";
oAuthAppConfig = "abapTest";
d. Navigate to the OAuth client configuration and set the client secret.
e. Choose Save to update the runtime configuration for the HTTP destination.
Related Information
The format and syntax required in a design-time artifact describing an OAuth application configuration.
The OAuth application configuration is a file describing the application-specific OAuth parameters that are
used to enable access to a resource running on a remote HTTP destination. The OAuth application
configuration is defined in a design-time artifact with the mandatory file suffix .xsoauthappconfig; the
configuration file must be specified using the JSON format.
Note
The following code example is not a working example; it is provided for illustration purposes, only.
{
"clientConfig":"sap.hana.xs.oAuth.lib.providerconfig.providermodel:abap_ac",
"description":"ABAP test application for OAuth",
"mandatoryScopes":["OAUTH2_TEST_SCOPE1", "OAUTH2_TEST_SCOPE2"],
"optionalScopes":["OAUTH2_TEST_SCOPE3", "OAUTH2_TEST_SCOPE4"],
"modifies":"sap.hana.test:abapTest"
}
Use the clientConfig keyword to specify the fully qualified name of the associated xsoauthclientconfig
artifact, using the format <path.to.package>:<XSOauthClientConfigObjectName>.
"clientConfig":"sap.hana.xs.oAuth.lib.providerconfig.providermodel:abap_ac",
Note
It is mandatory to specify the name and location of the package containing the associated OAuth client
configuration.
description
Use the description keyword to provide an optional short description of the contents of the OAuth
application configuration.
mandatoryScopes
Use the mandatoryScopes keyword to specify one or more (in an array) of strings describing the mandatory
permissions requested by the client.
"mandatoryScopes":["OAUTH2_TEST_SCOPE1", "OAUTH2_TEST_SCOPE2"],
optionalScopes
Use the optionalScopes keyword to specify one or more (in an array) of strings describing the optional
permissions to be used by the client.
"optionalScopes":["OAUTH2_TEST_SCOPE3", "OAUTH2_TEST_SCOPE4"],
modifies
Use the modifies keyword to indicate that the current XS OAuth application configuration (for example,
abapTest2.xsoauthappconfig) is based on (and extends) another SAP HANA XS OAuth application
configuration (for example, abapTest.xsoauthappconfig). You must specify the fully qualified name of the
associated OAuth application configuration artifact, using the format <path.to.package>:<ArtifactName>.
"modifies":"sap.hana.test:abapTest.xsoauthappconfig",
Related Information
The format and syntax required in a design-time artifact describing the OAuth client configuration.
The OAuth client configuration is a file describing details of the client parameters for an application which uses
the services provided by a corresponding OAuth application that enables access to a resource running on a
remote HTTP destination. The OAuth client configuration is defined in a design-time artifact with the
mandatory file suffix .xsoauthclientconfig; the configuration file must be specified using the JSON
format. The following code example shows the contents of a typical OAuth client configuration.
Note
The following code example is not a working example; it is provided for illustration purposes, only.
{
"clientFlavor":"sap.hana.xs.oAuth.lib.providerconfig.providermodel:abap_ac",
"clientID":"<The OAuth ClientId you registered at ABAP>",
"clientAuthType":"basic",
"authorizationEndpointURL":"/sap/bc/sec/oauth2/authorize",
"tokenEndpointURL":"/sap/bc/sec/oauth2/token",
"revocationEndpointURL":"/sap/bc/sec/oauth2/revoke",
"flow":"authCode",
"description":"OAuth Client for ABAP server",
"samlIssuer":"" ,
"redirectURL":"<HOST>:<PORT>/sap/hana/xs/oAuth/lib/runtime/tokenRequest.xsjs",
"scopeReq":"maxScopes",
"shared":"true",
"modifies":"sap.hana.xs.oAuth.lib.providerconfig.providermodel:abap_ac"
}
In this example, the OAuth client configuration is located in the package com.acme.oAuth.lib; change the
path specified in clientFlavor to suit your own requirements. You will also have to change the value
specified for clientID and redirectURL.
Tip
SAP HANA provides some example OAuth client configurations which you can use; you can find them in the
following package: sap.hana.xs.oAuth.lib.providerconfig.providermodel
Use the clientFlavor keyword to specify the fully qualified name of the associated XS OAuth client flavor
configuration artifact, for example, ABAPv1.xsoauthclientflavor; you must use the format
<path.to.package>:<ObjectName> (no file extension is required).
"clientFlavor":"sap.hana.xs.oAuth.lib.providerconfig.providermodel:abap_ac",
Note
It is mandatory to specify the name and location of the package containing the associated OAuth client
flavor configuration.
clientID
Use the clientID keyword to define a string that specifies the client ID, which is used to identify the
client application with the server. The clientID must be changed to suit your requirements. Typically, the
client ID is obtained by registering with a specific service provider.
clientAuthType
Use the clientAuthType keyword to define a string that specifies the client authentication type, for
example, “cert” or “basic”.
"clientAuthType" : "basic",
authorizationEndpointURL
Use the authorizationEndpointURL keyword to specify a string that defines the authorization endpoint.
The authorization endpoint is the endpoint on the authorization server where the resource owner logs on and
grants authorization to the client application.
"authorizationEndpointURL" : "/sap/bc/sec/oauth2/authorize",
tokenEndpointURL
Use the tokenEndpointURL keyword to specify a string that defines the token endpoint. The token
endpoint is the endpoint on the authorization server where the client application exchanges the authorization
code, the client ID, and the client secret for an access token.
"tokenEndpointURL" : "/sap/bc/sec/oauth2/token",
revocationEndpointURL
Use the revocationEndpointURL keyword to specify a string that defines the revocation endpoint. The
revocation endpoint is the endpoint on the authorization server where the client application can revoke an
access token that was previously issued.
"revocationEndpointURL" : "/sap/bc/sec/oauth2/revoke",
flow
Use the flow keyword to specify a string that defines the authorization flow used during the authentication
exchange, for example, saml2Bearer or authCode.
"flow" :"saml2Bearer",
● saml2Bearer
● authCode
description
Use the optional description keyword to provide a short description of the OAuth client configuration.
"description": "OAuth Client for SAP App Server ABAP - Authorization Code Flow"
samlIssuer
Use the optional samlIssuer keyword to specify a string that defines the SAML issuer ID. The SAML issuer ID
describes the issuer of the SAML token. The SAML bearer extension enables the validation of SAML tokens as
part of granting the OAuth access token.
Note
You set this parameter only if the parameter flow is set to saml2Bearer, for example,
"flow" :"saml2Bearer".
"samlIssuer" : "" ,
redirectURL
Use the redirectURL keyword to specify a string that defines the redirection endpoint. The redirection
endpoint is the endpoint in the client application where the resource owner is redirected to, after having
granted authorization at the authorization endpoint. The redirectURL must be changed to suit your
requirements.
"redirectURL" : "<HOST>:<PORT>/sap/hana/xs/oAuth/lib/runtime/tokenRequest.xsjs",
scopeReq
Use the scopeReq keyword to specify whether the maximum available scope from all applications using this
client configuration is always requested or the scope set is specified iteratively.
"scopeReq" : "maxScopes",
● maxScopes
● iterativeScopes
shared
Use the shared keyword to specify whether the XS OAuth client configuration can be shared between
applications.
"shared" : "false",
● true (shared)
● false (not shared)
modifies
Use the modifies keyword to indicate that the current XS OAuth client configuration, for example,
abap_ac1.xsoauthclientconfig, is based on (and extends) another SAP HANA XS OAuth client
configuration (for example, abap_ac.xsoauthclientconfig). You must specify the fully qualified name of
the associated OAuth client configuration artifact (<fileName>.xsoauthclientconfig), using the format
<path.to.package>:<ArtifactName>.xsoauthclientconfig.
"modifies":"sap.hana.xs.oAuth.lib.providerconfig.providermodel:abap_ac.xsoauthclientconfig",
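Taken together, the keywords described above appear in a single .xsoauthclientconfig artifact. The following fragment is a sketch for illustration only; it simply combines the individual examples shown above and is not a complete, working configuration:

```json
{
  "description" : "Hypothetical OAuth client configuration (sketch)",
  "flow" : "authCode",
  "redirectURL" : "<HOST>:<PORT>/sap/hana/xs/oAuth/lib/runtime/tokenRequest.xsjs",
  "revocationEndpointURL" : "/sap/bc/sec/oauth2/revoke",
  "scopeReq" : "maxScopes",
  "shared" : "false"
}
```

Because the flow here is authCode, the samlIssuer keyword is omitted; it is only set when the flow is saml2Bearer.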
The format and syntax required in a design-time artifact that describes the OAuth client flavors.
The OAuth client flavor file provides details of the OAuth protocol for a client application that uses the services
provided by a corresponding OAuth application. The OAuth client flavor steps are defined in a design-time
artifact with the mandatory file suffix .xsoauthclientflavor; the configuration file must be specified using
the JSON format.
Note
The following example of an OAuth client flavor configuration is incomplete; it is intended for illustration
purposes only.
{ "parameters":[
{ "flavorStep":"1Aut", "paramLocation":"uri", "paramName":"client_id",
"paramValue":"client_id", "valueType":"eval",
"paramMandatory":"true" },
{ "flavorStep":"1Aut", "paramLocation":"uri", "paramName":"redirect_uri",
"paramValue":"redirect_uri", "valueType":"eval",
"paramMandatory":"true" },
{ "flavorStep":"1Aut", "paramLocation":"uri", "paramName":"scope",
"paramValue":"scope", "valueType":"eval",
"paramMandatory":"true" },
{ "flavorStep":"1Aut", "paramLocation":"uri", "paramName":"response_type",
"paramValue":"code", "valueType":"litr",
"paramMandatory":"true" },
{ "flavorStep":"1Aut", "paramLocation":"uri", "paramName":"state",
"paramValue":"state", "valueType":"eval",
"paramMandatory":"true" },
{ "flavorStep":"2Gra", "paramLocation":"head", "paramName":"Authorization",
"paramValue":"Basic Authentication", "valueType":"sec",
"paramMandatory":"true" },
{ "flavorStep":"2Gra", "paramLocation":"head", "paramName":"Content-Type",
"paramValue":"application/x-www-form-urlencoded", "valueType":"litr",
"paramMandatory":"true" },
{ "flavorStep":"2Gra", "paramLocation":"para", "paramName":"code",
"paramValue":"code", "valueType":"eval",
"paramMandatory":"true" },
{ "flavorStep":"2Gra", "paramLocation":"para", "paramName":"grant_type",
"paramValue":"authorization_code", "valueType":"litr",
"paramMandatory":"true" },
{ "flavorStep":"2Gra", "paramLocation":"para", "paramName":"client_id",
"paramValue":"client_id", "valueType":"eval",
"paramMandatory":"true" },
{ "flavorStep":"2Gra", "paramLocation":"para", "paramName":"redirect_uri",
"paramValue":"redirect_uri", "valueType":"eval",
"paramMandatory":"true" },
{ "flavorStep":"3Prc", "paramLocation":"head", "paramName":"Bearer ",
"paramValue":"access_token", "valueType":"sec",
"paramMandatory":"true" },
It is not necessary to create your own OAuth client flavor from scratch; SAP HANA provides some OAuth client
flavors for a selection of OAuth server scenarios, which you can use without modification.
Tip
However, you do need to modify the OAuth client flavor artifact for the following scenarios:
● Modifications are required (or have already been made) to the API of an available OAuth server.
● A connection is required to a new OAuth server not covered by the scenarios included in the SAP HANA
configuration templates.
parameters
Use the parameters keyword to define a list of parameter-value pairs, for example,
"paramLocation":"uri", that support the specification defined in the OAuth client configuration file
<filename>.xsoauthclientconfig.
flavorStep
Use the flavorStep keyword to specify a step in the procedure used by the client flavor, as illustrated in the
following example:
"flavorStep":"1Aut",
● 1Aut
● 2Gra
● 3Prc
● 4Ref
paramLocation
Use the paramLocation keyword to specify the location of the parameter defined, as shown in the following
example:
"paramLocation":"uri",
● uri
In the request URI (uniform resource identifier)
● head
In the request header
● para
In the request body
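The three paramLocation values determine where each parameter ends up in the outgoing request. The following Python sketch is not SAP's implementation; the function name, URL, and runtime values are invented purely to illustrate the mechanism. It treats litr values as literals and eval/sec values as names resolved at run time (a real client would resolve sec values in a secure way, for example from a secure store):

```python
from urllib.parse import urlencode

def build_request(base_url, entries, values):
    """Assemble a request from flavor-step parameter entries.

    entries: parameter dicts as in the flavor file; values: runtime
    values used to resolve "eval"/"sec" parameters.
    """
    query, headers, body = {}, {}, {}
    for e in entries:
        # "litr" uses paramValue literally; "eval"/"sec" look it up at run time
        value = e["paramValue"] if e["valueType"] == "litr" else values[e["paramValue"]]
        # route the parameter to URI query string, header, or request body
        target = {"uri": query, "head": headers, "para": body}[e["paramLocation"]]
        target[e["paramName"]] = value
    url = base_url + ("?" + urlencode(query) if query else "")
    return url, headers, urlencode(body)

# Two entries from a hypothetical "1Aut" (authorization) step
entries = [
    {"flavorStep": "1Aut", "paramLocation": "uri", "paramName": "response_type",
     "paramValue": "code", "valueType": "litr", "paramMandatory": "true"},
    {"flavorStep": "1Aut", "paramLocation": "uri", "paramName": "client_id",
     "paramValue": "client_id", "valueType": "eval", "paramMandatory": "true"},
]
url, headers, body = build_request("https://auth.example.com/authorize",
                                   entries, {"client_id": "myclient"})
```

Under these assumptions, both parameters land in the query string of the authorization URL, while the header and body remain empty.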
paramName
Use the paramName keyword to specify the name of the parameter defined in “paramLocation”, as shown in
the following example:
"paramName":"token",
The parameter name depends on the local setup of your client configuration.
paramValue
Use the paramValue keyword to specify a value for the parameter name specified in “paramName”.
"paramValue":"access_token",
The parameter name depends on the local setup of your client configuration.
valueType
Use the valueType keyword to specify the type of value expected by the parameter defined in “paramValue”.
"valueType":"sec",
● litr
Literal value
● eval
The value is evaluated by the OAuth client runtime
● sec
The value is evaluated by the OAuth client runtime in a secure way
paramMandatory
Use the paramMandatory keyword to specify whether the parameter defined in “paramName” is required, as
shown in the following example:
"paramMandatory":"true",
● true
Required
● false
Not Required
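The 4Ref (token refresh) step is not shown in the flavor example above. A hypothetical entry for it, following the same pattern and the standard OAuth 2.0 parameter name for refresh grants, might look like this (illustration only; not taken from an SAP template):

```json
{ "flavorStep":"4Ref", "paramLocation":"para", "paramName":"grant_type",
  "paramValue":"refresh_token", "valueType":"litr",
  "paramMandatory":"true" }
```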
Related Information
The persistence model defines the schema, tables, sequences, and views that specify what data to make
accessible for consumption by XS applications and how.
In SAP HANA Extended Application Services (SAP HANA XS), the persistence model is mapped to the
consumption model that is exposed to client applications and users so that data can be analyzed and displayed
in the appropriate form in the client application interface. The way you design and develop the database
objects required for your data model depends on whether you are developing applications that run in the SAP
HANA XS classic or XS advanced run-time environment.
SAP HANA XS classic model enables you to create database schema, tables, views, and sequences as design-
time files in the SAP HANA repository. Repository files can be read by applications that you develop. When
implementing the data persistence model in XS classic, you can use either the Core Data Services (CDS)
syntax or HDBtable syntax (or both). “HDBtable syntax” is a collective term; it includes the different
configuration schema for each of the various design-time data artifacts, for example: schema (.hdbschema),
sequence (.hdbsequence), table (.hdbtable), and view (.hdbview).
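As a minimal sketch of the HDBtable syntax (the names MYSCHEMA and the columns are invented for illustration), a schema and a table could be defined in two design-time files as follows:

```
// MYSCHEMA.hdbschema
schema_name = "MYSCHEMA";

// MyTable.hdbtable
table.schemaName = "MYSCHEMA";
table.tableType = COLUMNSTORE;
table.columns = [
  { name = "ID";   sqlType = INTEGER;  nullable = false; },
  { name = "Name"; sqlType = NVARCHAR; nullable = true; length = 40; }
];
table.primaryKey.pkcolumns = ["ID"];
```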
All repository files including your view definition can be transported (along with tables, schema, and
sequences) to other SAP HANA systems, for example, in a delivery unit. A delivery unit is the medium SAP
HANA provides to enable you to assemble all your application-related repository artifacts together into an
archive that can be easily exported to other systems.
Note
You can also set up data-provisioning rules and save them as design-time objects so that they can be
included in the delivery unit that you transport between systems.
The rules you define for a data-provisioning scenario enable you to import data from comma-separated values
(CSV) files directly into SAP HANA tables using the SAP HANA XS table-import feature. The complete data-
import configuration can be included in a delivery unit and transported between SAP HANA systems for reuse.
As part of the process of setting up the basic persistence model for SAP HANA XS, you create the following
artifacts in the XS classic repository:
● Schema: .hdbschema (*)
● Synonym: .hdbsynonym (*)
● Sequence: .hdbsequence (*)
● Table: .hdbtable or .hdbdd
● View: .hdbview or .hdbdd
● Association: .hdbdd
Note
(*) To create a schema, a synonym, or a sequence, you must use the appropriate HDBTable syntax, for
example, .hdbschema, .hdbsynonym, or .hdbsequence. In a CDS document, you can include references
to both CDS and HDBTable artifacts.
On activation of a repository artifact, the file suffix (for example, .hdbdd or .hdb[table|view]) is used to
determine which run-time plug-in to call during the activation process. When you activate a design-time artifact
in the SAP HANA Repository, the plug-in corresponding to the artifact's file suffix reads the contents of
repository artifact selected for activation (for example, a table, a view, or a complete CDS document that
contains multiple artifact definitions), interprets the artifact definitions in the file, and creates the appropriate
corresponding run-time objects in the catalog.
For the XS advanced run time, you develop multi-target applications (MTA), which contain modules, for
example: a database module, a module for your business logic (Node.js), and a UI module for your client
interface (HTML5). The modules enable you to group together in logical subpackages the artifacts that you
need for the various elements of your multi-target application. You can deploy the whole package or the
individual subpackages.
As part of the process of defining the database persistence model for your XS advanced application, you use
the database module to store database design-time artifacts such as tables and views, which you define using
Core Data Services (CDS). However, you can also create procedures and functions, for example, using
SQLScript, which can be used to insert data into (and remove data from) tables or views.
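For illustration only (the procedure, table, and parameter names below are invented), a simple SQLScript procedure that inserts data into such a table might be sketched as follows:

```sql
-- Hypothetical .hdbprocedure sketch; all object names are invented
PROCEDURE "acme.db::insert_partner" (
  IN im_partner_id NVARCHAR(10),
  IN im_role       NVARCHAR(3)
)
LANGUAGE SQLSCRIPT SQL SECURITY INVOKER AS
BEGIN
  -- insert one row into a CDS-defined entity
  INSERT INTO "BusinessPartner" ("PartnerId", "PartnerRole")
    VALUES (:im_partner_id, :im_role);
END
```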
Note
In general, CDS works in XS advanced (HDI) in the same way that it does in the SAP HANA XS classic
Repository. For XS advanced, however, there are some incompatible changes and additions, for example, in
the definition and use of name spaces, the use of annotations, the definition of entities (tables) and
structure types. For more information, see CDS Documents in XS Advanced in the list of Related Links
below.
Tip
You can also define the analytic model, for example, the calculation views and analytic privileges that
are to be used to analyze the underlying data model and specify who (or what) is allowed access.
Related Information
Core data services (CDS) is an infrastructure that can be used to define and consume semantically rich data
models in SAP HANA.
The model described in CDS enables you to use the Data Definition Language to define the artifacts that make
up the data-persistence model. You can save the data-persistence object definition as a CDS artifact, that is, a
design-time object that you manage in the SAP HANA repository and activate when necessary. Using a data
definition language (DDL), a query language (QL), and an expression language (EL), CDS enables write
operations, transaction semantics, and more.
You can use the CDS specification to create a CDS document which defines the following artifacts and
elements:
● Entities (tables)
● Views
● User-defined data types (including structured types)
● Contexts
● Associations
● Annotations
Note
To create a schema, a synonym, or a sequence, you must use the appropriate .hdbtable artifact, for
example, .hdbschema, .hdbsynonym, or .hdbsequence. You can reference these artifacts in a CDS
document.
CDS artifacts are design-time definitions that are used to generate the corresponding run-time objects, when
the CDS document that contains the artifact definitions is activated in the SAP HANA repository. In CDS, the
objects can be referenced using the name of the design-time artifact in the repository; in SQL, only the name of
the catalog object can be used. The CDS document containing the design-time definitions that you create
using the CDS-compliant syntax must have the file extension .hdbdd, for example, MyCDSTable.hdbdd.
Related Information
You can use the Data Definition Language (DDL) to define a table, which is also referred to as an “entity” in SAP
HANA Core Data Services (CDS). The finished artifact is saved in the repository with the extension
(suffix) .hdbdd, for example, MyTable.hdbdd.
Prerequisites
Procedure
Note
If you are using a CDS document to define a single CDS-compliant entity, the name of the CDS
document must match the name of the entity defined in the CDS document, for example, with the
entity keyword. In the example in this tutorial, you would save the entity definition “BOOK” in the
CDS document BOOK.hdbdd.
Note
Choose (Insert Snippet) to insert a code template. The following mandatory keywords are, by
default, assigned the following values:
○ namespace = <Current package path>
○ context = <New DDL file name>
The name space declared in a CDS document must match the repository package in which the object
the document defines is located.
namespace mycompany.myapp1;
@Schema : 'MYSCHEMA'
@Catalog.tableType: #COLUMN
@Catalog.index: [ { name : 'MYINDEX1', unique : true, order : #DESC,
elementNames : ['ISBN'] } ]
entity BOOK {
key Author : String(100);
key BookTitle : String(100);
ISBN : Integer not null;
Publisher : String(100);
};
On activation, the following run-time object is created in the catalog:
Catalog.MYSCHEMA.Tables.mycompany.myapp1::BOOK
The following public synonym is also created, which can be referenced using the standard CDS-compliant
query notation:
"mycompany.myapp1::BOOK"
c. Choose (Insert).
d. In the new row, enter some test data and save your entries, as shown in the example below:
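The generated table can also be queried through its public synonym using standard SQL, for example (sketch only; the column selection is illustrative):

```sql
SELECT "BookTitle", "ISBN"
  FROM "mycompany.myapp1::BOOK";
```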
A CDS document is a design-time source file that contains definitions of the objects you want to create in the
SAP HANA catalog.
Prerequisites
● You have created a schema for the CDS catalog objects generated when the CDS document is activated in
the repository, for example, MYSCHEMA.
● You have SELECT privileges on the schema so you can see the generated catalog objects.
Context
CDS documents are design-time source files that contain DDL code that describes a persistence model
according to rules defined in Core Data Services. CDS documents have the file suffix .hdbdd. Activating the
CDS document creates the corresponding catalog objects in the specified schema.
Procedure
The @Schema annotation defines the name of the schema to use to store the artifacts that are
generated when the CDS document is activated. The schema name must be inserted before the top-
level element in the CDS document, which in this example is the context MyModel.
namespace acme.com.hana.cds.data;
@Schema: 'SAP_HANA_CDS'
context MyModel {
};
Note
If the schema you specify does not exist, you cannot activate the new CDS document.
namespace acme.com.hana.cds.data;
@Schema: 'SAP_HANA_CDS'
context MyModel {
type BusinessKey : String(10);
type SString : String(40);
type <[...]>
<[...]>
};
namespace acme.com.hana.cds.data;
@Schema: 'SAP_HANA_CDS'
context MyModel {
type BusinessKey : String(10);
type SString : String(40);
type <[...]>
context MasterData {
<[...]>
};
context Sales {
<[...]>
};
};
namespace acme.com.hana.cds.data;
@Schema: 'SAP_HANA_CDS'
context MyModel {
type BusinessKey : String(10);
type SString : String(40);
type <[...]>
context MasterData {
@Catalog.tableType : #COLUMN
Entity Addresses {
key AddressId: BusinessKey;
City: SString;
PostalCode: BusinessKey;
<[...]>
};
@Catalog.tableType : #COLUMN
Entity BusinessPartner {
key PartnerId: BusinessKey;
PartnerRole: String(3);
<[...]>
};
};
context Sales {
<[...]>
};
context Purchases {
<[...]>
};
};
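On activation, the context nesting above is reflected in the names of the generated catalog tables. Assuming the schema and package shown, the generated objects would have names like the following (illustrative):

```
"SAP_HANA_CDS"."acme.com.hana.cds.data::MyModel.MasterData.Addresses"
"SAP_HANA_CDS"."acme.com.hana.cds.data::MyModel.MasterData.BusinessPartner"
```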
c. Select one of the new tables you have just created to display its definition.
CDS documents are design-time source files that contain DDL code that describes a persistence model
according to rules defined in Core Data Services.
CDS documents have the file suffix .hdbdd. Each CDS document must contain the following basic elements:
● A namespace declaration
The name space declared in the CDS document must match the repository package in which the CDS
document is located.
Note
If you use the file-creation wizard to create a new CDS document, the name space is inserted
automatically; the inserted name space reflects the repository location you select to create the new
CDS document.
● A schema definition
The schema you specify is used to store the catalog objects that are defined in the CDS document, for
example: entities, structured types, and views. The objects are generated in the catalog when the CDS
document is activated in the SAP HANA repository.
● CDS artifact definitions
The objects that make up your persistence model, for example: contexts, entities, structured types, and
views
Each CDS document must contain one top-level artifact, for example: a context, a type, an entity, or a view. The
name of the top-level artifact in the CDS document must match the file name of the CDS document, without
the suffix. For example, if the top-level artifact is a context named MyModel, the name of the CDS document
must be MyModel.hdbdd.
Note
On activation of a repository file, the file suffix, for example, .hdbdd, is used to determine which runtime
plug-in to call during the activation process. The plug-in reads the repository file selected for activation, in
this case a CDS-compliant document, parses the object descriptions in the file, and creates the appropriate
runtime objects in the catalog.
If you want to define multiple CDS artifacts within a single CDS document (for example, multiple types,
structured types, and entities), the top-level artifact must be a context. A CDS document can contain multiple
contexts and any number and type of artifacts. A context can also contain nested sub-contexts, each of which
can also contain any number and type of artifacts.
When a CDS document is activated, the activation process generates a corresponding catalog object for each
of the artifacts defined in the document; the location in the catalog is determined by the type of object
generated. The following table shows the catalog location for objects generated by the activation of common
CDS artifacts.
The following example shows the basic structure of a single CDS document that resides in the package
acme.com.hana.cds.data in the SAP HANA repository. The CDS document defines the following CDS
artifacts:
● Types:
○ BusinessKey and SString
● Entities:
○ Addresses, BusinessPartner, Header, and Item
● Contexts:
○ MyModel, which contains the nested contexts: MasterData, Sales, and Purchases
● External references
The using keyword enables you to refer to artifacts defined in separate CDS documents, for example,
MyModelB.hdbdd. You can also assign an alias to the reference, for example, AS <alias>.
● Annotations
Built-in annotations, for example, @Catalog, @Schema, and @nokey, are important elements of the CDS
syntax used to define CDS-compliant catalog objects. You can define your own custom annotations, too.
Note
The following code snippet is incomplete [...]; it is intended for illustration purposes only.
Sample Code
namespace acme.com.hana.cds.data;
using acme.com.hana.cds.data::MyModelB.MyContextB1 as ic;
@Schema: 'SAP_HANA_CDS'
context MyModel {
type BusinessKey : String(10);
type SString : String(40);
type <[...]>
context MasterData {
@Catalog.tableType : #COLUMN
Entity Addresses {
key AddressId: BusinessKey;
City: SString;
PostalCode: BusinessKey;
<[...]>
};
@Catalog.tableType : #COLUMN
Entity BusinessPartner {
key PartnerId: BusinessKey;
PartnerRole: String(3);
<[...]>
};
};
context Sales {
@Catalog.tableType : #COLUMN
Entity Header {
key SalesOrderId: BusinessKey;
<[...]>
};
@Catalog.tableType : #COLUMN
Related Information
You can define an artifact in one CDS document by referring to an artifact that is defined in another CDS
document.
The CDS syntax enables you to define a CDS artifact in one document by basing it on an “external” artifact - an
artifact that is defined in a separate CDS document. Each external artifact must be explicitly declared in the
source CDS document with the using keyword, which specifies the location of the external artifact, its name,
and where appropriate its CDS context.
Tip
The using declarations must be located in the header of the CDS document between the namespace
declaration and the beginning of the top-level artifact, for example, the context.
The external artifact can be either a single object (for example, a type, an entity, or a view) or a context. You can
also include an optional alias in the using declaration, for example, ContextA.ContextA1 as ic. The alias
(ic) can then be used in subsequent type definitions in the source CDS document.
//Filename = Pack1/Distributed/ContextB.hdbdd
namespace Pack1.Distributed;
using Pack1.Distributed::ContextA.T1;
using Pack1.Distributed::ContextA.ContextAI as ic;
using Pack1.Distributed::ContextA.ContextAI.T3 as ict3;
using Pack1.Distributed::ContextA.ContextAI.T3.a as a; // error: "a" is not an artifact
context ContextB {
type T10 {
a : T1; // Integer
b : ic.T2; // String(20)
c : ic.T3; // structured
d : type of ic.T3.b; // String(88)
};
type x : Pack1.Distributed::ContextA.T1; // invalid: fully qualified name used directly
};
The CDS document ContextB.hdbdd shown above uses external artifacts (data types T1 and T3) that are
defined in the “target” CDS document ContextA.hdbdd shown below. Two using declarations are present in
the CDS document ContextB.hdbdd; one with no alias and one with an explicitly specified alias (ic). The first
using declaration introduces the scalar type Pack1.Distributed::ContextA.T1. The second using
declaration introduces the context Pack1.Distributed::ContextA.ContextAI and makes it accessible by
means of the explicitly specified alias ic.
Note
If no explicit alias is specified, the last part of the fully qualified name is assumed as the alias, for example
T1.
The using keyword is the only way to refer to an externally defined artifact in CDS. In the example above, the
type x would cause an activation error; you cannot refer to an externally defined CDS artifact directly by using
its fully qualified name in an artifact definition.
//Filename = Pack1/Distributed/ContextA.hdbdd
namespace Pack1.Distributed;
context ContextA {
type T1 : Integer;
context ContextAI {
type T2 : String(20);
type T3 {
a : Integer;
b : String(88);
};
};
};
Note
Whether you use a single or multiple CDS documents to define your data-persistence model, each CDS
document must contain only one top-level artifact, and the name of the top-level artifact must correspond
to the name of the CDS document. For example, if the top-level artifact in a CDS document is ContextA,
then the CDS document itself must be named ContextA.hdbdd.
Rules and restrictions apply to the names of CDS documents and the package in which the CDS document
resides.
The rules that apply for naming CDS documents are the same as the rules for naming the packages in which
the CDS document is located. When specifying the name of a package or a CDS document (or referencing the
name of an existing CDS object, for example, within a CDS document), bear in mind the following rules:
Caution
Although it is possible to use quotation marks (“”) to wrap a name that includes forbidden characters, as a
general rule, it is recommended to follow the naming conventions for CDS documents specified here in
order to avoid problems during activation in the repository.
Related Information
The namespace is the path to the package in the SAP HANA Repository that contains CDS artifacts such as
entities, contexts, and views.
In a CDS document, the first statement must declare the namespace that contains the CDS elements which
the document defines, for example: a context, a type, an entity, or a view. The namespace must match the
package name where the CDS elements specified in the CDS document are located. If the package path
specified in a namespace declaration does not already exist in the SAP HANA Repository, the activation
process for the elements specified in the CDS document fails.
It is possible to enclose in quotation marks (“”) individual parts of the namespace identifier, for example,
"Pack1".pack2. Quotes enable the use of characters that are not allowed in regular CDS identifiers; in CDS, a
quoted identifier can include all characters except the dot (.) and the double colon (::). If you need to use a
reserved keyword as an identifier, you must enclose it in quotes, for example, “Entity”. However, it is
recommended to avoid the use of reserved keywords as identifiers.
Note
You can also use quotation marks (“”) to wrap the names of CDS artifacts (entities, views) and elements
(columns...).
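As an illustrative sketch (the artifact and element names are invented), quotation marks allow an otherwise reserved keyword to be used as an element name:

```
namespace Pack1.pack2;
@Schema: 'MYSCHEMA'
@Catalog.tableType: #COLUMN
@nokey
entity MyQuoted {
  "Entity" : String(10);  // reserved keyword used as an element name
  Amount   : Integer;
};
```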
The following code snippet applies to artifacts created in the Repository package /Pack1/pack2/ and shows
some examples of valid namespace declarations, including namespaces that use quotation marks (“”).
namespace Pack1.pack2;
namespace "Pack1".pack2;
namespace Pack1."pack2";
namespace "Pack1"."pack2";
The following code snippet applies to artifacts created in the Repository package /Pack1/pack2/ and shows
some examples of invalid namespace declarations.
The examples of namespace declarations in the code snippet above are invalid for the following reasons:
● pack1.pack2;
pack1 is spelled incorrectly; the namespace element requires a capital P to match the corresponding
location in the Repository, for example, Pack1.
● "Pack1.pack2";
You cannot quote the entire namespace path; only individual elements of the namespace path can be
quoted, for example, "Pack1".pack2; or Pack1."pack2";.
● Pack1.pack2.MyDataModel;
The namespace declaration must not include the names of elements specified in the CDS document itself,
for example, MyDataModel.
Related Information
The following example illustrates how to assign two simple entities to a context using the CDS-
compliant .hdbdd syntax; you store the context-definition file with a specific name and the file
extension .hdbdd, for example, MyContext.hdbdd.
Note
If you are using a CDS document to define a CDS context, the name of the CDS document must match the
name of the context defined in the CDS document, for example, with the “context” keyword.
In the example below, you must save the context definition “Books” in the CDS document Books.hdbdd. In
addition, the name space declared in a CDS document must match the repository package in which the object
the document defines is located.
The following code example illustrates how to use the CDS syntax to define multiple design-time entities in a
context named Books.
namespace com.acme.myapp1;
@Schema : 'MYSCHEMA'
context Books {
@Catalog.tableType: #COLUMN
@Catalog.index : [ { name : 'MYINDEX1', unique : true, order : #DESC,
elementNames : ['ISBN'] } ]
entity Book {
key AuthorID : String(10);
key BookTitle : String(100);
ISBN : Integer not null;
Publisher : String(100);
};
@Catalog.tableType: #COLUMN
@Catalog.index : [ { name: 'MYINDEX2', unique: true, order: #DESC,
elementNames: ['AuthorNationality'] } ]
entity Author {
key AuthorName : String(100);
AuthorNationality : String(20);
AuthorBirthday : String(100);
AuthorAddress : String(100);
};
};
Activation of the file Books.hdbdd containing the context and entity definitions creates the catalog objects
“Book” and “Author”.
The namespace specified at the start of the file, for example, com.acme.myapp1, corresponds to the
location of the entity definition file (Books.hdbdd) in the application-package hierarchy.
Nested Contexts
The following code example shows you how to define a nested context called InnerCtx in the parent context
MyContext. The example also shows the syntax required when making a reference to a user-defined data type
in the nested context, for example, (field6 : type of InnerCtx.CtxType.b;).
The type of keyword is only required if referencing an element in an entity or in a structured type; types in
another context can be referenced directly, without the type of keyword. The nesting depth for CDS contexts
is restricted by the limits imposed on the length of the database identifier for the name of the corresponding
SAP HANA database artifact (for example, table, view, or type); this is currently limited to 126 characters
(including delimiters).
Note
The context itself does not have a corresponding artifact in the SAP HANA catalog; the context only
influences the names of SAP HANA catalog artifacts that are generated from the artifacts defined in a given
CDS context, for example, a table or a structured type.
namespace com.acme.myapp1;
@Schema: 'MySchema'
context MyContext {
// Nested contexts
context InnerCtx {
Entity MyEntity {
…
};
Type CtxType {
a : Integer;
b : String(59);
};
};
type MyType1 {
field1 : Integer;
field2 : String(40);
field3 : Decimal(22,11);
field4 : Binary(11);
};
type MyType2 {
field1 : String(50);
field2 : MyType1;
};
type MyType3 {
field1 : UTCTimestamp;
field2 : MyType2;
};
The sequence of definitions inside a block of CDS code (for example, an entity or a context) does not matter for
the scope rules; a binding of an artifact type and name is valid within the smallest block of code containing the
definition, except in inner code blocks that introduce a new binding for the same identifier. This rule means
that the definition of nameX in an inner block of code hides any definitions of nameX in outer code blocks.
Note
An identifier may be used before its definition without the need for forward declarations.
context OuterCtx
{
type MyType1 : Integer;
type MyType2 : LocalDate;
context InnerCtx
{
type Use1 : MyType1; // is a String(20)
type Use2 : MyType2; // is a LocalDate
type MyType1 : String(20);
};
type invalidUse : Use1; // invalid: Use1 is not
// visible outside of InnerCtx
type validUse : InnerCtx.Use1; // ok
};
No two artifacts (including namespaces) can be defined whose absolute names are the same or differ only in
case (for example, MyArtifact and myartifact), even if their artifact types are different (entity and view).
When searching for artifacts, CDS makes no assumptions about which artifact kinds can be expected at certain
source positions; it simply searches for the artifact with the given name and performs a final check of the
artifact type.
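For example (illustrative only), the following two definitions clash because their absolute names differ only in case:

```
context OuterCtx {
  type MyArtifact : Integer;
  @Catalog.tableType : #COLUMN
  entity myartifact {  // invalid: differs from type MyArtifact only in case
    key ID : Integer;
  };
};
```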
The following example demonstrates how name resolution works with multiple nested contexts. Inside context
NameB, the local definition of NameA shadows the definition of the context NameA in the surrounding scope. This
means that the definition of the identifier NameA is resolved to Integer, which does not have a sub-
component T1. The result is an error, and the compiler does not continue the search for a “better” definition of
NameA in the scope of an outer (parent) context.
context OuterCtx
{
context NameA
{
type T1 : Integer;
Related Information
CDS supports built-in annotations, for example, @Catalog, @Schema, and @nokey, which are important
elements of the CDS documents used to define CDS-compliant catalog objects. However, you can define your
own custom annotations, too.
Example
namespace mycompany.myapp1;
@Schema : 'MYSCHEMA'
context Books {
@Catalog.tableType: #COLUMN
@Catalog.index: [ { name : 'MYINDEX1', unique : true, order : #DESC,
elementNames : ['ISBN'] } ]
entity BOOK {
key Author : String(100);
key BookTitle : String(100);
ISBN : Integer not null;
Publisher : String(100);
};
@Catalog.tableType : #COLUMN
@nokey
entity MyKeylessEntity
{
element1 : Integer;
element2 : UTCTimestamp;
@SearchIndex.text: { enabled: true }
element3 : String(7);
};
@GenerateTableType : false
Type MyType1 {
field1 : Integer;
field2 : Integer;
field3 : Integer;
};
};
The following list indicates the annotations you can use in a CDS document:
● @Catalog
● @nokey
● @Schema
● @GenerateTableType
● @SearchIndex
● @WithStructuredPrivilegeCheck
@Catalog
The @Catalog annotation supports the following parameters, each of which is described in detail in a dedicated
section below:
● @Catalog.index
Specify the type and scope of index to be created for the CDS entity, for example: name, order, unique/
non-unique
● @Catalog.tableType
Specify the table type for the CDS entity, for example, column, row, global temporary.
You use the @Catalog.index annotation to define an index for a CDS entity. The @Catalog.index annotation used
in the following code example ensures that an index called Index1 is created for the entity MyEntity1 along
with the index fields fint and futcshrt. The order for the index is ascending (#ASC) and the index is unique.
namespace com.acme.myapp1;
@Catalog.tableType : #COLUMN
@Schema: 'MYSCHEMA'
@Catalog.index:[ { name:'Index1', unique:true, order:#ASC, elementNames:['fint',
'futcshrt' ] } ]
entity MyEntity1 {
key fint:Integer;
fstr :String(5000);
fstr15 :String(51);
fbin :Binary(4000);
fbin15 :Binary(51);
fint32 :Integer64;
fdec53 :Decimal(5,3);
fdecf :DecimalFloat;
fbinf :BinaryFloat;
futcshrt:UTCDateTime not null;
flstr :LargeString;
flbin :LargeBinary;
};
You can define the following values for the @Catalog.index annotation:
● name
The name of the index to create
● unique
true or false: whether the index is unique
● order
#ASC (ascending) or #DESC (descending)
● elementNames
The list of entity elements to include in the index
You use the @Catalog.tableType annotation to define the type of CDS entity you want to create. The
@Catalog.tableType annotation determines the storage engine in which the underlying table is created.
namespace com.acme.myapp1;
@Schema: 'MYSCHEMA'
context MyContext1 {
@Catalog.tableType : #COLUMN
entity MyEntity1 {
key ID : Integer;
name : String(30);
};
@Catalog.tableType : #ROW
entity MyEntity2 {
key ID : Integer;
name : String(30);
};
@Catalog.tableType : #GLOBAL_TEMPORARY
entity MyEntity3 {
ID : Integer;
name : String(30);
};
};
You can define the following values for the @Catalog.tableType annotation:
● #COLUMN
Create a column-based table. If the majority of table access is through a large number of tuples, with only a
few selected attributes, use COLUMN-based storage for your table type.
● #ROW
Create a row-based table. If the majority of table access involves selecting a few records, with all attributes
selected, use ROW-based storage for your table type.
● #GLOBAL_TEMPORARY
Set the scope of the created table. Data in a global temporary table is session-specific; only the owner
session of the global temporary table is allowed to insert/read/truncate the data. A global temporary table
exists for the duration of the session, and its data is automatically dropped when the session is terminated.
Note
The SAP HANA database uses a combination of table types to enable storage and interpretation in both
ROW and COLUMN forms. If no table type is specified in the CDS entity definition, the default value
#COLUMN is applied to the table created on activation of the design-time entity definition.
@nokey
An entity usually has one or more key elements, which are flagged in the CDS entity definition with the key
keyword. The key elements become the primary key of the generated SAP HANA table and are automatically
flagged as “not null”. Structured elements can be part of the key, too. In this case, all table fields resulting from
the flattening of this structured field are part of the primary key.
Note
However, you can also define an entity that has no key elements. If you want to define an entity without a key,
use the @nokey annotation. In the following code example, the @nokey annotation ensures that the entity
MyKeylessEntity defined in the CDS document creates a column-based table where no key element is
defined.
namespace com.acme.myapp1;

@Schema: 'MYSCHEMA'
@Catalog.tableType : #COLUMN
@nokey
entity MyKeylessEntity {
    element1 : Integer;
    element2 : UTCTimestamp;
    element3 : String(7);
};
@Schema
The @Schema annotation is only allowed as a top-level definition in a CDS document. In the following code
example @Schema ensures that the schema MYSCHEMA is used to contain the entity MyEntity1, a column-
based table.
namespace com.acme.myapp1;
@Schema: 'MYSCHEMA'
@Catalog.tableType : #COLUMN
entity MyEntity1 {
key ID : Integer;
name : String(30);
};
If the schema specified with the @Schema annotation does not already exist, an activation error is
displayed and the entity-creation process fails.
The schema name must adhere to the SAP HANA rules for database identifiers. In addition, a schema name
must not start with the letters SAP*; the SAP* namespace is reserved for schemas used by SAP products and
applications.
@GenerateTableType
For each structured type defined in a CDS document, an SAP HANA table type is generated, whose name is
built by concatenating the elements of the CDS document containing the structured-type definition and
separating the elements by a dot delimiter (.). The new SAP HANA table types are generated in the schema
that is specified in the schema annotation of the respective top-level artifact in the CDS document containing
the structured types.
Note
Table types are only generated for direct structure definitions; no table types are generated for derived
types that are based on structured types.
If you want to use the structured types inside a CDS document without generating table types in the catalog,
use the annotation @GenerateTableType : false.
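The guide does not include a code sample at this point; the following sketch (context and type names are assumed, not from the original text) shows a structured type for which no table type is generated in the catalog:

```
namespace com.acme.myapp1;

@Schema: 'MYSCHEMA'
context MyContext1 {
    // No SAP HANA table type is generated in the catalog for this
    // structured type; it remains usable inside the CDS document.
    @GenerateTableType : false
    type MyStructType {
        field1 : Integer;
        field2 : String(20);
    };
};
```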
@SearchIndex
The annotation @SearchIndex enables you to define which of the columns should be indexed for search
capabilities, for example, {enabled : true}. To extend the index search definition, you can use the
properties text or fuzzy to specify if the index should support text-based or fuzzy search, as illustrated in the
following example:
entity MyEntity100
{
    element1 : Integer;
    @SearchIndex.text: { enabled: true }
    element2 : LargeString;
    @SearchIndex.fuzzy: { enabled: true }
    element3 : String(7);
};
Tip
For more information about setting up search features and using the search capability, see the SAP HANA
Search Developer Guide .
@WithStructuredPrivilegeCheck
The annotation @WithStructuredPrivilegeCheck enables you to control access to data (for example, in a
view) by means of privileges defined with the Data Control Language (DCL), as illustrated in the following
example:
@WithStructuredPrivilegeCheck
view MyView as select from Foo {
<select_list>
} <where_groupBy_Having_OrderBy>;
The built-in core annotations that SAP HANA provides, for example, @Schema, @Catalog, or @nokey, are
located in the namespace sap.cds; the same namespace is used to store all the primitive types, for example,
sap.cds::integer and sap.cds::SMALLINT.
However, the CDS syntax also enables you to define your own annotations, which you can use in addition to the
existing “core” annotations. The rules for defining a custom annotation in CDS are very similar to the rules
that govern the definition of a user-defined type. In CDS, an annotation can be defined either inside a CDS
context or as the single, top-level artifact in a CDS document. The custom annotation you define can then be
assigned to other artifacts in a CDS document, in the same way as the core annotations, as illustrated in the
following example:
@Catalog.tableType : #ROW
@MyAnnotation : 'foo'
entity MyEntity {
    key Author    : String(100);
    key BookTitle : String(100);
    ISBN          : Integer not null;
    Publisher     : String(100);
}
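Note that before @MyAnnotation can be assigned as shown above, it must itself be declared in a CDS document. A minimal sketch, assuming a simple string-valued annotation:

```
annotation MyAnnotation : String(20);
```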
● Scalar annotations
● Structured annotations
● Annotation arrays
In annotation definitions, you can use both the enumeration type and the Boolean type, as illustrated in the
following example.
type Color : String(10) enum { red = 'rot'; green = 'grün'; blue = 'blau'; };
annotation MyAnnotation_3 : Color;
annotation MyAnnotation_4 : Boolean;
Structured Annotations
annotation MyAnnotation_5 {
a : Integer;
b : String(20);
c : Color;
d : Boolean;
};
The following example shows how to nest annotations in an anonymous annotation structure.
annotation MyAnnotation_7 {
    a : Integer;
    b : String(20);
    c : Color;
    d : Boolean;
    s {
        a1 : Integer;
        b1 : String(20);
        c1 : Color;
        d1 : Boolean;
    };
};
Array Annotations
When you have defined an annotation, the user-defined annotation can be used to annotate other definitions. It
is possible to use the following types of user-defined annotations in a CDS document:
● Scalar annotations [page 132]
For use with simple integer or string annotations and enumeration or Boolean types
● Structured annotations [page 133]
For use where you need to create a simple annotation structure or nest an annotation in an anonymous annotation structure
● Annotation arrays [page 133]
For use where you need to assign the same annotation several times to the same object.
Scalar Annotations
@MyAnnotation_1 : 18
type MyType1 : Integer;
@MyAnnotation_2 : 'sun'
@MyAnnotation_1 : 77
type MyType2 : Integer;
@MyAnnotation_2 : 'sun'
@MyAnnotation_2 : 'moon' // error: assigning the same annotation twice is not allowed
type MyType3 : Integer;
Note
It is not allowed to assign an annotation to the same object more than once. If several values of the same
type are to be annotated to a single object, use an array-like annotation.
For annotations that have an enumeration type, the enum values can be addressed either by means of their fully
qualified name, or by means of the shortcut notation (using the hash (#) sign). It is not allowed to use a literal
value, even if it matches a literal of the enum definition.
@MyAnnotation_3 : #red
type MyType4 : Integer;
@MyAnnotation_3 : Color.red
type MyType5 : Integer;
@MyAnnotation_3 : 'rot' // error: no literals allowed, use enum symbols
type MyType6 : Integer;
For Boolean annotations, only the values “true” or “false” are allowed, and a shortcut notation is available
for the value “true”, as illustrated in the following examples:
@MyAnnotation_4 : true
type MyType7 : Integer;
@MyAnnotation_4 // same as explicitly assigning the value “true”
Structured Annotations
Structured annotations can be assigned either as a complete unit or, alternatively, one element at a time. The
following example shows how to assign a whole structured annotation:
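The whole-unit example appears to have been lost from this section; the following sketch shows what such an assignment could look like, assuming record-literal syntax for structured annotation values (the type name MyType11 is illustrative):

```
@MyAnnotation_5 : { a: 12, b: 'Jupiter', c: #blue, d: false }
type MyType11 : Integer;
```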
The following example shows how to assign the same structured annotation element by element.
@MyAnnotation_5.a : 12
@MyAnnotation_5.b : 'Jupiter'
@MyAnnotation_5.c : #blue
@MyAnnotation_5.d : false
type MyType12 : Integer;
@MyAnnotation_5.c : #green
type MyType13 : Integer;
@MyAnnotation_5.c : #blue
@MyAnnotation_5.d // shortcut notation for Boolean (true)
type MyType14 : Integer;
It is not permitted to assign the same annotation element more than once; assigning the same annotation
element more than once in a structured annotation causes an activation error.
Array-like Annotations
Although it is not allowed to assign the same annotation several times to the same object, you can achieve the
same effect with an array-like annotation, as illustrated in the following example:
@MyAnnotation_8 : [1,3,5,7]
type MyType30 : Integer;
@MyAnnotation_9 : ['Earth', 'Moon']
type MyType31 : Integer;
@MyAnnotation_10 : [{ a: 52, b: 'Mercury'}, { a: 53, b: 'Venus'}]
type MyType32 : Integer;
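The array-valued annotations used above must themselves be declared with array types. The following declarations are a sketch inferred from the usage (the element names a and b in MyAnnotation_10 are taken from the example; the array-of syntax is an assumption):

```
annotation MyAnnotation_8  : array of Integer;
annotation MyAnnotation_9  : array of String(20);
annotation MyAnnotation_10 : array of { a : Integer; b : String(20); };
```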
The Core Data Services (CDS) syntax enables you to insert comments into object definitions.
Example:
Comment Formats in CDS Object Definitions
namespace com.acme.myapp1;

/**
 * multi-line comment,
 * for doxygen-style,
 * comments and annotations
 */
type Type1 {
    element Fstr    : String( 5000 );    // end-of-line comment
    Flstr           : LargeString;
    /*inline comment*/ Fbin : Binary( 4000 );
    element Flbin   : LargeBinary;
    Fint            : Integer;
    element Fint64  : Integer64;
    Ffixdec         : Decimal( 34, 34 /* another inline comment */ );
    element Fdec    : DecimalFloat;
    Fflt            : BinaryFloat;
    // complete-line comment: element Flocdat : LocalDate; (LocalDate temporarily switched off)
    // complete-line comment: Floctim : LocalTime;
    element Futcdatim : UTCDateTime;
    Futctstmp       : UTCTimestamp;
};
Overview
You can use the forward slash (/) and the asterisk (*) characters to add comments and general information to
CDS object-definition files. The following types of comment are allowed:
● In-line comment
● End-of-line comment
● Complete-line comment
● Multi-line comment
In-Line Comment
The in-line comment enables you to insert a comment into the middle of a line of code in a CDS document. To
indicate the start of the in-line comment, insert a forward-slash (/) followed by an asterisk (*) before the
comment text. To signal the end of the in-line comment, insert an asterisk followed by a forward-slash
character (*/) after the comment text, as illustrated by the following example:
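For instance (a fragment adapted from the combined example above):

```
/*inline comment*/ Fbin : Binary( 4000 );
```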
End-of-Line Comment
The end-of-line comment enables you to insert a comment at the end of a line of code in a CDS document. To
indicate the start of the end-of-line comment, insert two forward slashes (//) before the comment text, as
illustrated by the following example:
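For instance (a fragment adapted from the combined example above):

```
element Fstr : String( 5000 ); // end-of-line comment
```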
Complete-Line Comment
The complete-line comment enables you to tell the parser to ignore the contents of an entire line of CDS code.
To comment out a complete line, insert two forward slashes (//) at the start of the line, as illustrated in the
following example:
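For instance (a sketch; the commented-out element is illustrative):

```
// this complete line is ignored: element Flocdat : LocalDate;
element Futcdatim : UTCDateTime;
```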
Multi-Line Comments
The multi-line comment enables you to insert comment text that extends over multiple lines of a CDS
document. To indicate the start of the multi-line comment, insert a forward-slash (/) followed by an asterisk (*)
at the start of the group of lines you want to use for an extended comment (for example, /*). To signal the end
of the multi-line comment, insert an asterisk followed by a forward-slash character (*/). Each line between the
start and end of the multi-line comment must start with an asterisk (*), as illustrated in the following example:
/*
* multiline,
* doxygen-style
* comments and annotations
*/
The entity is the core artifact for defining the persistence model using the CDS syntax. You create a database
entity as a design-time file in the SAP HANA repository.
Prerequisites
● You have created a schema for the CDS catalog objects, for example, MYSCHEMA.
● You have SELECT privileges on the schema so you can see the generated catalog objects.
Context
In the SAP HANA database, as in other relational databases, a CDS entity is a table with a set of data elements
that are organized using columns and rows. SAP HANA Extended Application Services (SAP HANA XS) enables
you to use the CDS syntax to create a database entity as a design-time file in the repository. Activating the CDS
entity creates the corresponding table in the specified schema.
Procedure
namespace acme.com.apps.myapp1;

@Schema : 'MYSCHEMA'
@Catalog.tableType : #COLUMN
@Catalog.index : [ { name : 'MYINDEX1', unique : true, order : #DESC, elementNames : ['ISBN'] } ]
entity MyEntity {
    key Author    : String(100);   // remaining elements assumed from the same example earlier in this guide
    key BookTitle : String(100);
    ISBN          : Integer not null;
    Publisher     : String(100);
};
Note
If the schema you specify does not exist, you cannot activate the new CDS entity.
In the SAP HANA database, as in other relational databases, a CDS entity is a table with a set of data elements
that are organized using columns and rows.
A CDS entity has a specified number of columns, defined at the time of entity creation, but can have any
number of rows. Database entities also typically have meta-data associated with them; the meta-data might
include constraints on the entity or on the values within particular columns. SAP HANA Extended Application
Services (SAP HANA XS) enables you to create a database entity as a design-time file in the repository. All
repository files including your entity definition can be transported to other SAP HANA systems, for example, in
a delivery unit. You can define the entity using CDS-compliant DDL.
Note
A delivery unit is the medium SAP HANA provides to enable you to assemble all your application-related
repository artifacts together into an archive that can be easily exported to other systems.
The following code illustrates an example of a single design-time entity definition using CDS-compliant DDL. In
the example below, you must save the entity definition “MyTable” in the CDS document MyTable.hdbdd. In
addition, the name space declared in a CDS document must match the repository package in which the object
the document defines is located.
namespace com.acme.myapp1;
@Schema : 'MYSCHEMA'
@Catalog.tableType : #COLUMN
entity MyTable {     // entity body is illustrative
    key ID : Integer;
};
If you want to create a CDS-compliant database entity definition as a repository file, you must create the entity
as a flat file and save the file containing the DDL entity definition with the suffix .hdbdd, for example,
MyTable.hdbdd. The new file is located in the package hierarchy you establish in the SAP HANA repository.
The file location corresponds to the namespace specified at the start of the file, for example,
com.acme.myapp1 or sap.hana.xs.app2. You can activate the repository files at any point in time to create
the corresponding runtime object for the defined table.
Note
On activation of a repository file, the file suffix, for example, .hdbdd, is used to determine which runtime
plug-in to call during the activation process. The plug-in reads the repository file selected for activation, in
this case a CDS-compliant entity, parses the object descriptions in the file, and creates the appropriate
runtime objects.
When a CDS document is activated, the activation process generates a corresponding catalog object for each
of the artifacts defined in the document; the location in the catalog is determined by the type of object
generated. For example, the corresponding database table for a CDS entity definition is generated in the
following catalog location:
You can expand the definition of an entity element beyond the element's name and type by using element
modifiers. For example, you can specify if an entity element is the primary key or part of the primary key. The
following entity element modifiers are available:
● key
Defines if the specified element is the primary key or part of the primary key for the specified entity.
Note
Structured elements can be part of the key, too. In this case, all table fields resulting from the flattening
of this structured field are part of the primary key.
● null
Defines if an entity element can (null) or cannot (not null) have the value NULL. If neither null nor
not null is specified for the element, the default value null applies (except for the key element).
● default <literal_value>
Defines the default value for an entity element in the event that no value is provided during an INSERT
operation. The syntax for the literals is defined in the primitive data-type specification.
Spatial Data
CDS entities support the use of spatial data types such as hana.ST_POINT or hana.ST_GEOMETRY to store
geo-spatial coordinates. Spatial data is data that describes the position, shape, and orientation of objects in a
defined space; the data is represented as two-dimensional geometries in the form of points, line strings, and
polygons.
Element modifiers enable you to expand the definition of an entity element beyond the element's name and
type. For example, you can specify if an entity element is the primary key or part of the primary key.
Example
entity MyEntity {
    key MyKey : Integer;
    elem2 : String(20) default 'John Doe';
    elem3 : String(20) default 'John Doe' null;
    elem4 : String default 'Jane Doe' not null;
};
entity MyEntity1 {
    key id : Integer;
    a : integer;
    b : integer;
    c : integer generated always as a+b;
};
entity MyEntity2 {
    autoId : Integer generated [always|by default] as identity ( start with 10 increment by 2 );
    name : String(100);
};
You can expand the definition of an entity element beyond the element's name and type by using element
modifiers. For example, you can specify if an entity element is the primary key or part of the primary key. The
following entity element modifiers are available:
● key
Defines if the element is the primary key or part of the primary key for the specified entity. You cannot use
the key modifier in the following cases:
○ In combination with a null modifier. The key element is non null by default because NULL cannot
be used in the key element.
Note
Structured elements can be part of the key, too. In this case, all table fields resulting from the flattening
of this structured field are part of the primary key.
● null
Defines if the entity element can (null) or cannot (not null) have the value NULL. If neither null nor
not null is specified for the element, the default value null applies (except for the key element), which
means the element can have the value NULL. If you use the null modifier, note the following points:
Caution
The keywords nullable and not nullable are no longer valid; they have been replaced for SPS07 with
the keywords null and not null, respectively. The keywords null and not null must appear at the
end of the entity element definition, for example, field2 : Integer null;.
● The not null modifier can only be added if the following is true:
○ A default is also defined
○ No null data already exists in the table
● Unless the table is empty, bear in mind that when adding a new not null element to an existing entity,
you must declare a default value because there might already be existing rows that do not accept NULL as
a value for the new element.
● null elements with default values are permitted
● You cannot combine the element key with the element modifier null.
● The elements used for a unique index must have the not null property.
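The rules above can be sketched in a single entity (a hypothetical example; element names are assumed):

```
entity WithNullAndNotNull
{
    key id    : Integer;                     // key elements are implicitly not null
    opt       : Integer null;                // explicit null (also the default behavior)
    optDef    : Integer default 42 null;     // null element with a default value is permitted
    mandatory : Integer default 7 not null;  // not null is allowed because a default is defined
};
```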
● default <literal_value>
For each scalar element of an entity, a default value can be specified. The default element identifier defines
the default value for the element in the event that no value is provided during an INSERT operation.
Note
The syntax for the literals is defined in the primitive data-type specification.
entity WithDefaults
{
    key id  : Integer;
    field1  : Integer default -42;
    field2  : Integer64 default 9223372036854775807;
    field3  : Decimal(5, 3) default 12.345;
    field4  : BinaryFloat default 123.456e-1;
    field5  : LocalDate default date'2013-04-29';
    field6  : LocalTime default time'17:04:03';
    field7  : UTCDateTime default timestamp'2013-05-01 01:02:03';
    field8  : UTCTimestamp default timestamp'2013-05-01 01:02:03';
    field9  : Binary(32) default x'0102030405060708090a0b0c0d0e0[...]';
    field10 : String(10) default 'foo';
};
entity MyEntity1 {
    key id : Integer;
    a : integer;
    b : integer;
    c : integer generated always as a+b;
};
The SAP HANA SQL clause generated always as <expression> is available for use in CDS entity
definitions; it specifies the expression to use to generate the column value at run time. An element that is
defined with generated always as <expression> corresponds to a field in the database table that is
present in the persistence and has a value that is computed as specified in the expression, for example, “a+b”.
Restriction
For use in XS advanced only; it is not possible to use generated calculated elements in XS classic. Please
also note that the generated always as <expression> clause is only for use with column-based
tables.
entity MyEntity2 {
    autoId : Integer generated always as identity ( start with 10 increment by 2 );
    name : String(100);
};
The SAP HANA SQL clause generated as identity is available for use in CDS entity definitions; it enables
you to specify an identity column. An element that is defined with generated as identity corresponds to
a field in the database table that is present in the persistence and has a value that is computed as specified in
the sequence options defined in the identity expression, for example, ( start with 10 increment by
2 ).
In the example illustrated here, the name of the generated column is autoID, the first value in the column is
“10”; the identity expression ( start with 10 increment by 2 ) ensures that subsequent values in
the column are incremented by 2, for example: 12, 14, and so on.
Restriction
For use in XS advanced only; it is not possible to define an element with IDENTITY in XS classic. Please also
note that the generated always as identity clause is only for use with column-based tables.
You can use either always or by default in the clause generated as identity, as illustrated in the
examples in this section. If always is specified, then values are always generated; if by default is specified,
then values are generated by default.
entity MyEntity2 {
    autoId : Integer generated by default as identity ( start with 10 increment by 2 );
    name : String(100);
};
Restriction
CDS does not support the use of reset queries, for example, RESET BY <subquery>.
The following table shows the migration strategy that is used when the definition of a given column is modified; it indicates which actions are performed and what strategy is used to preserve the existing content. Technically, columns are either dropped and added, or a completely new “shadow” table is created into which the existing content is copied; the shadow table then replaces the original table.
[Migration-strategy table: the column definitions compared before and after the change are: plain, as <expr>, generated always as <expr>, generated always as identity <expr>, and generated by default as identity <expr>.]
The entity is the core design-time artifact for persistence model definition using the CDS syntax.
Example
Note
This example is not a working example; it is intended for illustration purposes only.
namespace Pack1."pack-age2";

@Schema: 'MySchema'
context MyContext {
    entity MyEntity1
    {
        key id : Integer;
        name   : String(80);
    };
    @Catalog:
    { tableType : #COLUMN,
      index : [
        { name: 'Index1', order: #DESC, unique: true,  elementNames: ['x', 'y'] },
        { name: 'Index2', order: #DESC, unique: false, elementNames: ['x', 'a'] }
      ]
    }
    entity MyEntity2 {
        key id : Integer;
        x      : Integer;
        y      : Integer;
        a      : Integer;
        field7 : Decimal(20,10) = power(ln(x)*sin(y), a);
    };
entity MyEntity {
    key id : Integer;
    a : Integer;
    b : Integer;
    c : Integer;
    s {
        m : Integer;
        n : Integer;
    };
} technical configuration {
    row store;
    index MyIndex1 on (a, b) asc;
    unique index MyIndex2 on (c, s) desc;
};
context MySpatialContext {
entity Address {
key id : Integer;
street_number : Integer;
street_name : String(100);
zip : String(10);
city : String(100);
Note
For series data, you can use either equidistant or equidistant piecewise, but not both at the same
time. The example above is for illustration purposes only.
Overview
Entity definitions resemble the definition of structured types, but with the following additional features:
On activation in the SAP HANA repository, each entity definition in CDS generates a database table. The name
of the generated table is built according to the same rules as for table types, for example,
Pack1.Pack2::MyModel.MyContext.MyTable.
Note
The CDS name is restricted by the limits imposed on the length of the database identifier for the name of
the corresponding SAP HANA database artifact (for example, table, view, or type); this is currently limited
to 126 characters (including delimiters).
Key Definition
type MyStruc2
{
field1 : Integer;
Usually an entity must have a key; you use the keyword key to mark the respective elements. The key elements
become the primary key of the generated SAP HANA table and are automatically flagged as not null. Key
elements are also used for managed associations. Structured elements can be part of the key, too. In this case,
all table fields resulting from the flattening of this structured element are part of the primary key.
Note
Index Definition
@Catalog:
{ tableType : #COLUMN,
  index : [
    { name: 'Index1', order: #DESC, unique: true,  elementNames: ['field1', 'field2'] },
    { name: 'Index2', order: #ASC,  unique: false, elementNames: ['field1', 'field7'] }
  ]
}
You use the @Catalog.index or @Catalog: { index: [...]} annotation to define an index for a CDS
entity. You can define the following values for the @Catalog.index annotation:
● name : '<IndexName>'
The name of the index to be generated for the specified entity, for example, name:'myIndex'
● order
Create a table index sorted in ascending or descending order. The order keywords #ASC and #DESC can
only be used in a BTREE index (for the maintenance of sorted data) and can be specified only once for each
index.
○ order : #ASC
Creates an index for the CDS entity and sorts the index fields in ascending logical order, for example: 1,
2, 3...
○ order : #DESC
Creates an index for the CDS entity and sorts the index fields in descending logical order, for example:
3, 2, 1...
● unique
Creates a unique index for the CDS entity. In a unique index, two rows of data in a table cannot have
identical key values.
○ unique : true
Creates a unique index for the CDS entity. The uniqueness is checked and, if necessary, enforced each
time a key is added to (or changed in) the index and, in addition, each time a row is added to the table.
Table-Type Definition
namespace com.acme.myapp1;

@Schema: 'MYSCHEMA'
context MyContext1 {
    @Catalog.tableType : #COLUMN
    entity MyEntity1 {
        key ID : Integer;
        name   : String(30);
    };
    @Catalog.tableType : #ROW
    entity MyEntity2 {
        key ID : Integer;
        name   : String(30);
    };
    @Catalog.tableType : #GLOBAL_TEMPORARY
    entity MyEntity3 {
        ID   : Integer;
        name : String(30);
    };
    @Catalog.tableType : #GLOBAL_TEMPORARY_COLUMN
    entity MyTempEntity {
        a : Integer;
        b : String(20);
    };
};
You use the @Catalog.tableType or @Catalog: { tableType: #<TYPE> } annotation to define the type
of CDS entity you want to create, for example: column- or row-based or global temporary. The
@Catalog.tableType annotation determines the storage engine in which the underlying table is created. The
following list explains the permitted values for the @Catalog.tableType annotation:
● #COLUMN
Create a column-based table. If the majority of table access is through a large number of tuples, with only a few selected attributes, use COLUMN-based storage for your table type.
● #ROW
Create a row-based table. If the majority of table access involves selecting a few records, with all attributes selected, use ROW-based storage for your table type.
● #GLOBAL_TEMPORARY
Set the scope of the created table. Data in a global temporary table is session-specific; only the owner session of the global temporary table is allowed to insert/read/truncate the data. A global temporary table exists for the duration of the session, and data from the global temporary table is automatically dropped when the session is terminated. Note that a temporary table cannot be changed when the table is in use by an open session, and a global temporary table can only be dropped if the table does not have any records.
● #GLOBAL_TEMPORARY_COLUMN
Set the scope of the created table. Global temporary column tables cannot have either a key or an index.
Note
The SAP HANA database uses a combination of table types to enable storage and interpretation in both
ROW and COLUMN forms. If no table type is specified in the CDS entity definition, the default value
#COLUMN is applied to the table created on activation of the design-time entity definition.
Calculated Fields
The definition of an entity can contain calculated fields, as illustrated by element “z” in the following example:
Sample Code
entity MyCalcField {
    a : Integer;
    b : Integer;
    c : Integer = a + b;
    s : String(10);
    t : String(10) = upper(s);
    x : Decimal(20,10);
    y : Decimal(20,10);
    z : Decimal(20,10) = power(ln(x)*sin(y), a);
};
The calculation expression can contain arbitrary expressions and SQL functions. The following restrictions
apply to the expression you include in a calculated field:
● The definition of a calculated field must not contain other calculated fields, associations, aggregations, or
subqueries.
● A calculated field cannot be key.
● No index can be defined on a calculated field.
● A calculated field cannot be used as foreign key for a managed association.
Note
In SAP HANA tables, you can define columns with the additional configuration “GENERATED ALWAYS AS”.
These columns are physically present in the table, and all the values are stored. Although these columns
behave for the most part like ordinary columns, their value is computed upon insertion rather than
supplied in the INSERT statement.
technical configuration
The definition of an entity can contain a section called technical configuration, which you use to define
the following elements:
● Storage type
● Indexes
● Full text indexes
Note
The syntax in the technical configuration section is as close as possible to the corresponding clauses in the
SAP HANA SQL Create Table statement. Each clause in the technical configuration must end with a
semicolon.
Storage type
In the technical configuration for an entity, you can use the store keyword to specify the storage type (“row”
or “column”) for the generated table, as illustrated in the following example. If no store type is specified, a
“column” store table is generated by default.
Sample Code
entity MyEntity {
    key id : Integer;
    a : Integer;
    b : Integer;
    t : String(100);
    s {
        u : String(100);
    };
} technical configuration {
    row store;
};
Restriction
It is not possible to use both the @Catalog.tableType annotation and the technical configuration (for
example, row store) at the same time to define the storage type for an entity.
Indexes
In the technical configuration for an entity, you can use the index and unique index keywords to specify the
index type for the generated table. The keywords asc (ascending) and desc (descending) describe the index
order, and unique specifies that the index is unique, where no two rows of data in the indexed entity can have
identical key values.
entity MyEntity {
    key id : Integer;
    a : Integer;
    b : Integer;
    t : String(100);
    s {
        u : String(100);
    };
} technical configuration {
    index MyIndex1 on (a, b) asc;
    unique index MyIndex2 on (c, s) desc;
};
Restriction
It is not possible to use both the @Catalog.index annotation and the technical configuration (for
example, index) at the same time to define the index type for an entity.
Full-Text Indexes
In the technical configuration for an entity, you can use the fulltext index keyword to specify the full-text
index type for the generated table, as illustrated in the following example.
Sample Code
entity MyEntity {
    key id : Integer;
    a : Integer;
    b : Integer;
    t : String(100);
    s {
        u : String(100);
    };
} technical configuration {
    row store;
    index MyIndex1 on (a, b) asc;
    unique index MyIndex2 on (a, b) asc;
    fulltext index MYFTI1 on (t)
        LANGUAGE COLUMN t
        LANGUAGE DETECTION ('de', 'en')
        MIME TYPE COLUMN s.u
        FUZZY SEARCH INDEX off
        PHRASE INDEX RATIO 0.721
        SEARCH ONLY off
        FAST PREPROCESS off
        TEXT ANALYSIS off;
    fuzzy search index on (s.u);
};
The <fulltext_parameter_list> is identical to the standard SAP HANA SQL syntax for CREATE
FULLTEXT INDEX. A fuzzy search index in the technical configuration section of an entity definition
corresponds to the @SearchIndex annotation in XS classic and the statement "FUZZY SEARCH INDEX ON"
for a table column in SAP HANA SQL. It is not possible to specify both a full-text index and a fuzzy search index
for the same element.
It is not possible to use both the @SearchIndex annotation and the technical configuration (for example,
fulltext index) at the same time.
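For reference, the plain SQL counterpart of a fuzzy search index on a column might look like the following sketch (the table and column names are illustrative; see the SAP HANA SQL reference for the authoritative syntax):

```
-- illustrative sketch: a column with a fuzzy search index in plain SQL
CREATE COLUMN TABLE "MYSCHEMA"."MyTable" (
  "u" NVARCHAR(100) FUZZY SEARCH INDEX ON
);
```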
Spatial Types *
The following example shows how to use the spatial type ST_POINT in a CDS entity definition. In the example
entity Person, each person has a home address and a business address, each of which is accessible via the
corresponding associations. In the Address entity, the geo-spatial coordinates for each person are stored in
element loc using the spatial type ST_POINT (*).
Sample Code
context SpatialData {
entity Person {
key id : Integer;
name : String(100);
homeAddress : Association[1] to Address;
officeAddress : Association[1] to Address;
};
entity Address {
key id : Integer;
street_number : Integer;
street_name : String(100);
zip : String(10);
city : String(100);
loc : hana.ST_POINT(4326);
};
view CommuteDistance as select from Person {
name,
homeAddress.loc.ST_Distance(officeAddress.loc) as distance
};
};
Series Data *
CDS enables you to create a table for storing series data by defining an entity that includes a series() clause
as a table option and then defining the appropriate parameters and options.
Note
The period for series must be unique and should not be affected by any shift in timestamps.
Sample Code
context SeriesData {
entity MySeriesEntity1 {
key setId : Integer;
t : UTCTimestamp;
value : Decimal(10,4);
series ( series key (setId) period for series (t)
equidistant increment by interval 0.1 second ) // increment value shown is illustrative
};
};
CDS also supports the creation of an equidistant piecewise series table using Formula-Encoded
Timestamps (FET). This enables support for data that is not loaded in an order that ensures good
compression. There is no a-priori restriction on the timestamps that are stored, but the data is expected to be
well approximated as piecewise linear with some jitter. The timestamps do not have a single slope/offset
throughout the table; rather, the slope and offset can change within and among the series in the table.
Restriction
The equidistant piecewise specification can only be used in CDS; it cannot be used to create a table
with the SQL command CREATE TABLE.
When a series table is defined as equidistant piecewise, the following restrictions apply:
1. The period includes one column (instant); there is no support for interval periods.
2. There is no support for missing elements. These could logically be defined if the period includes an
interval start and end; missing elements then occur when there are adjacent rows where the end of one
interval does not equal the start of the next.
3. The type of the period column must map to one of the following types: DATE, SECONDDATE, or
TIMESTAMP.
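Assuming the equidistant piecewise variant uses the same series-clause shape as the series example above, a definition could be sketched as follows (the entity and element names are illustrative, not taken from the product documentation):

```
entity MySeriesEntity2 {
  key setId : Integer;
  t : UTCTimestamp;       // single-column (instant) period, as required
  value : Decimal(10,4);
  series ( series key (setId) period for series (t)
           equidistant piecewise )
};
```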
Caution
(*) For information about the capabilities available for your license and installation scenario, refer to the
Feature Scope Description for SAP HANA.
Related Information
A structured type is a data type comprising a list of attributes, each of which has its own data type. You create a
user-defined structured type as a design-time file in the SAP HANA repository.
Prerequisites
● You have created a schema for the CDS catalog objects, for example, MYSCHEMA.
● You have SELECT privileges on the schema so you can see the generated catalog objects.
Context
SAP HANA Extended Application Services (SAP HANA XS) enables you to use the CDS syntax to create a user-
defined structured type as a design-time file in the repository. Repository files are transportable. Activating the
CDS entity creates the corresponding table in the specified schema.
Procedure
namespace Package1.Package2;
@Schema: 'MYSCHEMA'
type MyStructuredType
{
aNumber : Integer;
someText : String(80);
otherText : String(80);
};
Note
If the schema you specify does not exist, you cannot activate the new CDS entity.
Related Information
User-defined data types reference existing structured types (for example, user-defined) or the individual types
(for example, field, type, or context) used in another data-type definition.
You can use the type keyword to define a new data type in CDS-compliant DDL syntax; the examples below
illustrate the different ways in which you can define the data type.
In the following example, the element definition field2 : MyType1; specifies a new element field2 that is
based on the specification in the user-defined data type MyType1.
Note
If you are using a CDS document to define a single CDS-compliant user-defined data type, the name of the
CDS document must match the name of the top-level data type defined in the CDS document, for example,
with the type keyword.
In the following example, you must save the data-type definition “MyType1” in the CDS document
MyType1.hdbdd. In addition, the name space declared in a CDS document must match the repository
package in which the object the document defines is located.
namespace com.acme.myapp1;
@Schema: 'MYSCHEMA' // user-defined structured data types
type MyType1 {
field1 : Integer;
field2 : String(40);
field3 : Decimal(22,11);
field4 : Binary(11);
};
namespace com.acme.myapp1;
using com.acme.myapp1::MyType1;
@Schema: 'MYSCHEMA' // user-defined structured data types
type MyType2 {
field1 : String(50);
field2 : MyType1;
};
In the following example, you must save the data-type definition “MyType3” in the CDS document
MyType3.hdbdd; the document contains a using directive pointing to the data-type “MyType2” defined in CDS
document MyType2.hdbdd.
namespace com.acme.myapp1;
using com.acme.myapp1::MyType2;
@Schema: 'MYSCHEMA' // user-defined structured data types
type MyType3 {
field1 : UTCTimestamp;
field2 : MyType2;
};
The following code example shows how to use the type of keyword to define an element using the definition
specified in another user-defined data-type field. For example, field4 : type of field3; indicates that,
like field3, field4 is a LocalDate data type.
namespace com.acme.myapp1;
using com.acme.myapp1::MyType1;
@Schema: 'MYSCHEMA' // Simple user-defined data types
entity MyEntity1 {
key id : Integer;
field1 : MyType3;
field2 : String(24);
field3 : LocalDate;
field4 : type of field3;
field5 : type of MyType1.field2;
field6 : type of InnerCtx.CtxType.b; // context reference
};
● Define a new element (field4) using the definition specified in another user-defined element field3:
field4 : type of field3;
● Define a new element field5 using the definition specified in a field (field2) that belongs to another
user-defined data type (MyType1):
field5 : type of MyType1.field2;
● Define a new element (field6) using an existing field (b) that belongs to a data type (CtxType) in another
context (InnerCtx):
field6 : type of InnerCtx.CtxType.b;
The following code example shows you how to define nested contexts (MyContext.InnerCtx) and refer to
data types defined by a user in the specified context.
namespace com.acme.myapp1;
@Schema: 'MYSCHEMA'
context MyContext {
// Nested contexts
Entity MyEntity {
…
};
Type CtxType {
a : Integer;
b : String(59);
};
};
type MyType1 {
field1 : Integer;
field2 : String(40);
field3 : Decimal(22,11);
field4 : Binary(11);
};
type MyType2 {
field1 : String(50);
field2 : MyType1;
};
type MyType3 {
field1 : UTCTimestamp;
field2 : MyType2;
};
Restrictions
CDS name resolution does not distinguish between CDS elements and CDS types. If you define a CDS
element based on a CDS data type that has the same name as the new CDS element, CDS displays an error
message and the activation of the CDS document fails.
Caution
In a CDS document, you cannot define a CDS element using a CDS type of the same name; you must
specify the context where the target type is defined, for example, MyContext.doobidoo.
The following example defines a CDS element and a CDS data type that are both named doobidoo. The
result is an error when resolving the names in the CDS document; CDS expects a type named doobidoo but
finds a CDS entity element with the same name that is not a type.
context MyContext2 {
type doobidoo : Integer;
entity MyEntity {
key id : Integer;
doobidoo : doobidoo; // error: type expected; doobidoo is not a type
};
};
The following example works, since the explicit reference to the context where the type definition is located
(MyContext.doobidoo) enables CDS to resolve the definition target.
context MyContext {
type doobidoo : Integer;
entity MyEntity {
key id : Integer;
doobidoo : MyContext.doobidoo; // OK
};
};
Note
To prevent name clashes between artifacts that are types and those that have a type assigned to them,
make sure you keep to strict naming conventions. For example, use an uppercase first letter for artifacts
such as MyEntity, MyView, and MyType; use a lowercase first letter for elements such as myElement.
Related Information
SAP HANA Extended Application Services (SAP HANA XS) enables you to create a database structured type as
a design-time file in the repository. All repository files including your structured-type definition can be
transported to other SAP HANA systems, for example, in a delivery unit. You can define the structured type
using CDS-compliant DDL.
Note
A delivery unit is the medium SAP HANA provides to enable you to assemble all your application-related
repository artifacts together into an archive that can be easily exported to other systems.
When a CDS document is activated, the activation process generates a corresponding catalog object for each
of the artifacts defined in the document; the location in the catalog is determined by the type of object
generated. For example, the corresponding table type for a CDS type definition is generated in the following
catalog location:
In a structured user-defined type, you can define original types (aNumber in the following example) or
reference existing types defined elsewhere in the same type definition or another, separate type definition
(MyString80). If you define multiple types in a single CDS document, for example, in a parent context, each
structure-type definition must be separated by a semi-colon (;).
namespace Package1.Package2;
@Schema: 'MySchema'
type MyString80: String(80);
A using directive is required to resolve the reference to the data type specified in otherText :
MyString80;, as illustrated in the following example:
namespace Package1.Package2;
using Package1.Package2::MyString80; //contains definition of MyString80
@Schema: 'MySchema'
type MyStruct
{
aNumber : Integer;
someText : String(80);
otherText : MyString80; // defined in a separate type
};
Note
If you are using a CDS document to specify a single CDS-compliant data type, the name of the CDS
document (MyStruct.hdbdd) must match the name of the top-level data type defined in the CDS
document, for example, with the type keyword.
Since user-defined types can make use of other user-defined types, you can build nested structured types, as
illustrated in the following example:
namespace Package1.Package2;
using Package1.Package2::MyString80;
using Package1.Package2::MyStruct;
@Schema: 'MYSCHEMA'
context NestedStructs {
type MyNestedStruct
{
name : MyString80;
nested : MyStruct; // defined in a separate type
};
type MyDeepNestedStruct
{
text : LargeString;
nested : MyNestedStruct;
};
type MyOtherInt : type of MyStruct.aNumber; // => Integer
type MyOtherStruct : type of MyDeepNestedStruct.nested.nested; // => MyStruct
};
For each structured type, a SAP HANA table type is generated, whose name is built by concatenating the
following elements of the CDS document containing the structured-type definition and separating the
elements by a dot delimiter (.):
The new SAP HANA table types are generated in the schema that is specified in the schema annotation of the
respective top-level artifact in the CDS document containing the structured types.
Note
To view the newly created objects, you must have the required SELECT privilege for the schema object in
which the objects are generated.
The columns of the table type are built by flattening the elements of the type. Elements with structured types
are mapped to one column per nested element, with the column names built by concatenating the element
names and separating the names by dots ".".
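As a sketch of this flattening rule, a structured element produces one table-type column per leaf element; the type and element names below are illustrative:

```
type Amount {
  value    : Decimal(10,2);
  currency : String(3);
};
type Position {
  descr : String(40);
  price : Amount;
};
// The table type generated for Position would contain the columns:
// descr, price.value, price.currency
```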
Tip
If you want to use the structured types inside a CDS document without generating table types in the
catalog, use the annotation @GenerateTableType : false.
Table types are only generated for direct structure definitions; in the following example, this would include:
MyStruct, MyNestedStruct, and MyDeepNestedStruct. No table types are generated for derived types
that are based on structured types; in the following example, the derived types include: MyS, MyOtherInt,
MyOtherStruct.
Example
namespace Pack1."pack-age2";
Related Information
A structured type is a data type comprising a list of attributes, each of which has its own data type. The
attributes of the structured type can be defined manually in the structured type itself and reused either by
another structured type or an entity.
Example
namespace examples;
@Schema: 'MYSCHEMA'
context StructuredTypes {
type MyOtherInt : type of MyStruct.aNumber; // => Integer
type MyOtherStruct : type of MyDeepNestedStruct.nested.nested; // => MyStruct
@GenerateTableType: false
type EmptyStruct { };
};
In a structured user-defined type, you can define original types (aNumber in the following example) or
reference existing types defined elsewhere in the same type definition or another, separate type definition, for
example, MyString80 in the following code snippet. If you define multiple types in a single CDS document,
each structure definition must be separated by a semi-colon (;).
type MyStruct
{
aNumber : Integer;
aText : String(80);
anotherText : MyString80; // defined in a separate type
};
You can define structured types that do not contain any elements, for example, using the declaration type
EmptyStruct { };. In the example below, the generated table for entity “E” contains only one column: “a”.
Tip
It is not possible to generate an SAP HANA table type for an empty structured type. This means you must
disable the generation of the table type in the Repository, for example, with the @GenerateTableType
annotation.
@GenerateTableType : false
type EmptyStruct { };
entity E {
a : Integer;
s : EmptyStruct;
};
You can define a type based on an existing type that is already defined in another user-defined structured type,
for example, by using the type of keyword, as illustrated in the following example:
Context StructuredTypes
{
type MyOtherInt : type of MyStruct.aNumber; // => Integer
type MyOtherStruct : type of MyDeepNestedStruct.nested.nested; // => MyStruct
};
Related Information
In the Data Definition Language (DDL), primitive (or core) data types are the basic building blocks that you use
to define entities or structure types with DDL.
When you are specifying a design-time table (entity) or a view definition using the CDS syntax, you use data
types such as String, Binary, or Integer to specify the type of content in the entity columns. CDS supports the
use of the following primitive data types:
The following table lists all currently supported simple DDL primitive data types. Additional information
provided in this table includes the SQL syntax required as well as the equivalent SQL and EDM names for the
listed types.
String(n): Variable-length Unicode string with a specified maximum length of n=1-1333 characters (5000 for SAP HANA specific objects). Default = maximum length. String length (n) is mandatory. Example literal: 'text with “quote”'. SQL type: NVARCHAR. EDM type: String.

Binary(n): Variable-length byte string with a user-defined length limit of up to 4000 bytes. Binary length (n) is mandatory. Example literals: x'01Cafe', X'01Cafe'. SQL type: VARBINARY. EDM type: Binary.

Integer64: Signed 64-bit integer with a value range of -2^63 to 2^63-1. Default=NULL. Example literals: 13, -1234567. SQL type: BIGINT. EDM type: Int64.

Decimal(p,s): Decimal number with fixed precision (p) in the range 1 to 34 and fixed scale (s) in the range 0 to p. Values for precision and scale are mandatory. Example literals: 12.345, -9.876. SQL type: DECIMAL(p,s). EDM type: Decimal.

BinaryFloat: Binary floating-point number (IEEE 754), 8 bytes (roughly 16 decimal digits precision); the range is roughly ±2.2207e-308 through ±1.7977e+308. Example literals: 1.2, -3.4, 5.6e+7. SQL type: DOUBLE. EDM type: Double.

LocalDate: Local date with values ranging from 0001-01-01 through 9999-12-31. Example literal: date'1234-12-31'. SQL type: DATE. EDM type: DateTimeOffset (combines date and time; with time zone, must be converted to offset).

LocalTime: Time values (with seconds precision) ranging from 00:00:00 through 24:00:00. Example literals: time'23:59:59', time'12:15'. SQL type: TIME. EDM type: Time (for a duration/period of time (==xsd:duration); use DateTimeOffset if there is a date, too).

UTCDateTime: UTC date and time (with seconds precision) with values ranging from 0001-01-01 00:00:00 through 9999-12-31 23:59:59. Example literal: timestamp'2011-12-31 23:59:59'. SQL type: SECONDDATE. EDM type: DateTimeOffset (values ending with “Z” for UTC; values before 1753-01-01T00:00:00 are not supported and are transmitted as NULL).

UTCTimestamp: UTC date and time (with a precision of 0.1 microseconds) with values ranging from 0001-01-01 00:00:00 through 9999-12-31 23:59:59.9999999, and a special initial value. Example literal: timestamp'2011-12-31 23:59:59.7654321'. SQL type: TIMESTAMP. EDM type: DateTimeOffset with Precision="7".

Boolean: Represents the concept of binary-valued logic. Example literals: true, false, unknown (null). SQL type: BOOLEAN. EDM type: Boolean.
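The simple types listed above can be combined freely in an entity definition, for example (a sketch; the entity and element names are illustrative):

```
entity PrimitiveTypesDemo {
  key id  : Integer64;
  name    : String(80);
  amount  : Decimal(10,2);
  ratio   : BinaryFloat;
  bornOn  : LocalDate;
  checkIn : LocalTime;
  changed : UTCTimestamp;
  active  : Boolean;
};
```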
The following table lists all the native SAP HANA primitive data types that CDS supports. The information
provided in this table also includes the SQL syntax required (where appropriate) as well as the equivalent SQL
and EDM names for the listed types.
Note
* In CDS, the names of SAP HANA data types are prefixed with the word “hana”, for example,
hana.ALPHANUM, hana.SMALLINT, or hana.TINYINT.
The following example shows the native SAP HANA data types that CDS supports; the code example also
illustrates the mandatory syntax.
Note
Support for the geo-spatial types ST_POINT and ST_GEOMETRY is limited: these types can only be used for
the definition of elements in types and entities. It is not possible to define a CDS view that selects an
element based on a geo-spatial type from a CDS entity.
@nokey
entity SomeTypes {
a : hana.ALPHANUM(10);
b : hana.SMALLINT;
c : hana.TINYINT;
d : hana.SMALLDECIMAL;
e : hana.REAL;
h : hana.VARCHAR(10);
i : hana.CLOB;
j : hana.BINARY(10);
k : hana.ST_POINT;
l : hana.ST_GEOMETRY;
};
Related Information
Associations define relationships between entities. You create associations in a CDS entity definition, which is a
design-time file in the SAP HANA repository.
Prerequisites
● You have created a schema for the CDS catalog objects, for example, MYSCHEMA.
● You have SELECT privileges on the schema so you can see the generated catalog objects.
Context
SAP HANA Extended Application Services (SAP HANA XS) enables you to use the CDS syntax to create
associations between entities. The associations are defined as part of the entity definition, which are design-
time files in the repository. Repository files are transportable. Activating the CDS entity creates the
corresponding catalog objects in the specified schema.
Procedure
namespace com.acme.myapp1;
@Schema : 'MYSCHEMA'
context MyEntity1 {
type StreetAddress {
name : String(80);
number : Integer;
};
type CountryAddress {
name : String(80);
code : String(3);
};
entity Address {
key id : Integer;
street : StreetAddress;
zipCode : Integer;
city : String(80);
country : CountryAddress;
type : String(10); // home, office
};
Note
If the schema you specify does not exist, you cannot activate the new CDS entity.
entity Person
{
key id : Integer;
address1 : Association to Address;
};
entity Person
{
key id : Integer;
address1 : Association to Address;
address4 : Association[0..*] to Address { zipCode };
}
Related Information
Associations are specified by adding an element to a source entity with an association type that points to a
target entity, complemented by optional information defining cardinality and which keys to use.
Note
namespace samples;
@Schema: 'MYSCHEMA' // XS classic *only*
context SimpleAssociations {
type StreetAddress {
name : String(80);
number : Integer;
};
type CountryAddress {
name : String(80);
code : String(3);
};
entity Address {
key id : Integer;
street : StreetAddress;
zipCode : Integer;
city : String(80);
country : CountryAddress;
type : String(10); // home, office
};
entity Person
{
key id : Integer;
// address1,2,3 are to-one associations
address1 : Association to Address;
address2 : Association to Address { id };
address3 : Association[1] to Address { zipCode, street, country };
// address4,5,6 are to-many associations
address4 : Association[0..*] to Address { zipCode };
address5 : Association[*] to Address { street.name };
address6 : Association[*] to Address { street.name AS streetName,
country.name AS countryName };
};
};
Cardinality in Associations
When using an association to define a relationship between entities in a CDS document, you use the
cardinality to specify the type of relation, for example, one-to-one (to-one) or one-to-many (to-n); the
relationship is with respect to both the source and the target of the association.
The target cardinality is stated in the form of [ min .. max ], where max=* denotes infinity. If no cardinality
is specified, the default cardinality setting [ 0..1 ] is assumed. It is possible to specify the maximum
cardinality of the source of the association in the form [ maxs, min .. max], too, where maxs = * denotes
infinity.
Tip
The information concerning the maximum cardinality is only used as a hint for optimizing the execution of
the resulting JOIN.
namespace samples;
@Schema: 'MYSCHEMA' // XS classic *only*
context AssociationCardinality {
entity Associations {
assoc1 : Association[0..1] to target; // to-one
assoc5 : Association[0..*] to target{id1}; // to-many
};
entity target { key id1 : Integer; key id2 : Integer; };
};
You use the to keyword in a CDS view definition to specify the target entity in an association, for example, the
name of an entity defined in a CDS document. A qualified entity name is expected that refers to an existing
entity. A target entity specification is mandatory; a default value is not assumed if no target entity is specified
in an association relationship.
The entity Address specified as the target entity of an association could be expressed in any of the ways
illustrated in the following examples:
When following an association (for example, in a view), it is now possible to apply a filter condition; the filter is
merged into the ON-condition of the resulting JOIN. The following example shows how to get a list of customers
and then filter the list according to the sales orders that are currently “open” for each customer. In the
example, the infix filter is inserted after the association orders to get only those orders that satisfy the
condition [status='open'].
Sample Code
entity Customer {
key id : Integer;
orders : Association[*] to SalesOrder on orders.cust_id = id;
name : String(80);
};
entity SalesOrder {
key id : Integer;
cust_id : Integer;
customer: Association[1] to Customer on customer.id = cust_id;
items : Association[*] to Item on items.order_id = id;
status: String(20);
date : LocalDate;
};
entity Item {
key id : Integer;
order_id : Integer;
salesOrder : Association[1] to SalesOrder on salesOrder.id = order_id;
descr : String(100);
price : Decimal(8,2);
};
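A view that applies the infix filter described above could be sketched as follows (the view name and the projected elements are illustrative):

```
view OpenOrders as select from Customer {
  name,
  orders[status='open'].date as orderDate
};
```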
Tip
For more information about filter conditions and prefixes in CDS views, see CDS Views and CDS View
Syntax Options.
For managed associations, the relationship between source and target entity is defined by specifying a set of
elements of the target entity that are used as a foreign key. If no foreign keys are specified explicitly, the
elements of the target entity’s designated primary key are used. Elements of the target entity that reside inside
substructures can be addressed via the respective path. If the chosen elements do not form a unique key of the
target entity, the association has cardinality to-many. The following examples show how to express foreign keys
in an association.
namespace samples;
using samples::SimpleAssociations.StreetAddress;
using samples::SimpleAssociations.CountryAddress;
using samples::SimpleAssociations.Address;
@Schema: 'MYSCHEMA' // XS classic *only*
context ForeignKeys {
entity Person
{
key id : Integer;
// address1,2,3 are to-one associations
address1 : Association to Address;
address2 : Association to Address { id };
address3 : Association[1] to Address { zipCode, street, country };
// address4,5,6 are to-many associations
address4 : Association[0..*] to Address { zipCode };
address5 : Association[*] to Address { street.name };
address6 : Association[*] to Address { street.name AS streetName,
country.name AS countryName };
● address1
No foreign keys are specified: the target entity's primary key (the element id) is used as foreign key.
● address2
Explicitly specifies the foreign key (the element id); this definition is similar to address1.
● address3
The foreign key elements to be used for the association are explicitly specified, namely: zipcode and the
structured elements street and country.
● address4
Uses only zipcode as the foreign key. Since zipcode is not a unique key for entity Address, this
association has cardinality “to-many”.
● address5
Uses the subelement name of the structured element street as a foreign key. This is not a unique key and,
as a result, address5 has cardinality “to-many”.
● address6
Uses the subelement name of both the structured elements street and country as foreign key fields.
The names of the foreign key fields must be unique, so an alias is required here. The foreign key is not
unique, so address6 is a “to-many” association.
You can use foreign keys of managed associations in the definition of other associations. In the following
example, the appearance of association head in the ON condition is allowed; the compiler recognizes that the
field head.id is actually part of the entity Item and, as a result, can be obtained without following the
association head.
Sample Code
entity Header {
key id : Integer;
toItems : Association[*] to Item on toItems.head.id = id;
};
entity Item {
key id : Integer;
head : Association[1] to Header { id };
...
};
CDS name resolution does not distinguish between CDS associations and CDS entities. If you define a
CDS association with a CDS entity that has the same name as the new CDS association, CDS displays an error
message and the activation of the CDS document fails.
Caution
In a CDS document, to define an association with a CDS entity of the same name, you must specify the
context where the target entity is defined, for example, MyContext.Address3.
The following code shows some examples of associations with a CDS entity that has the same (or a similar)
name. Case sensitivity ("a", "A") is important; in CDS documents, address is not the same as Address. In the
case of Address2, where the association name and the entity name are identical, the result is an error; when
resolving the element names, CDS expects an entity named Address2 but finds a CDS association with the
same name instead. MyContext.Address3 is allowed, since the target entity can be resolved due to the
absolute path to its location in the CDS document.
context MyContext {
entity Address {…}
entity Address1 {…}
entity Address2 {…}
entity Address3 {…}
entity Person
{
key id : Integer;
address : Association to Address; // OK: "address" ≠ "Address"
address1 : Association to Address1; // OK: "address1" ≠ "Address1"
Address2 : Association to Address2; // Error: association name = entity name
Address3 : Association to MyContext.Address3; // OK: full path to Address3
};
};
Example:
Complex (One-to-Many) Association
The following example shows a more complex association (to-many) between the entity “Header” and the
entity “Item”.
namespace samples;
@Schema: 'MYSCHEMA' // XS classic *only*
context ComplexAssociation {
Entity Header {
key PurchaseOrderId: BusinessKey;
Items: Association [0..*] to Item on
Items.PurchaseOrderId=PurchaseOrderId;
"History": HistoryT;
NoteId: BusinessKey null;
PartnerId: BusinessKey;
Currency: CurrencyT;
GrossAmount: AmountT;
NetAmount: AmountT;
Related Information
Example:
Managed Associations
Example:
Unmanaged Associations
Overview
Associations are specified by adding an element to a source entity with an association type that points to a
target entity, complemented by optional information defining cardinality and which keys to use.
Note
SAP HANA Extended Application Services (SAP HANA XS) enables you to use associations in the definition of a
CDS entity or a CDS view. When defining an association, bear in mind the following points:
When using an association to define a relationship between entities in a CDS view; you use the cardinality to
specify the type of relation, for example:
● one-to-one (to-one)
● one-to-many (to-n)
The relationship is with respect to both the source and the target of the association. The following code
example illustrates the syntax required to define the cardinality of an association in a CDS view:
In the simplest form, only the target cardinality is stated using the syntax [ min .. max ], where max=*
denotes infinity. Note that [] is short for [ 0..* ]. If no cardinality is specified, the default cardinality setting
[ 0..1 ] is assumed. It is possible to specify the maximum cardinality of the source of the association in the
form [ maxs, min .. max], where maxs = * denotes infinity.
namespace samples;
@Schema: 'MYSCHEMA' // XS classic *only*
context AssociationCardinality {
entity Associations {
// To-one associations
assoc1 : Association[0..1] to target;
assoc2 : Association to target;
assoc3 : Association[1] to target;
assoc4 : Association[1..1] to target; // association has one target instance
// To-many associations
assoc5 : Association[0..*] to target{id1};
assoc6 : Association[] to target{id1}; // like assoc5, [] is short for [0..*]
assoc7 : Association[2..7] to target{id1}; // any numbers are possible
assoc8 : Association[1, 0..*] to target{id1}; // user provides additional info about source cardinality
};
// Required to make the example above work
entity target {
key id1 : Integer;
key id2 : Integer;
};
};
The following table describes the various cardinality expressions illustrated in the example above:
assoc1 [0..1]: The association has no or one target instance.
assoc2 (default): Like assoc1, this association has no or one target instance and uses the default [0..1].
assoc3 [1]: Like assoc1, this association has no or one target instance; the default for min is 0.
assoc4 [1..1]: The association has exactly one target instance.
assoc5 [0..*]: The association has no, one, or multiple target instances.
assoc6 []: Like assoc5, [] is short for [0..*] (the association has no, one, or multiple target instances).
assoc7 [2..7]: Any numbers are possible for min and max.
assoc8 [1, 0..*]: The association has no, one, or multiple target instances and includes additional information about the source cardinality.
When an infix filter effectively reduces the cardinality of a “to-N” association to “to-1”, this can be expressed
explicitly in the filter, for example:
assoc[1: <cond> ]
Specifying the cardinality in the filter in this way enables you to use the association in the WHERE clause, where
“to-N” associations are not normally allowed.
Sample Code
namespace samples;
@Schema: 'MYSCHEMA' // XS classic *only*
context CardinalityByInfixFilter {
entity Person {
key id : Integer;
name : String(100);
address : Association[*] to Address on address.personId = id;
};
entity Address {
key id : Integer;
personId : Integer;
type : String(20); // home, business, vacation, ...
street : String(100);
city : String(100);
};
view V as select from Person {
name
} where address[1: type='home'].city = 'Accra';
};
Association Target
You use the to keyword in a CDS view definition to specify the target entity in an association, for example, the
name of an entity defined in a CDS document. A qualified entity name is expected that refers to an existing
entity. A target entity specification is mandatory; a default value is not assumed if no target entity is specified
in an association relationship.
Association Keys
In the relational model, associations are mapped to foreign-key relationships. For managed associations, the
relation between source and target entity is defined by specifying a set of elements of the target entity that are
used as a foreign key, as expressed in the forwardLink element of the following code example:
<forwardLink> = { <foreignKeys> }
<foreignKeys> = <targetKeyElement> [ AS <alias> ] [ , <foreignKeys> ]
<targetKeyElement> = <elementName> ( . <elementName> )*
If no foreign keys are specified explicitly, the elements of the target entity’s designated primary key are used.
Elements of the target entity that reside inside substructures can be addressed by means of the respective
path. If the chosen elements do not form a unique key of the target entity, the association has cardinality to-
many. The following examples show how to express foreign keys in an association.
entity Person
{
key id : Integer;
// address1,2,3 are to-one associations
address1 : Association to Address;
address2 : Association to Address { id };
address3 : Association[1] to Address { zipCode, street, country };
// address4,5,6 are to-many associations
address4 : Association[0..*] to Address { zipCode };
address5 : Association[*] to Address { street.name };
address6 : Association[*] to Address { street.name AS streetName,
country.name AS countryName };
};
● address1: No foreign keys are specified; the target entity's primary key (the element id) is used as the foreign key.
● address2 { id }: Explicitly specifies the foreign key (the element id); this definition is identical to address1.
● address3 { zipCode, street, country }: The foreign-key elements to be used for the association are explicitly specified, namely zipCode and the structured elements street and country.
● address4 { zipCode }: Uses only zipCode as the foreign key. Since zipCode is not a unique key for entity Address, this association has cardinality “to-many”.
● address5 { street.name }: Uses the sub-element name of the structured element street as a foreign key. This is not a unique key and, as a result, address5 has cardinality “to-many”.
● address6 { street.name AS streetName, country.name AS countryName }: Uses the sub-element name of both the structured elements street and country as foreign-key fields. The names of the foreign-key fields must be unique, so an alias is required here. The foreign key is not unique, so address6 is a “to-many” association.
You can now use foreign keys of managed associations in the definition of other associations. In the following
example, the compiler recognizes that the field toCountry.c_id is part of the foreign key of the association
toLocation and, as a result, is physically present in the entity Company.
Sample Code
namespace samples;
@Schema: 'MYSCHEMA' // XS classic *only*
context AssociationKeys {
entity Country {
key c_id : String(3);
// <...>
};
entity Region {
key r_id : Integer;
key toCountry : Association[1] to Country { c_id };
// <...>
};
entity Company {
key id : Integer;
toLocation : Association[1] to Region { r_id, toCountry.c_id };
// <...>
};
};
Unmanaged Associations
Unmanaged associations are based on existing elements of the source and target entity; no fields are
generated. In the ON condition, only elements of the source or the target entity can be used; it is not possible to
use other associations. The ON condition may contain any kind of expression - all expressions supported in
views can also be used in the ON condition of an unmanaged association.
Note
The names in the ON condition are resolved in the scope of the source entity; elements of the target entity
are accessed through the association itself.
namespace samples;
@Schema: 'MYSCHEMA' // XS classic *only*
context UnmanagedAssociations {
entity Employee {
key id : Integer;
officeId : Integer;
// <...>
};
entity Room {
key id : Integer;
inhabitants : Association[*] to Employee on inhabitants.officeId = id;
// <...>
};
entity Thing {
key id : Integer;
parentId : Integer;
parent : Association[1] to Thing on parent.id = parentId;
children : Association[*] to Thing on children.parentId = id;
// <...>
};
};
● parent
The unmanaged association parent uses a cardinality of [1] to create a relation between the element
parentId and the target element id. The target element id is accessed through the name of the
association itself.
● children
The unmanaged association children creates a relation between the element id and the target element
parentId. The target element parentId is accessed through the name of the association itself.
Constants in Associations
The usage of constants is no longer restricted to annotation assignments and default values for entity
elements. With SPS 11, you can use constants in the “ON”-condition of unmanaged associations, as illustrated
in the following example:
Sample Code
context MyContext {
const MyIntConst : Integer = 7;
const MyStringConst : String(10) = 'bright';
const MyDecConst : Decimal(4,2) = 3.14;
const MyDateTimeConst : UTCDateTime = '2015-09-30 14:33';
entity MyEntity {
key id : Integer;
a : Integer;
b : String(100);
c : Decimal(20,10);
d : UTCDateTime;
your : Association[1] to YourEntity on your.a - a < MyIntConst;
};
entity YourEntity {
key id : Integer;
a : Integer;
};
};
Related Information
A view is a virtual table based on the dynamic results returned in response to an SQL statement. SAP HANA
Extended Application Services (SAP HANA XS) enables you to use CDS syntax to create a database view as a
design-time file in the repository.
Prerequisites
● You have created a schema for the CDS catalog objects, for example, MYSCHEMA.
● You have SELECT privileges on the schema so you can see the generated catalog objects.
Context
SAP HANA Extended Application Services (SAP HANA XS) enables you to use the CDS syntax to create a
database view as a design-time file in the repository. Repository files are transportable. Activating the CDS view
definition creates the corresponding catalog object in the specified schema.
namespace com.acme.myapp1;
@Schema : 'MYSCHEMA'
context MyEntity2 {
type StreetAddress {
name : String(80);
number : Integer;
};
type CountryAddress {
name : String(80);
code : String(3);
};
@Catalog.tableType : #COLUMN
entity Address {
key id : Integer;
street : StreetAddress;
zipCode : Integer;
city : String(80);
country : CountryAddress;
type : String(10); // home, office
};
};
A view is an entity that is not persistent; it is defined as the projection of other entities. SAP HANA Extended
Application Services (SAP HANA XS) enables you to create a CDS view as a design-time file in the repository.
SAP HANA Extended Application Services (SAP HANA XS) enables you to define a view in a CDS document,
which you store as design-time file in the repository. Repository files can be read by applications that you
develop. In addition, all repository files including your view definition can be transported to other SAP HANA
systems, for example, in a delivery unit.
If your application refers to the design-time version of a view from the repository rather than the runtime
version in the catalog, for example, by using the explicit path to the repository file (with suffix), any changes to
the repository version of the file are visible as soon as they are committed to the repository. There is no need to
wait for the repository to activate a runtime version of the view.
To define a transportable view using the CDS-compliant view specifications, use something like the code
illustrated in the following example:
context Views {
VIEW AddressView AS SELECT FROM Address {
id,
street.name,
street.number
};
<...>
}
When a CDS document is activated, the activation process generates a corresponding catalog object for each
of the artifacts defined in the document; the location in the catalog is determined by the type of object
generated. For example, in SAP HANA XS classic the corresponding catalog object for a CDS view definition is
generated in the following location:
Views defined in a CDS document can make use of the following SQL features:
For more information about the syntax required when using these SQL features in a CDS view, see CDS
View Syntax Options in Related Information.
Type Definition
In a CDS view definition, you can explicitly specify the type of a select item, as illustrated in the following
example:
Sample Code
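The select items described below can be sketched in a small example; the type name MyInteger and the entity E with elements a and b are assumptions chosen for illustration:

```
type MyInteger : Integer;

entity E {
  key id : Integer;
  a : MyInteger;
  b : MyInteger;
};

view TypedView as select from E {
  a,                       // has type "MyInteger"
  a+b as s1,               // has type "Integer"; the user-defined type is lost
  a+b as s2 : MyInteger    // has type "MyInteger", explicitly specified
};
```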
● a,
Has type “MyInteger”
● a+b as s1,
Has type “Integer” and any information about the user-defined type is lost
● a+b as s2 : MyInteger
Has type “MyInteger”, which is explicitly specified
Note
If necessary, a CAST function is added to the generated view in SAP HANA; this ensures that the select
item's type in the SAP HANA view is the SAP HANA “type” corresponding to the explicitly specified CDS
type.
Related Information
SAP HANA XS includes a dedicated, CDS-compliant syntax, which you must adhere to when using a CDS
document to define a view as a design-time artifact.
Example
Note
The following example is intended for illustration purposes only and might contain syntactical errors. For
further details about the keywords illustrated, click the links provided.
context views {
const x : Integer = 4;
const y : Integer = 5;
const Z : Integer = 6;
VIEW MyView1 AS SELECT FROM Employee
{
a + b AS theSum
};
VIEW MyView2 AS SELECT FROM Employee
{ officeId.building,
officeId.floor,
officeId.roomNumber,
office.capacity,
count(id) AS seatsTaken,
count(id)/office.capacity as occupancyRate
} WHERE officeId.building = 1
GROUP BY officeId.building,
officeId.floor,
officeId.roomNumber,
office.capacity,
office.type
HAVING office.type = 'office' AND count(id)/office.capacity < 0.5;
VIEW MyView3 AS SELECT FROM Employee
{ orgUnit,
salary
} ORDER BY salary DESC;
VIEW MyView4 AS SELECT FROM Employee {
CASE
WHEN a < 10 THEN 'small'
WHEN 10 <= a AND a < 100 THEN 'medium'
ELSE 'large'
END AS size
};
VIEW MyView5 AS
SELECT FROM E1 { a, b, c}
UNION
SELECT FROM E2 { z, x, y};
VIEW MyView6 AS SELECT FROM Customer {
name,
orders[status='open'].{ id as orderId,
date as orderDate,
items[price>200].{ descr,
price } }
};
VIEW MyView7 AS
SELECT FROM E { a, b, c}
ORDER BY a LIMIT 10 OFFSET 30;
};
In a CDS view definition, you can use any of the aggregate functions listed below:
Note
When expressions are used in a view element, an alias must be specified, for example, AS theSum.
● AVG
● COUNT
● MIN
● MAX
● SUM
● STDDEV
● VAR
The following example shows how to use aggregates and expressions to collect information about headcount
and salary per organizational unit for all employees hired from 2011 to now.
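Such a view can be sketched as follows; the Employee elements orgUnit, salary, and joinDate are assumptions chosen for illustration:

```
view HeadcountAndSalary as select from Employee {
  orgUnit,
  count(id)   as headCount,
  sum(salary) as totalSalary,
  avg(salary) as averageSalary
} where joinDate >= '2011-01-01'
group by orgUnit;
```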
Note
Constants in Views
With SPS 11, you can use constants in views, as illustrated in “MyView” at the end of the following example:
Sample Code
context MyContext {
const MyIntConst : Integer = 7;
const MyStringConst : String(10) = 'bright';
const MyDecConst : Decimal(4,2) = 3.14;
const MyDateTimeConst : UTCDateTime = '2015-09-30 14:33';
entity MyEntity {
key id : Integer;
a : Integer;
b : String(100);
c : Decimal(20,10);
d : UTCDateTime;
your : Association[1] to YourEntity on your.a - a < MyIntConst;
};
entity YourEntity {
key id : Integer;
a : Integer;
};
entity HerEntity {
key id : Integer;
t : String(20);
};
};
When constants are used in a view definition, their name must be prefixed with the scope operator “:”. Usually
names that appear in a query are resolved as alias or element names. The scope operator instructs the
compiler to resolve the name outside of the query.
Sample Code
context NameResolution {
const a : Integer = 4;
const b : Integer = 5;
const c : Integer = 6;
entity E {
key id : Integer;
a : Integer;
c : Integer;
};
view V as select from E {
a as a1,
b,
:a as a2,
E.a as a3,
:E,
:E.a as a4,
:c
};
}
The following list explains how the select items used in view “V” are resolved.
● a as a1: Success. “a” is resolved in the space of alias and element names, for example, element “a” of entity “E”.
● b: Error. There is no alias and no element with name “b” in entity “E”.
● :a as a2: Success. The scope operator “:” instructs the compiler to search for element “a” outside of the query (finds the constant “a”).
● E.a as a3: Success. “E” is resolved in the space of alias and element names, so this matches element “a” of entity “E”.
● :E.a as a4: Error. No access to “E” (or any of its elements) via “:”.
SELECT
In the following example of an association in a SELECT list, a view compiles a list of all employees; the list
includes the employee's name, the capacity of the employee's office, and the color of the carpet in the office.
The association follows the to-one association office from entity Employee to entity Room to collect the
relevant information about the office.
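A sketch of such a view; the to-one association office in Employee and the Room elements capacity and carpetColor are assumptions for illustration:

```
view EmployeeOffices as select from Employee {
  name,
  office.capacity    as officeCapacity,
  office.carpetColor as officeCarpetColor
};
```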
Subqueries
You can define subqueries in a CDS view, as illustrated in the following example:
Restriction
Code Syntax
Note
In a correlated subquery, elements of outer queries must always be addressed by means of a table alias.
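A sketch of a correlated subquery in a select list, assuming entities Header and Item where Item.headerId refers to Header.id; note that the element of the outer query is addressed via the table alias h:

```
view HeaderTotals as select from Header as h {
  h.id,
  (select from Item as i { sum(amount) as total }
   where i.headerId = h.id) as orderTotal
};
```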
WHERE
The following example shows the syntax required when using the WHERE clause in a CDS view definition. In this
example, the WHERE clause is used in an association to restrict the result set according to information located
in the association's target. Further filtering of the result set can be defined with the AND modifier.
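A sketch of such a WHERE clause, assuming that Employee has a to-one association office whose target provides the elements building and floor:

```
view EmployeesInBuilding1 as select from Employee {
  name
} where office.building = 1 AND office.floor = 3;
```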
FROM
The following example shows the syntax required when using the FROM clause in a CDS view definition. This
example shows an association that lists the license plates of all company cars.
If a CDS view references a native SAP HANA table, the table and column names must be specified using their
effective SAP HANA names.
This means that if a table (foo) or its columns (bar and gloo) were created without using quotation marks
(""), the corresponding uppercase names for the table or columns must be used in the CDS document, as
illustrated in the following example.
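For example, assuming a native table foo with columns bar and gloo that were all created without quotation marks, a CDS view must reference them by their effective uppercase names:

```
view VNative as select from FOO {
  BAR,
  GLOO
};
```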
GROUP BY
The following example shows the syntax required when using the GROUP BY clause in a CDS view definition.
This example shows an association in a view that compiles a list of all offices that are less than 50% occupied.
HAVING
The following example shows the syntax required when using the HAVING clause in a CDS view definition. This
example shows a view with an association that compiles a list of all offices that are less than 50% occupied.
ORDER BY
The ORDER BY operator enables you to list results according to an expression or position, for example salary.
In the same way as with plain SQL, the ASC and DESC operators enable you to sort the list order as follows.
● ASC
Display the result set in ascending order
● DESC
Display the result set in descending order
LIMIT/OFFSET
You can use the SQL clauses LIMIT and OFFSET in a CDS query. The LIMIT <INTEGER> [OFFSET
<INTEGER>] operator enables you to restrict the number of output records to display to a specified “limit”; the
OFFSET <INTEGER> specifies the number of records to skip before displaying the records according to the
defined LIMIT.
CASE
In the same way as in plain SQL, you can use the case expression in a CDS view definition to introduce IF-
THEN-ELSE conditions without the need to use procedures.
entity MyEntity12 {
key id : Integer;
a : Integer;
color : String(1);
};
The first example of the CASE operator, CASE color, shows a “switched” CASE (one table column and
multiple values). The second example shows a “conditional” CASE with multiple arbitrary conditions,
possibly referring to different table columns.
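Both forms can be sketched over MyEntity12; the literal values used in the branches are assumptions for illustration:

```
view CaseView as select from MyEntity12 {
  id,
  // "switched" CASE: one table column, multiple values
  case color
    when 'R' then 'red'
    when 'G' then 'green'
    else 'other'
  end as colorName,
  // "conditional" CASE: arbitrary conditions, possibly on different columns
  case
    when a < 10 then 'small'
    when a >= 10 and a < 100 then 'medium'
    else 'large'
  end as sizeClass
};
```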
UNION
Enables multiple select statements to be combined but return only one result set. UNION works in the same
way as the SAP HANA SQL command of the same name; it selects all unique records from all select statements
by removing duplicates found from different select statements. The signature of the result view is equal to the
signature of the first SELECT in the union.
Note
entity E1 {
key a : Integer;
b : String(20);
c : LocalDate;
};
entity E2 {
key x : String(20);
y : LocalDate;
z : Integer;
};
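A union over E1 and E2 can be sketched as follows; the signature of the result view (a, b, c) is taken from the first SELECT:

```
view UnionView as
  select from E1 { a, b, c }
  union
  select from E2 { z, x, y };
```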
JOIN
You can include a JOIN clause in a CDS view definition; the following JOIN types are supported:
● [ INNER ] JOIN
● LEFT [ OUTER ] JOIN
● RIGHT [ OUTER ] JOIN
● FULL [ OUTER ] JOIN
● CROSS JOIN
Sample Code
entity E {
key id : Integer;
a : Integer;
};
entity F {
key id : Integer;
b : Integer;
};
entity G {
key id : Integer;
c : Integer;
};
view V_join as select from E join (F as X full outer join G on X.id = G.id)
on E.id = c {
a, b, c
};
TOP
You can use the SQL clause TOP in a CDS query, as illustrated in the following example:
Sample Code
Restriction
It is not permitted to use TOP in combination with the LIMIT clause in a CDS query.
CDS now supports the SELECT DISTINCT semantic, which enables you to specify that only one copy of each
set of duplicate records selected should be returned. The position of the DISTINCT keyword is important; it
must appear directly in front of the curly brace, as illustrated in the following example:
Sample Code
entity E {
key id : Integer;
a : Integer;
};
entity F {
key id : Integer;
b : Integer;
};
entity G {
key id : Integer;
c : Integer;
};
view V_dist as select from E distinct { a };
With Parameters
You can define parameters for use in a CDS view; this allows you to pass additional values to modify the results
of the query at run time. Parameters must be defined in the view definition before the query block, as
illustrated in the following example:
Restriction
For use in XS advanced only; views with parameters are not supported in XS classic.
Sample Code
context MyContext
{
entity MyEntity1 {
id: Integer;
elt: String(100); };
entity MyEntity2 {
id: Integer;
elt: String(100); };
type MyUserDefinedType: type of MyEntity1.elt;
view MyParamView with parameters PAR1: Integer,
PAR2: MyUserDefinedType,
PAR3: type of MyEntity1.elt
as select from MyEntity1 {
id,
elt };
};
Note
Tip
If no matching parameter can be found, the scope operator “escapes” from the query and attempts to
resolve the identifier outside the query.
Sample Code
Restriction
It is not allowed to use a query as a value expression. Nor is it allowed to provide a parameter list in the ON
condition of an association definition to a parameterized view. This is because the association definition
establishes the relationship between the two entities but makes no assumptions about the run-time
conditions. For the same reason, it is not allowed to specify filter conditions in those ON conditions.
The following example shows two entities SourceEntity and TargetEntity and a parameterized view
TargetWindowView, which selects from TargetEntity. An association is established between
SourceEntity and TargetEntity.
Sample Code
entity SourceEntity {
id: Integer;
someElementOfSourceEntity: String(100);
toTargetViaParamView: association to TargetWindowView on
toTargetViaParamView.targetId = id;
};
entity TargetEntity {
targetId: Integer;
someElementOfTargetEntity: String(100);
};
It is now possible to query SourceEntity in a view; it is also possible to follow the association to
TargetWindowView, for example, by providing the required parameters, as illustrated in the following
example:
Sample Code
It is also possible to follow the association in the FROM clause; this provides access only to the elements of the
target artifact:
Sample Code
You can select directly from the view with parameters, adding a free JOIN expression, as illustrated in the
following example:
Sample Code
To improve readability and comprehension, it is recommended to include only one annotation assignment
per line.
In the following example, the view TargetWindowView selects from the entity TargetEntity; the annotation
@positiveValuesOnly is not checked; and the targetId is required for the ON condition in the entity
SourceEntity.
Sample Code
You can define an association as a view element, for example, by defining an ad-hoc association in the mixin
clause and then adding the association to the SELECT list, as illustrated in the following example:
Restriction
XS classic does not support the use of ad-hoc associations in a view's SELECT list.
Sample Code
entity E {
a : Integer;
b : Integer;
};
entity F {
x : Integer;
y : Integer;
};
view VE as select from E mixin {
f : Association[1] to VF on f.vy = $projection.vb;
} into {
a as va,
b as vb,
f as vf
};
view VF as select from F {
x as vx,
y as vy
};
In the ON condition of this type of association in a view, it is necessary to use the pseudo-identifier
$projection to specify that the following element name must be resolved in the select list of the view
(“VE”) rather than in the entity (“E”) in the FROM clause.
Filter Conditions
It is possible to apply a filter condition when resolving associations between entities; the filter is merged into
the ON-condition of the resulting JOIN. The following example shows how to get a list of customers and then
filter the list according to the sales orders that are currently “open” for each customer. In the example, the filter
is inserted after the association orders; this ensures that the list displayed by the view only contains those
orders that satisfy the condition [status='open'].
Sample Code
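The described filter can be sketched as follows, assuming a Customer entity with a to-many association orders whose target provides the elements id, date, and status:

```
view CustomersWithOpenOrders as select from Customer {
  name,
  orders[status = 'open'].id   as orderId,
  orders[status = 'open'].date as orderDate
};
```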
The following example shows how to use the prefix notation to ensure that the compiler understands that there
is only one association (orders) to resolve but with multiple elements (id and date):
Sample Code
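With the prefix notation, the filtered association orders is written once and resolved once; the Customer entity and the target elements id, date, and status are assumptions for illustration:

```
view CustomersWithOpenOrders as select from Customer {
  name,
  orders[status = 'open'].{ id   as orderId,
                            date as orderDate }
};
```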
Tip
The following example shows how to use the associations orders and items in a view that displays a list of
customers with open sales orders for items with a price greater than 200.
Sample Code
Prefix Notation
The prefix notation can also be used without filters. The following example shows how to get a list of all
customers with details of their sales orders. In this example, all uses of the association orders are combined
Sample Code
The example above can be expressed more elegantly by combining the associations orders and items using
the following prefix notation:
Sample Code
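Combining the associations orders and items with the prefix notation can be sketched as follows; the element names are assumptions for illustration:

```
view CustomerOrderDetails as select from Customer {
  name,
  orders.{ id,
           date,
           items.{ descr,
                   price } }
};
```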
Type Definition
In a CDS view definition, you can explicitly specify the type of a select item, as illustrated in the following
example:
Restriction
For use in XS advanced only; assigning an explicit CDS type to an item in a SELECT list is not supported in
XS classic.
Sample Code
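The select items described below can be sketched in a small example; the type name MyInteger and the entity E with elements a and b are assumptions chosen for illustration:

```
type MyInteger : Integer;

entity E {
  key id : Integer;
  a : MyInteger;
  b : MyInteger;
};

view TypedView as select from E {
  a,                       // has type "MyInteger"
  a+b as s1,               // has type "Integer"; the user-defined type is lost
  a+b as s2 : MyInteger    // has type "MyInteger", explicitly specified
};
```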
● a,
Has type “MyInteger”
● a+b as s1,
Has type “Integer” and any information about the user-defined type is lost
● a+b as s2 : MyInteger
Has type “MyInteger”, which is explicitly specified
Note
If necessary, a CAST function is added to the generated view in SAP HANA; this ensures that the select
item's type in the SAP HANA view is the SAP HANA “type” corresponding to the explicitly specified CDS
type.
Spatial Functions
The following view (SpatialView1) displays a list of all persons selected from the entity Person and uses the
spatial function ST_Distance (*) to include information such as the distance between each person's home
and business address (distanceHomeToWork), and the distance between their home address and the
building SAP03 (distFromSAP03). The value for both distances is measured in kilometers, rounded to one
decimal place.
Sample Code
Caution
(*) For information about the capabilities available for your license and installation scenario, refer to the
Feature Scope Description for SAP HANA.
Related Information
CDS supports the use of Geographic Information Systems (GIS) functions and element types in CDS-
compliant entities and views.
Spatial data is data that describes the position, shape, and orientation of objects in a defined space; the data is
represented as two-dimensional geometries in the form of points, line strings, and polygons. The following
example shows how to use the spatial function ST_Distance in a CDS view. The underlying spatial data used
in the view is defined in a CDS entity using the type ST_POINT.
In the following example, the CDS entity Address is used to store geo-spatial coordinates in the element loc of type
ST_POINT:
Sample Code
namespace samples;
@Schema: 'MYSCHEMA'
context Spatial {
entity Person {
key id : Integer;
name : String(100);
homeAddress : Association[1] to Address;
officeAddress : Association[1] to Address;
};
entity Address {
key id : Integer;
street_number : Integer;
street_name : String(100);
zip : String(10);
city : String(100);
loc : hana.ST_POINT(4326);
};
view GeoView1 as select from Person {
name,
homeAddress.street_name || ', ' || homeAddress.city as home,
officeAddress.street_name || ', ' || officeAddress.city as office,
round( homeAddress.loc.ST_Distance(officeAddress.loc, 'meter')/1000,
1) as distanceHomeToWork,
round( homeAddress.loc.ST_Distance(NEW ST_POINT(8.644072, 49.292910),
'meter')/1000, 1) as distFromSAP03
};
};
The view GeoView1 is used to display a list of all persons using the spatial function ST_Distance to include
information such as the distance between each person's home and business address
(distanceHomeToWork), and the distance between their home address and the building SAP03
(distFromSAP03). The value for both distances is measured in kilometers.
Caution
(*) For information about the capabilities available for your license and installation scenario, refer to the
Feature Scope Description for SAP HANA.
Changes to the definition of a CDS artifact result in changes to the corresponding catalog object. The resultant
changes to the catalog object are made according to strict rules.
Reactivating a CDS document which contains changes to the original artifacts results in changes to the
corresponding objects in the catalog. Before making changes to the design-time definition of a CDS artifact, it is
very important to understand what the consequences of the planned changes will be in the generated catalog
objects.
If a CDS design-time artifact (for example, a table or a view) defined in an old version of a CDS document is no
longer present in the new version, the corresponding runtime object is dropped from the catalog.
Note
Renaming a CDS artifact results in the deletion of the artifact with the old name (with all the corresponding
consequences) and the creation of a new CDS artifact with the new name.
If a CDS design-time artifact is present in both the old and the new version of a CDS document, a check is
performed to establish what, if any, changes have occurred. This applies to changes made either directly to a
CDS artifact or indirectly, for example, as a result of a change to a dependent artifact. If changes have been
made to the CDS document, changes are implemented in the corresponding catalog objects according to the
following rules:
● Views
Note
Adding the “key” modifier to an element will also make the column in the corresponding table not
nullable. If a column in the corresponding database table contains null values and there is no default
value defined for the element, the activation of the modified CDS document will fail.
For changes to individual elements of a CDS entity, for example, column definitions, the same logic applies as
for complete artifacts in a CDS document.
● Since the elements of a CDS entity are identified by their name, changing the order of the elements in the
entity definition will have no effect; the order of the columns in the generated catalog table object remains
unchanged.
● Renaming an element in a CDS entity definition is not recognized; the rename operation results in the
deletion of the renamed element and the creation of a new one.
● If a new element is added to a CDS entity definition, the order of the columns in the table generated in the
catalog after the change cannot be guaranteed.
If an existing CDS entity definition is changed, the order of the columns in the generated database tables
may be different from the order of the corresponding elements in the CDS entity definition.
In the following example of a simple CDS document, the context OuterCtx contains a CDS entity Entity1 and
the nested context InnerCtx, which contains the CDS entity definition Entity2.
namespace pack;
@Schema: 'MYSCHEMA'
context OuterCtx
{
entity Entity1
{
key a : Integer;
b : String(20);
};
context InnerCtx
{
entity Entity2
{
key x : Integer;
y : String(10);
z : LocalDate;
};
};
};
To understand the effect of the changes made to this simple CDS document in the following example, it is
necessary to see the changes not only from the perspective of the developer who makes them but also from
the perspective of the compiler, which must interpret them.
From the developer's perspective, the CDS entity Entity1 has been moved from context OuterCtx to
InnerCtx. From the compiler's perspective, however, the entity pack::OuterCtx.Entity1 has disappeared
and, as a result, will be deleted (and the corresponding generated table with all its content dropped), and a new
entity named pack::OuterCtx.InnerCtx.Entity1 has been defined.
namespace pack;
@Schema: 'MYSCHEMA'
context OuterCtx
{
context InnerCtx
{
entity Entity1
{
key a : Integer;
b : String(20);
};
entity Entity2
{
key x : Integer;
q : String(10);
z : LocalDate;
};
};
};
Similarly, renaming the element y : String(10); to q : String(10); in Entity2 results in the deletion of column y
and the creation of a new column q in the generated catalog object. As a consequence, the content of column y
is lost.
CDS does not support modifications to catalog objects generated from CDS documents. You must never
modify an SAP HANA catalog object (in particular a table) that has been generated from a CDS document. The
next time you activate the CDS document that contains the original CDS object definition and the
corresponding catalog objects are generated, all modifications made to the catalog object are lost or activation
might even fail due to inconsistencies.
If the definition of a CDS entity has already been transported to another system, do not enforce activation of
any illegal changes to this entity, for example, by means of an intermediate deletion.
Restrictions apply to changes that can be made to a CDS entity if the entity has been activated and a
corresponding catalog object exists. If changes to a CDS entity on the source system produce an error during
activation of the CDS document, for example, because you changed an element type in a CDS entity from
Binary to LocalDate, you could theoretically delete the original CDS entity and then create a new CDS entity
with the same name as the original entity but with the changed data type. However, if this change is
transported to another system, where the old version of the entity already exists, the import will fail, because
the information that the entity has been deleted and recreated is not available either on the target system or in
the delivery unit.
Related Information
The table-import function is a data-provisioning tool that enables you to import data from comma-separated
values (CSV) files into SAP HANA tables.
Context
In this tutorial, you import data from a CSV file into a table generated from a design-time definition that uses
the .hdbdd syntax, which complies with the Core Data Services (CDS) specifications. Note that the names
used are just examples; where necessary, replace the names of the schema, tables, files, and so on with your
own names.
Note
Naming conventions exist for package names, for example, a package name must not start with
either a dot (.) or a hyphen (-) and cannot contain two or more consecutive dots (..). In addition,
the name must not exceed 190 characters.
3. If it does not already exist, create a schema named AMT directly in the catalog; the AMT schema is where
the target table for the table-import operation resides.
4. Create a set of table-import files.
For the purposes of this tutorial, you create all files in the same package, for example, a package called
TiTest. Note, however, that the table-import feature also allows you to use files distributed in different
packages.
The following files are required for the table import scenario:
○ The table-import configuration file, for example, inhabitants.hdbti
Specifies the source file containing the data values to import and the target table in SAP HANA into
which the data will be inserted.
○ A CSV file, for example, inhabitants.csv
Contains the data to be imported into the SAP HANA table during the table-import operation; values in
the .csv file can be separated either by a comma (,) or a semi-colon (;).
○ A target table, for example, inhabitants.hdbdd
The target table can be either a runtime table in the catalog or a table definition, for example, a table
defined using the .hdbtable syntax (TiTable.hdbtable) or the CDS-compliant .hdbdd syntax
(TiTable.hdbdd).
Note
In this tutorial, the target table for the table-import operation is inhabitants.hdbdd, a design-
time table defined using the CDS-compliant .hdbdd syntax.
When all the necessary files are available, you can import data from the source CSV file into the desired
target table.
5. Create or open the table-definition file for the target import table (inhabitants.hdbdd) and enter the
following lines of text; this example uses the .hdbdd syntax:
namespace mycompany.tests.TiTest;
@Schema : 'AMT'
@Catalog.tableType : #COLUMN
entity inhabitants {
  key ID : Integer;
  surname : String(30);
  name : String(30);
  city : String(30);
};
Note
In the CDS-compliant .hdbdd syntax, the namespace keyword denotes the path to the package
containing the table-definition file.
6. Open the CSV file containing the data to import, for example, inhabitants.csv, in a text editor and enter
the values shown in the following example:
0,Annan,Kofi,Accra
1,Essuman,Wiredu,Tema
2,Tetteh,Kwame,Kumasi
3,Nterful,Akye,Tarkwa
4,Acheampong,Kojo,Tamale
5,Assamoah,Adjoa,Takoradi
6,Mensah,Afua,Cape Coast
Note
You can import data from multiple .csv files in a single, table-import operation. However, each .csv
file must be specified in a separate code block ({table= ...}) in the table-import configuration file.
7. Create or open the table-import configuration file (inhabitants.hdbti) and enter the following lines of
text (make sure the paths point to the correct locations in your environment):
import = [
{
table = "mycompany.tests.TiTest::inhabitants";
schema = "AMT";
file = "mycompany.tests.TiTest:inhabitants.csv";
header = false;
}
];
Expand the AMT Tables node, select the table mycompany.tests.TiTest::inhabitants, and
from the context menu choose Open Content.
In SAP HANA XS, you create a table-import scenario by setting up a table-import configuration file and one or
more comma-separated value (CSV) files containing the content you want to import into the specified SAP
HANA table.
To use the SAP HANA XS table-import feature to import data into an SAP HANA table, you need to understand
the following table-import concepts:
● Table-import configuration
You define the table-import model in a configuration file that specifies the data fields to import and the
target tables for each data field.
Note
The table-import file must have the .hdbti extension, for example, myTableImport.hdbti.
The following constraints apply to the CSV file used as a source for the table-import feature in SAP HANA XS:
● The number of table columns must match the number of CSV columns.
● There must not be any incompatibilities between the data types of the table columns and the data types of
the CSV columns.
● Overlapping data in data files is not supported.
● The target table of the import must not be modified (or appended to) outside of the data-import operation.
If the table is used for storage of application data, this data may be lost during any operation to re-import
or update the data.
Related Information
You can define the elements of a table-import operation in a design-time file; the configuration includes
information about source data and the target table in SAP HANA.
SAP HANA Extended Application Services (SAP HANA XS) enables you to perform data-provisioning
operations that you define in a design-time configuration file. The configuration file is transportable, which
means you can transfer the data-provisioning between SAP HANA systems quickly and easily.
The table-import configuration enables you to specify how data from a comma-separated-value (.csv) file is
imported into a target table in SAP HANA. The configuration specifies the source file containing the data values
to import and the target table in SAP HANA into which the data must be inserted. As further options, you can
specify, for example, whether the source file includes a header line, which field delimiter it uses, and the key
range to import.
Note
If you use multiple table import configurations to import data into a single target table, the keys keyword is
mandatory. This is to avoid problems relating to the overwriting or accidental deletion of existing data.
The following example of a table-import configuration shows how to define a simple import operation which
inserts data from the source files myData.csv and myData2.csv into the table myTable in the schema
mySchema.
import = [
{
table = "myTable";
schema = "mySchema";
file = "sap.ti2.demo:myData.csv";
header = false;
delimField = ";";
keys = [ "GROUP_TYPE" : "BW_CUBE"];
},
{
table = "sap.ti2.demo::myTable";
file = "sap.ti2.demo:myData2.csv";
header = false;
delimField = ";";
keys = [ "GROUP_TYPE" : "BW_CUBE"];
}
];
In the table import configuration, you can specify the target table using either of the following methods:
Note
Both the schema and the target table specified in the table-import operation must already exist. If either
the specified table or the schema does not exist, SAP HANA XS displays an error message during the
activation of the configuration file, for example: Table import target table cannot be found. or
Schema could not be resolved.
You can also use one table-import configuration file to import data from multiple .csv source files. However,
you must specify each import operation in a new code block introduced by the [hdb | cds]table keyword, as
illustrated in the example above.
By default, the table-import operation assumes that data values in the .csv source file are separated by a
comma (,). However, the table-import operation can also interpret files containing data values separated by a
semi-colon (;).
,,,BW_CUBE,,40000000,2,40000000,all
;;;BW_CUBE;;40000000;3;40000000;all
Note
If the activated .hdbti configuration used to import data is subsequently deleted, only the data that was
imported by the deleted .hdbti configuration is dropped from the target table. All other data including any
data imported by other .hdbti configurations remains in the table. If the target CDS entity has no key
(annotated with @nokey) all data that is not part of the CSV file is dropped from the table during each
table-import activation.
You can use the optional keyword keys to specify the key range taken from the source .csv file for import into
the target table. If keys are specified for an import in a table-import configuration, multiple imports into the
same target table are checked for potential data collisions.
Note
The configuration-file syntax does not support wildcards in the key definition; the full value of a selectable
column has to be specified.
Security Considerations
In SAP HANA XS, design-time artifacts such as tables (.hdbtable or .hdbdd) and table-import
configurations (.hdbti) are not normally exposed to clients via HTTP. However, design-time artifacts
containing comma-separated values (.csv) could be considered as potential artifacts to expose to users
through HTTP. For this reason, it is essential to protect these exposed .csv artifacts by setting the appropriate
application privileges; the application privileges prevent data leakage, for example, by denying access to
users who are not normally allowed to see all the records in such tables.
Tip
Place all the .csv files used to import content into tables together in a single package and set the
appropriate (restrictive) application-access permissions for that package, for example, with a
dedicated .xsaccess file.
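For example, a minimal restrictive .xsaccess file for such a package might simply switch off HTTP exposure entirely. This is a sketch only; consult the .xsaccess keyword reference for the full option set:

```json
{
    "exposed": false
}
```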
Related Information
The design-time configuration file used to define a table-import operation requires the use of a specific syntax.
The syntax comprises a series of keyword=value pairs.
If you use the table-import configuration syntax to define the details of the table-import operation, you can use
the keywords illustrated in the following code example. The resulting design-time file must have the .hdbti file
extension, for example, myTableImportCfg.hdbti.
import = [
{
table = "myTable";
schema = "mySchema";
file = "sap.ti2.demo:myData.csv";
header = false;
useHeaderNames = false;
delimField = ";";
delimEnclosing = "\"";
distinguishEmptyFromNull = true;
keys = [ "GROUP_TYPE" : "BW_CUBE", "GROUP_TYPE" : "BW_DSO", "GROUP_TYPE" :
"BW_PSA"];
}
];
table
In the table-import configuration, the table, cdstable, and hdbtable keywords enable you to specify the
name of the target table into which the table-import operation must insert data. The target table you specify in
the table-import configuration can be a runtime table in the catalog or a design-time table definition, for
example, a table defined using either the .hdbtable or the .hdbdd (Core Data Services) syntax.
Note
The target table specified in the table-import configuration must already exist. If the specified table does
not exist, SAP HANA XS displays an error message during the activation of the configuration file, for
example: Table import target table cannot be found.
Use the table keyword in the table-import configuration to specify the name of the target table using the
qualified name for a catalog table.
table = "target_table";
schema = "mySchema";
Note
You must also specify the name of the schema in which the target catalog table resides, for example, using
the schema keyword.
The hdbtable keyword in the table-import configuration enables you to specify the name of a target table using
the public synonym for a design-time table defined with the .hdbtable syntax.
hdbtable = "sap.ti2.demo::target_table";
cdstable = "sap.ti2.demo::target_table";
Caution
There is no explicit check if the addressed table is created using the .hdbtable or CDS-compliant .hdbdd
syntax.
If the table specified with the cdstable or hdbtable keyword is not defined with the corresponding syntax,
SAP HANA displays an error when you try to activate the artifact, for example: Invalid combination of
table declarations found, you may only use [cdstable | hdbtable | table].
schema
The following code example shows the syntax required to specify a schema in a table-import configuration.
schema = "TI2_TESTS";
Note
The schema specified in the table-import configuration file must already exist.
If the schema specified in a table-import configuration file does not exist, SAP HANA XS displays an error
message during the activation of the configuration file, for example:
The schema is only required if you use a table's schema-qualified catalog name to reference the target table for
an import operation, for example, table = "myTable"; schema = "mySchema";. The schema is not
required if you use a public synonym to reference a table in a table-import configuration, for example,
hdbtable = "sap.ti2.demo::target_table";.
file
Use the file keyword in the table-import configuration to specify the source file containing the data that the
table-import operation imports into the target table. The source file must be a .csv file with the data values
separated either by a comma (,) or a semi-colon (;). The file definition must also include the full package path
in the SAP HANA repository.
file = "sap.ti2.demo:myData.csv";
header
Use the header keyword in the table-import configuration to indicate if the data contained in the
specified .csv file includes a header line. The header keyword is optional, and the possible values are true or
false.
header = false;
useHeaderNames
Use the useHeaderNames keyword in the table-import configuration to indicate if the data contained in the
first line of the specified .csv file must be interpreted. The useHeaderNames keyword is optional; it is used in
combination with the header keyword. The useHeaderNames keyword is boolean: possible values are true or
false.
useHeaderNames = false;
Note
The table-import process considers the order of the columns; if the column order specified in the .csv file
does not match the order used for the columns in the target table, an error occurs on activation.
delimField
Use the delimField keyword in the table-import configuration to specify which character is used to separate
the values in the data to be imported. Currently, the table-import operation supports either the comma (,) or
the semi-colon (;). The following example shows how to specify that values in the .csv source file are
separated by a semi-colon (;).
delimField = ";";
Note
By default, the table-import operation assumes that data values in the .csv source file are separated by a
comma (,). If no delimiter field is specified in the .hdbti table-import configuration file, the default setting
is assumed.
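As an aside, the effect of delimField can be sketched with Python's standard csv module, whose delimiter parameter plays the same role. This is illustrative only and is not part of the table-import feature itself:

```python
import csv
import io

def parse_rows(raw: str, delim_field: str = ",") -> list:
    """Parse CSV text the way the delimField setting selects the
    separator: comma is the default, semi-colon is the alternative."""
    return list(csv.reader(io.StringIO(raw), delimiter=delim_field))

# Comma-separated source (the default behavior).
comma_rows = parse_rows("0,Annan,Kofi,Accra\n1,Essuman,Wiredu,Tema\n")

# Semi-colon-separated source (equivalent to delimField = ";").
semi_rows = parse_rows("0;Annan;Kofi;Accra\n", delim_field=";")
```

Both calls yield the same fields for the first record: ["0", "Annan", "Kofi", "Accra"].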
delimEnclosing
Use the delimEnclosing keyword in the table-import configuration to specify a single character that
indicates both the start and end of a set of characters to be interpreted as a single value in the .csv file, for
Tip
If the value used to separate the data fields in your .csv file (for example, the comma (,)) is also used
inside the data values themselves ("This, is, a, value"), you must declare and use a delimiter
enclosing character and use it to enclose all data values to be imported.
The following example shows how to use the delimEnclosing keyword to specify the quote (") as the
delimiting character that indicates both the start and the end of a value in the .csv file. Everything enclosed
between the delimEnclosing characters (in this example, “”) is interpreted by the import process as one,
single value.
delimEnclosing = "\"";
Note
Since the hdbti syntax requires us to use quotes ("") to specify the delimiting character, and the
delimiting character in this example is, itself, also a quote ("), we need to use the backslash character (\) to
escape the second quote (").
In the following example of values in a .csv file, we assume that delimEnclosing = "\"" and
delimField = ",". This means that imported values in the .csv file are enclosed in the quote character
("value") and multiple values are separated by the comma ("value1","value 2"). Any commas inside the
quotes are interpreted as a comma and not as a field delimiter.
You can use other characters as the enclosing delimiter, too, for example, the hash (#). In the following
example, we assume that delimEnclosing="#" and delimField=";". Any semi-colons included inside the
hash characters are interpreted as a semi-colon and not as a field delimiter.
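The behavior of delimEnclosing can be sketched with Python's csv module, whose quotechar parameter plays the same role. This is illustrative only, not part of the table-import feature itself:

```python
import csv
import io

# delimEnclosing = "\""; delimField = ","  ->  quotechar='"', delimiter=','
# The commas inside the quoted value are data, not field separators.
row = next(csv.reader(io.StringIO('"This, is, a, value","second"'),
                      delimiter=",", quotechar='"'))

# delimEnclosing = "#"; delimField = ";"  ->  quotechar='#', delimiter=';'
# The semi-colon inside the hash characters is data, not a field separator.
row2 = next(csv.reader(io.StringIO('#a;b#;#c#'),
                       delimiter=";", quotechar="#"))
```

The first record parses to ["This, is, a, value", "second"] and the second to ["a;b", "c"].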
distinguishEmptyFromNull
Use the distinguishEmptyFromNull keyword in combination with delimEnclosing to ensure that the
table-import process correctly interprets any empty value in the .CSV file, which is enclosed with the value
defined in the delimEnclosing keyword, for example, as an empty space. This ensures that an empty space
is imported "as is" into the target table. If the empty space is incorrectly interpreted, it is imported as NULL.
distinguishEmptyFromNull = true;
Note
Given the following example line of a .csv file:
"Value1",,"",Value2
With distinguishEmptyFromNull = true, the table-import process adds the unquoted empty field to the
target table as NULL and the quoted empty field ("") as an empty string; Value1 and Value2 are added as
normal values.
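The distinction can be sketched with a small hand-rolled parser. This is illustrative only; split_with_null is a hypothetical helper, and it does not handle escaped enclosing characters:

```python
def split_with_null(line: str, delim: str = ",", enclosing: str = '"'):
    """Illustrative re-implementation of distinguishEmptyFromNull = true:
    an unquoted empty field becomes None (NULL), while a field enclosed
    in the delimEnclosing character ("") stays an empty string."""
    fields, current, quoted, in_quotes = [], "", False, False
    for ch in line:
        if ch == enclosing:
            in_quotes = not in_quotes
            quoted = True  # remember this field was enclosed
        elif ch == delim and not in_quotes:
            fields.append(current if (current or quoted) else None)
            current, quoted = "", False
        else:
            current += ch
    fields.append(current if (current or quoted) else None)
    return fields
```

For example, split_with_null('"Value1",,"",Value2') yields ["Value1", None, "", "Value2"].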
keys
Use the keys keyword in the table-import configuration to specify the key range to be considered when
importing the data from the .csv source file into the target table.
In the example above, all the lines in the .csv source file where the GROUP_TYPE column value matches one of
the given values (BW_CUBE, BW_DSO, or BW_PSA) are imported into the target table specified in the table-import
configuration.
;;;BW_CUBE;;40000000;3;40000000;slave
;;;BW_DSO;;40000000;3;40000000;slave
;;;BW_PSA;;2000000000;1;2000000000;slave
All the lines in the .csv source file where the GROUP_TYPE column is empty are imported into the target table
specified in the table-import configuration.
;;;;;40000000;2;40000000;all
During the activation of the table-import configuration and the table-import operation itself, SAP
HANA checks for errors and displays the following information in a brief message.
40201 "If you import into a catalog table, please provide schema": You specified a target table with the
table keyword but did not specify a schema with the schema keyword.
40202 "Schema could not be resolved": The schema specified with the schema keyword does not exist or
could not be found (wrong name).
40203 "Schema resolution error": The schema specified with the schema keyword does not exist or could
not be found (wrong name).
40204 "Table import target table cannot be found": The table specified with the table keyword does not
exist or could not be found (wrong name or wrong schema name).
40210 "Table import syntax error": The table-import configuration file (.hdbti) contains one or more
syntax errors.
40211 "Table import constraint checks failed": The same key is specified in multiple table-import
configurations (.hdbti files), which leads to overlaps in the range of data to import.
40212 "Importing data into table failed": Either duplicate keys were written (due to duplicates in
the .CSV source file) or ...
40213 "CSV table column count mismatch": Either the number of columns in the .CSV record is higher than
the number of columns in the table, or vice versa.
40214 "Column type mismatch": The .CSV file does not match the target table, for example, because the
data type of a .CSV column is incompatible with the data type of the corresponding table column.
40216 "Key does not match to table header": For some key columns of the table, no data is provided.
Migrate a design-time representation of a table from the .hdbtable syntax to the CDS-compliant .hdbdd
syntax while retaining the underlying catalog table.
Prerequisites
● You have created a schema for the CDS catalog objects, for example, MYSCHEMA.
● You have SELECT privileges on the schema so you can see the generated catalog objects.
● You have a design-time definition of the hdbtable entity you want to migrate to CDS.
In this procedure you replace a design-time representation of a database table that was defined using the
hdbtable syntax with a CDS document that describes the same table (entity) with the CDS-compliant hdbdd
syntax. To migrate an hdbtable artifact to CDS, you delete the inactive version of the hdbtable object and
create a new hdbdd artifact with the same name and structure.
You need to define the target CDS entity manually. The name of the entity and the names of the elements can
be reused from the hdbtable definition. The same applies to the element modifiers, for example, NULL/NOT
NULL, and the default values.
Note
In CDS, there is no way to reproduce the column-comments defined in an hdbtable artifact. You can use
source code comments, for example, '/* */' or '//', however, the comments do not appear in the
catalog table after activation of the new CDS artifact.
Procedure
1. Use CDS syntax to create a duplicate of the table you originally defined using the hdbtable syntax.
Note
The new CDS document must have the same name as the original hdbtable artifact, for example,
Employee.hdbdd and Employee.hdbtable.
The following code shows a simple table Employee.hdbtable that is defined using the hdbtable syntax.
This is the “source” table for the migration. When you have recreated this table in CDS using the .hdbdd
syntax, you can delete the artifact Employee.hdbtable.
table.schemaName = "MYSCHEMA";
table.tableType = COLUMNSTORE;
table.columns = [
{name = "firstname"; sqlType = NVARCHAR; nullable = false; length = 20;},
{name = "lastname"; sqlType = NVARCHAR; nullable = true; length = 20;
defaultValue = "doe";},
{name = "age"; sqlType = INTEGER; nullable = false;},
{name = "salary"; sqlType = DECIMAL; nullable = false; precision = 7;
scale = 2;}
];
The following code shows the same simple table recreated with the CDS-compliant hdbdd syntax. The new
design-time artifact is called Employee.hdbdd and is the “target” for the migration operation. Note that
all column names remain the same.
namespace sap.cds.tutorial;
@Schema:'MYSCHEMA'
@Catalog.tableType:#COLUMN
@nokey
entity Employee {
  firstname : String(20) not null;
  lastname : String(20) default 'doe';
  age : Integer not null;
  salary : Decimal(7, 2) not null;
};
2. Activate the source (hdbtable) and target (CDS) artifacts of the migration operation.
To replace the old hdbtable artifact with the new hdbdd (CDS) artifact, you must activate both artifacts
(the deleted hdbtable artifact and the new CDS document) together in a single activation operation.
Tip
In the SAP HANA Web-based Workbench, the default setting is activate on save, however you can
change this behavior to save without activating.
It is possible to migrate your SAP HANA hdbtable definition to a Core Data Services (CDS) entity that has
equally named but differently typed elements. When recreating the new CDS document, you cannot choose an
arbitrary data type; you must follow the guidelines for valid data-type mappings in the SAP HANA SQL
data-type conversion documentation. Since the SAP HANA SQL documentation does not cover CDS data
types, you must map the target type names to CDS types manually.
Note
Remember that most of the data-type conversions depend on the data that is present in the catalog table
on the target system.
If you are planning to migrate SAP HANA (hdbtable) tables to CDS entities, bear in mind the following
important points:
Related Information
Mapping table for SAP HANA (hdbtable) and Core Data Services (CDS) types.
Although CDS defines its own system of data types, the list of types is roughly equivalent to the data types
available in SAP HANA (hdbtable); the difference between CDS data types and SAP HANA data types is
mostly in the type names. The following table lists the SAP HANA (hdbtable) data types and indicates what
the equivalent type is in CDS.
NVARCHAR String
SHORTTEXT String
NCLOB LargeString
TEXT LargeString
VARBINARY Binary
BLOB LargeBinary
INTEGER Integer
INT Integer
BIGINT Integer64
DECIMAL(p,s) Decimal(p,s)
DECIMAL DecimalFloat
DOUBLE BinaryFloat
DAYDATE LocalDate
DATE LocalDate
SECONDTIME LocalTime
TIME LocalTime
SECONDDATE UTCDateTime
LONGDATE UTCTimestamp
TIMESTAMP UTCTimestamp
ALPHANUM hana.ALPHANUM
SMALLINT hana.SMALLINT
TINYINT hana.TINYINT
SMALLDECIMAL hana.SMALLDECIMAL
REAL hana.REAL
VARCHAR hana.VARCHAR
CLOB hana.CLOB
BINARY hana.BINARY
ST_POINT hana.ST_POINT
ST_GEOMETRY hana.ST_GEOMETRY
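For reference, the mapping above can be transcribed into a simple lookup table. This is illustrative only, not an SAP API; the DECIMAL(p,s) entry is symbolic, since precision and scale are carried over as given:

```python
# Transcription of the hdbtable-to-CDS type-mapping table above.
HDBTABLE_TO_CDS = {
    "NVARCHAR": "String", "SHORTTEXT": "String",
    "NCLOB": "LargeString", "TEXT": "LargeString",
    "VARBINARY": "Binary", "BLOB": "LargeBinary",
    "INTEGER": "Integer", "INT": "Integer", "BIGINT": "Integer64",
    "DECIMAL(p,s)": "Decimal(p,s)", "DECIMAL": "DecimalFloat",
    "DOUBLE": "BinaryFloat",
    "DAYDATE": "LocalDate", "DATE": "LocalDate",
    "SECONDTIME": "LocalTime", "TIME": "LocalTime",
    "SECONDDATE": "UTCDateTime",
    "LONGDATE": "UTCTimestamp", "TIMESTAMP": "UTCTimestamp",
    "ALPHANUM": "hana.ALPHANUM", "SMALLINT": "hana.SMALLINT",
    "TINYINT": "hana.TINYINT", "SMALLDECIMAL": "hana.SMALLDECIMAL",
    "REAL": "hana.REAL", "VARCHAR": "hana.VARCHAR",
    "CLOB": "hana.CLOB", "BINARY": "hana.BINARY",
    "ST_POINT": "hana.ST_POINT", "ST_GEOMETRY": "hana.ST_GEOMETRY",
}
```

For example, HDBTABLE_TO_CDS["BIGINT"] returns "Integer64".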
Related Information
HDBTable is a language syntax that can be used to define a design-time representation of the artifacts that
comprise the persistent data models in SAP HANA.
In SAP HANA Extended Application Services (SAP HANA XS), the persistence model defines the schema,
tables, and views that specify what data to make accessible and how. The persistence model is mapped to the
consumption model that is exposed to client applications and users, so that data can be analyzed and
displayed.
SAP HANA XS enables you to create database schema, tables, views, and sequences as design-time files in the
repository. Repository files can be read by applications that you develop.
Note
All repository files including your view definition can be transported (along with tables, schema, and
sequences) to other SAP HANA systems, for example, in a delivery unit. A delivery unit is the medium SAP
HANA provides to enable you to assemble all your application-related repository artifacts together into an
archive that can be easily exported to other systems.
You can also set up data-provisioning rules and save them as design-time objects so that they can be included
in the delivery unit that you transport between systems.
As part of the process of setting up the basic persistence model for SAP HANA XS, you perform the following
tasks:
● Create a schema: Define a design-time schema and maintain the schema definition in the repository; the
transportable schema has the file extension .hdbschema, for example, MYSCHEMA.hdbschema.
● Create a synonym: Define a design-time synonym and maintain the synonym definition in the repository;
the transportable synonym has the file extension .hdbsynonym, for example, MySynonym.hdbsynonym.
● Create a table: Define a design-time table and maintain the table definition in the repository; the
transportable table has the file extension .hdbtable, for example, MYTABLE.hdbtable.
● Create a reusable table structure: Define the structure of a database table in a design-time file in the
repository; you can reuse the table-structure definition to specify the table type when creating a new table.
● Create a view: Define a design-time view and maintain the view definition in the repository; the
transportable view has the file extension .hdbview, for example, MYVIEW.hdbview.
● Create a sequence: Define a design-time sequence and maintain the sequence definition in the repository;
the transportable sequence has the file extension .hdbsequence, for example, MYSEQUENCE.hdbsequence.
● Import table content: Define data-provisioning rules that enable you to import data from comma-separated
values (CSV) files into SAP HANA tables using the SAP HANA XS table-import feature; the complete
configuration can be included in a delivery unit and transported between SAP HANA systems.
Note
On activation of a repository file, the file suffix, for example, .hdbview, .hdbschema, or .hdbtable, is
used to determine which runtime plug-in to call during the activation process. The plug-in reads the
repository file selected for activation, for example, a table, sees the object descriptions in the file, and
creates the appropriate runtime object.
Related Information
Create a Schema
Create a Table
Create an SQL View
Create a Synonym
A schema defines the container that holds database objects such as tables, views, and stored procedures. You
need a schema to be able to write to the catalog.
Context
To create a database schema as a design-time object, you create a flat file that contains the schema definition.
You save this file with the suffix .hdbschema in the appropriate package for your application in the SAP HANA
repository.
Procedure
Add the schema definition by choosing (Insert snippet) in the toolbar and change the schema name to
MYSCHEMA, for example, as shown below:
schema_name="MYSCHEMA";
Alternatively open the catalog and run a command in the following form in the SQL editor to grant
specific schema privileges:
call _SYS_REPO.GRANT_SCHEMA_PRIVILEGE_ON_ACTIVATED_CONTENT('select','<SCHEMANAME>','<username>');
Related Information
4.2.1.1 Schema
Relational databases contain a catalog that describes the various elements in the system. The catalog divides
the database into sub-databases known as schema. A database schema enables you to logically group
together objects such as tables, views, and stored procedures. Without a defined schema, you cannot write to
the catalog.
SAP HANA Extended Application Services (SAP HANA XS) enables you to create a database schema as a
transportable design-time file in the repository. Repository files can be read by applications that you develop.
If your application refers to the repository (design-time) version of a schema rather than the runtime version in
the catalog, for example, by using the explicit path to the repository file (with suffix), any changes to the
repository version of the file are visible as soon as they are committed to the repository. There is no need to
wait for the repository to activate a runtime version of the schema.
If you want to define a transportable schema using the design-time hdbschema specifications, use the
configuration schema illustrated in the following example:
string schema_name
The following example shows the contents of a valid transportable schema-definition file for a schema called
MYSCHEMA:
schema_name = "MYSCHEMA";
The schema is stored in the repository with the schema name MYSCHEMA as the file name and the
suffix .hdbschema, for example, MYSCHEMA.hdbschema.
Note
A schema generated from an .hdbschema artifact can also be used in the context of Core Data Services
(CDS).
If you want to create a schema definition as a design-time object, you must create the schema as a flat file. You
save the file containing the schema definition with the suffix .hdbschema in the appropriate package for your
application in the SAP HANA repository. You can activate the design-time objects at any point in time.
Note
On activation of a repository file, the file suffix, for example, .hdbschema, is used to determine which
runtime plugin to call during the activation process. The plug-in reads the repository file selected for
activation, parses the object descriptions in the file, and creates the appropriate runtime objects.
If you activate a schema-definition object in SAP HANA, the activation process checks if a schema with the
same name already exists in the SAP HANA repository. If a schema with the specified name does not exist, the
repository creates a schema with the specified name and makes _SYS_REPO the owner of the new schema.
Note
The schema cannot be dropped even if the deletion of a schema object is activated.
If you define a schema in SAP HANA XS, note the following important points regarding the schema name:
● Name mapping
The schema name must be identical to the name of the corresponding repository object.
● Naming conventions
The schema name must adhere to the SAP HANA rules for database identifiers. In addition, a schema
name must not start with the letters SAP*; the SAP* namespace is reserved for schemas used by SAP
products and applications.
● Name usage
The Data Definition Language (DDL) rendered by the repository contains the schema name as a delimited
identifier.
Related Information
In this tutorial, you use the SAP HANA Web-based Development Workbench Editor to create a table as a
design-time file in the repository using the hdbtable syntax. In the catalog, you view the table definition and
insert data into the table.
Prerequisites
Procedure
a. Select the newly created demo.tables package and from the context menu choose New File .
b. Enter the name of the schema in the File Name field, for example, myschema.hdbschema, and choose
Create.
c. Add the schema definition by choosing (Insert snippet) in the toolbar and change the schema name
to MYSCHEMA, for example, schema_name = "MYSCHEMA";.
d. Save the file and choose (Assign execution authorization) in the toolbar.
4. Create the new table definition.
a. Select the demo.tables package and from the context menu choose New File .
b. Enter the name of the table in the File Name field, for example, mytable.hdbtable, and choose
Create.
c. Choose (Insert snippet) in the toolbar to insert the following table definition code. Remember to
change the schema name in the inserted code to MYSCHEMA.
table.schemaName = "MYSCHEMA";
table.tableType = COLUMNSTORE;
table.columns = [
{name = "KEY1"; sqlType = TINYINT; nullable = false;},
{name = "FIELD1"; sqlType = DOUBLE; nullable = false;},
{name = "FIELD2"; sqlType = DOUBLE; nullable = false;},
b. Choose (Insert).
c. In the new row, enter some test data and save your entries, as shown in the example below:
Related Information
4.2.2.1 Tables
In the SAP HANA database, as in other relational databases, a table is a set of data elements that are organized
using columns and rows. A database table has a specified number of columns, defined at the time of table
creation, but can have any number of rows. Database tables also typically have meta-data associated with
them; the meta-data might include constraints on the table or on the values within particular columns.
Note
A delivery unit is the medium SAP HANA provides to enable you to assemble all your application-related
repository artifacts together into an archive that can be easily exported to other systems.
If your application is configured to use the design-time version of a database table in the repository rather than
the runtime version in the catalog, any changes to the repository version of the table are visible as soon as they
are committed to the repository. There is no need to wait for the repository to activate a runtime version of the
table.
If you want to define a transportable table using the design-time .hdbtable specifications, use the
configuration schema illustrated in the following example:
struct TableDefinition {
string SchemaName;
optional bool temporary;
optional TableType tableType;
optional bool public;
optional TableLoggingType loggingType;
list<ColumnDefinition> columns;
optional list<IndexDefinition> indexes;
optional PrimaryKeyDefinition primaryKey;
optional string description
};
table.schemaName = "MYSCHEMA";
table.tableType = COLUMNSTORE;
table.columns = [
{name = "Col1"; sqlType = VARCHAR; nullable = false; length = 20; comment =
"dummy comment";},
{name = "Col2"; sqlType = INTEGER; nullable = false;},
{name = "Col3"; sqlType = NVARCHAR; nullable = true; length = 20;
defaultValue = "Defaultvalue";},
{name = "Col4"; sqlType = DECIMAL; nullable = false; precision = 2; scale =
3;}];
table.indexes = [
{name = "MYINDEX1"; unique = true; order = DSC; indexColumns = ["Col2"];},
{name = "MYINDEX2"; unique = true; order = DSC; indexColumns = ["Col1",
"Col4"];}];
table.primaryKey.pkcolumns = ["Col1", "Col2"];
If you want to create a database table as a repository file, you must create the table as a flat file and save the
file containing the table dimensions with the suffix .hdbtable, for example, MYTABLE.hdbtable. The new file
is located in the package hierarchy you establish in the SAP HANA repository. You can activate the repository
files at any point in time.
Note
On activation of a repository file, the file suffix, for example, .hdbtable, is used to determine which
runtime plug-in to call during the activation process. The plug-in reads the repository file selected for
activation, in this case a table, parses the object descriptions in the file, and creates the appropriate
runtime objects.
It is important to bear in mind that an incorrectly defined table can lead to security-related problems. If the
content of the table you create is used to determine the behavior of the application, for example, whether data
is displayed depends on the content of a certain cell, any modification of the table content could help an
attacker to obtain elevated privileges. Although you can use authorization settings to restrict the disclosure of
information, data-modification issues need to be handled as follows:
● Make sure you specify the field type and define a maximum length for the field.
● Avoid using generic types such as VARCHAR or BLOB.
● Keep the field length as short as possible; it is much more difficult to inject shell-code into a string that is 5
characters long than into one that can contain up to 255 characters.
Related Information
SAP HANA Extended Application Services (SAP HANA XS) enables you to use the hdbtable syntax to create a
database table as a design-time file in the repository. The design-time artifact that contains the table definition
must adhere to the .hdbtable syntax specified below.
Table Definition
The following code illustrates a simple example of a design-time table definition using the .hdbtable syntax.
Note
Keywords are case-sensitive, for example, tableType and loggingType, and the schema referenced in the
table definition, for example, MYSCHEMA, must already exist.
table.schemaName = "MYSCHEMA";
table.temporary = true;
table.tableType = COLUMNSTORE;
table.loggingType = NOLOGGING;
table.columns = [
{name = "Col1"; sqlType = VARCHAR; nullable = false; length = 20; comment =
"dummy comment";},
{name = "Col2"; sqlType = INTEGER; nullable = false;},
{name = "Col3"; sqlType = NVARCHAR; nullable = true; length = 20;
defaultValue = "Defaultvalue";},
{name = "Col4"; sqlType = DECIMAL; nullable = false; precision = 2; scale =
3;}];
table.indexes = [
The following example shows the configuration schema for tables defined using the .hdbtable syntax. Each
of the entries in the table-definition configuration schema is explained in more detail in a dedicated section
below:
struct TableDefinition {
string SchemaName;
optional bool temporary;
optional TableType tableType;
optional bool public;
optional TableLoggingType loggingType;
list<ColumnDefinition> columns;
optional list<IndexDefinition> indexes;
optional PrimaryKeyDefinition primaryKey;
optional string description;
};
Schema Name
To use the .hdbtable syntax to specify the name of the schema that contains the table you are defining, use
the schemaName keyword. In the table definition, the schemaName keyword must adhere to the syntax shown
in the following example.
table.schemaName = "MYSCHEMA";
Temporary
To use the .hdbtable syntax to specify that the table you define is temporary, use the boolean temporary
keyword. Since data in a temporary table is session-specific, only the owner session of the temporary table is
allowed to INSERT/READ/TRUNCATE the data. Temporary tables exist for the duration of the session, and data
from the local temporary table is automatically dropped when the session is terminated. In the table definition,
the temporary keyword must adhere to the syntax shown in the following example.
table.temporary = true;
To specify the table type using the .hdbtable syntax, use the tableType keyword. In the table definition, the
tableType keyword must adhere to the syntax shown in the following example.
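For example (repeating the tableType entry from the table definitions shown elsewhere in this section):

```
table.tableType = COLUMNSTORE;
```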
The following configuration schema illustrates the parameters you can specify with the tableType keyword:
● COLUMNSTORE
Column-oriented storage, where entries of a column are stored in contiguous memory locations. SAP
HANA is particularly optimized for column-order storage.
● ROWSTORE
Row-oriented storage, where data is stored in a table as a sequence of records
To enable logging in a table definition using the .hdbtable syntax, use the loggingType keyword. In the
table definition, the loggingType keyword must adhere to the syntax shown in the following example.
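For example (repeating the logging entry from the table definition at the start of this section):

```
table.loggingType = NOLOGGING;
```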
To define the column structure and type in a table definition using the .hdbtable syntax, use the columns
keyword. In the table definition, the columns keyword must adhere to the syntax shown in the following
example.
table.columns = [
{name = "Col1"; sqlType = VARCHAR; nullable = false; length = 20; comment =
"dummy comment";},
{name = "Col2"; sqlType = INTEGER; nullable = false;},
{name = "Col3"; sqlType = NVARCHAR; nullable = true; length = 20;
defaultValue = "Defaultvalue";},
{name = "Col4"; sqlType = DECIMAL; nullable = false; precision = 2; scale =
3;}];
The following configuration schema illustrates the parameters you can specify with the columns keyword:
struct ColumnDefinition {
string name;
SqlDataType sqlType;
optional bool nullable;
optional bool unique;
optional int32 length;
optional int32 scale;
optional int32 precision;
optional string defaultValue;
optional string comment;
};
To define the SQL data type for a column in a table using the .hdbtable syntax, use the sqlType keyword. In
the table definition, the sqlType keyword must adhere to the syntax shown in the following example.
table.columns = [
{name = "Col1"; sqlType = VARCHAR; nullable = false; length = 20; comment =
"dummy comment";},
...
];
The following configuration schema illustrates the data types you can specify with the sqlType keyword:
enum SqlDataType {
DATE; TIME; TIMESTAMP; SECONDDATE; INTEGER; TINYINT;
SMALLINT; BIGINT; REAL; DOUBLE; FLOAT; SMALLDECIMAL;
DECIMAL; VARCHAR; NVARCHAR; CLOB; NCLOB;
ALPHANUM; TEXT; SHORTTEXT; BLOB; VARBINARY;
};
To define the primary key for the specified table using the .hdbtable syntax, use the primaryKey and
pkcolumns keywords. In the table definition, the primaryKey and pkcolumns keywords must adhere to the
syntax shown in the following example.
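For example (repeating the primary-key entry from the complete table definition shown earlier in this section):

```
table.primaryKey.pkcolumns = ["Col1", "Col2"];
```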
The following configuration schema illustrates the parameters you can specify with the primaryKey keyword:
struct PrimaryKeyDefinition {
list<string> pkcolumns;
optional IndexType indexType;
};
To define the index for the specified table using the .hdbtable syntax, use the indexes keyword. In the table
definition, the indexes keyword must adhere to the syntax shown in the following example.
table.indexes = [
{name = "MYINDEX1"; unique = true; order = DSC; indexColumns = ["Col2"];},
{name = "MYINDEX2"; unique = true; order = DSC; indexColumns = ["Col1",
"Col4"];}];
You can also use the optional parameter indexType to define the type of index, for example, B_TREE or
CPB_TREE, as described in Table Index Type [page 232].
To define the index type for the specified table using the .hdbtable syntax, use the indexType keyword. In the
table definition, the indexType keyword must adhere to the syntax shown in the following example.
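A minimal sketch of the indexType keyword, assuming it is set on the primary-key definition as suggested by the optional indexType member of the PrimaryKeyDefinition configuration schema shown earlier:

```
table.primaryKey.pkcolumns = ["Col1", "Col2"];
table.primaryKey.indexType = B_TREE;
```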
B_TREE specifies an index tree of type B+, which maintains sorted data and allows efficient insertion, deletion,
and search of records. CPB_TREE stands for “Compressed Prefix B_TREE” and specifies an index tree of type
CPB+, which is based on pkB-tree. CPB_TREE is a very small index that uses a “partial key”, that is, a key that
is only part of a full key in index nodes.
Note
If neither the B_TREE nor the CPB_TREE type is specified in the table-definition file, SAP HANA chooses the
appropriate index type based on the column data type, as follows:
● CPB_TREE
Character string types, binary string types, decimal types, when the constraint is a composite key or a
non-unique constraint
● B_TREE
All column data types other than those specified for CPB_TREE
To define the order of the table index using the .hdbtable syntax, use the order keyword. Insert the order with
the desired value (for example, ascending or descending) in the index type definition; the order keyword must
adhere to the syntax shown in the following example.
You can choose to sort the contents of the table index either in ascending (ASC) or descending (DSC) order.
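For example, in the index definitions shown earlier in this section, the order keyword is set as follows:

```
table.indexes = [
{name = "MYINDEX1"; unique = true; order = DSC; indexColumns = ["Col2"];}];
```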
The following example shows the complete configuration schema for tables defined using the .hdbtable
syntax.
enum TableType {
COLUMNSTORE; ROWSTORE;
};
enum TableLoggingType {
LOGGING; NOLOGGING;
};
enum IndexType {
B_TREE; CPB_TREE;
};
enum Order {
ASC; DSC;
};
Related Information
SAP HANA Extended Application Services (SAP HANA XS) enables you to define the structure of a database
table in a design-time file in the repository. You can reuse the table-structure definition to specify the table type
when creating a new table.
Prerequisites
Context
Table-structure definition files are stored in the SAP HANA repository with the .hdbstructure file extension,
for example, TableStructure.hdbstructure. The primary use case for a design-time representation of a
table structure is creating reusable type definitions for procedure interfaces.
Procedure
table.schemaName = "MYSCHEMA";
table.columns = [
{name = "Col1"; sqlType = VARCHAR; nullable = false; length = 20; comment = "dummy comment";},
...
];
Results
You have created a new table structure in SAP HANA, which can now be used as a basis for creating new tables.
In the example below, the SQL command CREATE TABLE is used with the like operator to create a new table
in this manner:
A table-structure definition is a template that you can reuse as a basis for creating new tables of the same type
and structure. You can reference the table structure in an SQL statement (CREATE TABLE [...] like
[...]) or an SQLScript procedure.
SAP HANA Extended Application Services (SAP HANA XS) enables you to create a database table structure (or
type) as a design-time file in the repository. All repository files including your table-structure definition can be
transported to other SAP HANA systems, for example, in a delivery unit. The primary use case for a design-
time representation of a table structure is creating reusable table-type definitions for procedure interfaces.
However, you can also use table-type definitions in table user-defined functions (UDFs).
If you want to define a design-time representation of a table structure with the .hdbstructure specifications,
use the configuration schema illustrated in the following example:
struct TableDefinition {
string SchemaName;
optional bool public;
list<ColumnDefinition> columns;
optional PrimaryKeyDefinition primaryKey;
};
The .hdbstructure syntax is a subset of the syntax used in .hdbtable. In a table structure definition,
you cannot specify the table type (for example, COLUMN/ROW), define the index, or enable logging.
table.schemaName = "MYSCHEMA";
table.columns = [
{name = "Col1"; sqlType = VARCHAR; nullable = false; length = 20; comment =
"dummy comment";},
{name = "Col2"; sqlType = INTEGER; nullable = false;},
{name = "Col3"; sqlType = NVARCHAR; nullable = true; length = 20;
defaultValue = "Defaultvalue";},
{name = "Col4"; sqlType = DECIMAL; nullable = false; precision = 2; scale =
3;}];
table.primaryKey.pkcolumns = ["Col1", "Col2"];
If you want to create a database table structure as a repository file, you must create the table structure as a flat
file and save the file containing the structure definition with the .hdbstructure file extension, for example,
TableStructure.hdbstructure. The new file is located in the package hierarchy you establish in the SAP
HANA repository. You can activate the repository files at any point in time.
Note
On activation of a repository file, the file suffix is used to determine which runtime plug-in to call during the
activation process. The plug-in reads the repository file selected for activation, in this case a table structure
element with the file extension .hdbstructure, parses the object descriptions in the file, and creates the
appropriate runtime objects.
You can use the SQL command CREATE TABLE to create a new table based on the table structure, for example,
with the like operator, as illustrated in the following example:
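For example, the following sketch (with hypothetical schema, table, and structure names) creates a new table with the same structure as an activated .hdbstructure artifact:

```sql
CREATE TABLE "MYSCHEMA"."MyNewTable"
  LIKE "MYSCHEMA"."acme.com.test::TableStructure";
```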
Related Information
A view is a virtual table based on the dynamic results returned in response to an SQL statement. SAP HANA
Extended Application Services (SAP HANA XS) enables you to create a database view as a design-time file in
the repository.
Prerequisites
Context
An SQL view contains rows and columns, just like a real database table; the fields in an SQL view are fields from
one or more real tables in the database. You can add SQL functions, for example, WHERE or JOIN statements,
to a view and present the resulting data as if it were coming from one, single table.
To create an SQL view as a design-time object, you create a flat file that contains the view definition. You save
this file with the suffix .hdbview, for example, MYVIEW.hdbview, in the appropriate package for your
application in the SAP HANA repository.
Procedure
schema="MYSCHEMA";
query="SELECT T1.\"Column2\" FROM \"MYSCHEMA\".\"acme.com.test.views::MY_VIEW1\" AS T1 LEFT JOIN
\"MYSCHEMA\".\"acme.com.test.views::MY_VIEW2\" AS T2 ON T1.\"Column1\" = T2.\"Column1\"";
depends_on=["acme.com.test.views::MY_VIEW1", "acme.com.test.views::MY_VIEW2"];
In an SQL view defined using the .hdbview syntax, any dependency to another table or view must be
declared explicitly, for example, with the depends_on keyword. The target view or table specified in the
depends_on keyword must also be mentioned in the SELECT query that defines the SQL view. If one or
more of the tables or views specified in the dependency does not exist, the activation of the object in the
repository fails.
If you want to assign names to the columns in a view, use the following syntax in the SQL query. In the
example below, the following names are specified for the columns defined in the view:
○ idea_id
○ identity_id
○ role_id
schema = "MYSCHEMA";
query = "SELECT role_join.idea_id AS idea_id, ident.member_id AS identity_id,
role_join.role_id AS role_id
FROM \"acme.com.odin.db.iam::t_identity_group_member_transitive\"
AS ident
INNER JOIN \"acme.com.odin.db.idea::t_idea_identity_role\" AS
role_join
ON role_join.identity_id = ident.group_id UNION DISTINCT
SELECT idea_id, identity_id, role_id
FROM \"acme.com.odin.db.idea::t_idea_identity_role\"
WITH read only";
In SQL, a view is a virtual table based on the dynamic results returned in response to an SQL statement. Every
time a user queries an SQL view, the database uses the view's SQL statement to recreate the data specified in
the SQL view. The data displayed in an SQL view can be extracted from one or more database tables.
An SQL view contains rows and columns, just like a real database table; the fields in an SQL view are fields from
one or more real tables in the database. You can add SQL functions, for example, WHERE or JOIN statements,
to a view and present the resulting data as if it were coming from one, single table.
SAP HANA Extended Application Services (SAP HANA XS) enables you to create a database view as a design-
time file in the repository. Repository files can be read by applications that you develop. In addition, all
repository files including your view definition can be transported to other SAP HANA systems, for example, in a
delivery unit.
If your application refers to the design-time version of a view from the repository rather than the runtime
version in the catalog, for example, by using the explicit path to the repository file (with suffix), any changes to
the repository version of the file are visible as soon as they are committed to the repository. There is no need to
wait for the repository to activate a runtime version of the view.
The following example shows the contents of a valid transportable view-definition file for a view called MYVIEW:
schema="MYSCHEMA";
query="SELECT T1.\"Column2\" FROM \"MYSCHEMA\".\"acme.com.test.views::MY_VIEW1\"
AS T1 LEFT JOIN \"MYSCHEMA\".\"acme.com.test.views::MY_VIEW2\" AS T2 ON
T1.\"Column1\" = T2.\"Column1\"";
depends_on=["acme.com.test.views::MY_VIEW1", "acme.com.test.views::MY_VIEW2"];
If you want to create a view definition as a design-time object, you must create the view as a flat file and save
the file containing the view definition with the suffix .hdbview, for example, MYVIEW.hdbview, in the
appropriate package for your application in the SAP HANA repository.
Tip
On activation of a repository file, the file suffix (for example, .hdbview) is used to determine which runtime
plugin to call during the activation process. The plug-in reads the repository file selected for activation,
parses the object descriptions in the file, and creates the appropriate runtime objects.
In an SQL view defined using the .hdbview syntax, any dependency to another table or view must be declared
explicitly, for example, with the depends_on keyword. The target view or table specified in the depends_on
keyword must also be mentioned in the SELECT query that defines the SQL view. If one or more of the tables or
views specified in the dependency does not exist, the activation of the object in the repository fails.
Note
On initial activation of the SQL view, no check is performed to establish the existence of the target view (or
table) in the depends_on dependency; such a check is only made on reactivation of the SQL view.
If you want to assign names to the columns in a view, use column aliases in the SQL query in the .hdbview file.
In the example below of a design-time view definition, the following names are specified for the columns
defined in the view:
● idea_id
● identity_id
● role_id
schema = "MYSCHEMA";
query = "SELECT role_join.idea_id AS idea_id, ident.member_id AS identity_id,
role_join.role_id AS role_id
FROM \"acme.com.odin.db.iam::t_identity_group_member_transitive\" AS
ident
INNER JOIN \"acme.com.odin.db.idea::t_idea_identity_role\" AS
role_join
ON role_join.identity_id = ident.group_id UNION DISTINCT
SELECT idea_id, identity_id, role_id
FROM \"acme.com.odin.db.idea::t_idea_identity_role\"
WITH read only";
Related Information
SAP HANA Extended Application Services (SAP HANA XS) enables you to use the hdbview syntax to create an
SQL view as a design-time file in the repository. The design-time artifact that contains the SQL view definition
must adhere to the .hdbview syntax specified below.
The following code illustrates a simple example of a design-time definition of an SQL view using the .hdbview
syntax.
Note
Keywords are case-sensitive, for example, schema and query, and the schema referenced in the table
definition, for example, MYSCHEMA, must already exist.
schema="MYSCHEMA";
public=false;
query="SELECT T1.\"Column2\" FROM \"MYSCHEMA\".\"acme.com.test.tables::MY_TABLE1\" AS T1 LEFT JOIN
\"MYSCHEMA\".\"acme.com.test.views::MY_VIEW1\" AS T2 ON T1.\"Column1\" = T2.\"Column1\"";
depends_on= ["acme.com.test.tables::MY_TABLE1","acme.com.test.views::MY_VIEW1"];
The following example shows the configuration schema for an SQL view that you define using the .hdbview
syntax. Each of the entries in the view-definition configuration schema is explained in more detail in a
dedicated section below:
string schema;
string query;
bool public(default=true);
optional list<string> depends_on_table;
optional list<string> depends_on_view;
Schema Name
To use the .hdbview syntax to specify the name of the schema that contains the SQL view you are defining,
use the schema keyword. In the SQL view definition, the schema keyword must adhere to the syntax shown in
the following example.
schema= "MYSCHEMA";
To use the .hdbview syntax to specify the query that creates the SQL view you are defining, use the query
keyword. In the SQL view definition, the query keyword must adhere to the syntax shown in the following
example.
For example:
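The following snippet shows a simplified form of the query entry from the view definition at the start of this section:

```
query = "SELECT T1.\"Column2\" FROM \"MYSCHEMA\".\"acme.com.test.tables::MY_TABLE1\" AS T1";
```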
public
To use the .hdbview syntax to specify whether or not the SQL view you are defining is publicly available, use
the boolean keyword public. In the SQL view definition, the public keyword must adhere to the syntax
shown in the following example.
public=[false|true];
For example:
public=false
Depends on
In an SQL view defined using the .hdbview syntax, the optional keyword depends_on enables you to define a
dependency to one or more tables or views. In the .hdbview definition, the depends_on keyword must adhere
to the syntax shown in the following example.
depends_on= ["<repository.package.path>::<MY_TABLE_NAME1>", "<repository.package.path>::<MY_VIEW_NAME1>"];
Note
The depends_on keyword replaces and extends the keywords depends_on_table and
depends_on_view.
depends_on= ["acme.com.test.tables::MY_TABLE1","acme.com.test.views::MY_VIEW1"];
The target table or view specified in the depends_on keyword must be mentioned in the SELECT query that
defines the SQL view. On initial activation of the SQL view, no check is performed to establish the existence of
the target tables or views specified in the dependency; such a check is only made during reactivation of the
SQL view. If one or more of the target tables or views specified in the dependency does not exist, the re-
activation of the SQL view object in the repository fails.
Related Information
A database sequence generates an automatically incremented list of unique numeric values according to the
rules defined in the sequence specification. The numbers generated by a sequence can be used by
applications, for example, to identify the rows and columns of a table.
Prerequisites
Context
A sequence specification allows you to set the options that control the start and end point of the sequence, the
size of the increment, and the minimum and maximum values allowed. You can also specify if the sequence
should recycle when it reaches the maximum value specified. You create a sequence definition in a file using
the hdbsequence syntax.
Sequences are not associated with tables; they are referenced by applications, which can use CURRVAL in an
SQL statement to get the current value generated by a sequence and NEXTVAL to generate the next value in
the defined sequence. The relationship between sequences and tables is controlled by the application.
Sequences allow applications to generate unique, primary key values, for example, to identify the rows and
columns of a table, and to coordinate keys across multiple rows or tables.
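The CURRVAL and NEXTVAL usage described above can be sketched in SQL as follows; the sequence name acme.com.test::MYSEQUENCE is hypothetical, and DUMMY is the standard SAP HANA one-row system table:

```sql
-- Generate the next value in the sequence (hypothetical sequence name)
SELECT "MYSCHEMA"."acme.com.test::MYSEQUENCE".NEXTVAL FROM DUMMY;
-- Read the value most recently generated for this sequence in this session
SELECT "MYSCHEMA"."acme.com.test::MYSEQUENCE".CURRVAL FROM DUMMY;
```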
schema= "MYSCHEMA";
start_with= 10;
maxvalue= 30;
nomaxvalue=false;
minvalue= 1;
nominvalue=true;
cycles= false;
reset_by= "SELECT T1.\"Column2\" FROM \"MYSCHEMA\".\"com.acme.test.tables::MY_TABLE1\" AS T1 LEFT JOIN
\"MYSCHEMA\".\"com.acme.test.tables::MY_TABLE2\" AS T2 ON T1.\"Column1\" = T2.\"Column1\"";
depends_on=["com.acme.test.tables::MY_TABLE1",
"com.acme.test.tables::MY_TABLE2"];
Note that in this example no increment value is defined, so the default value of 1 (ascend by 1) is assumed.
To define a sequence that descends by 1, set the increment_by value to -1.
Note
It is important to bear in mind that incorrectly defined sequences can lead to security-related
problems. For example, if the sequencing process becomes corrupted, it can result in data overwrite.
This can happen if the index has a maximum value which rolls-over, or if a defined reset condition is
triggered unexpectedly. A roll-over can be achieved by an attacker forcing data to be inserted by
flooding the system with requests. Overwriting log tables is a known practice for deleting traces. To
prevent unexpected data overwrite, use the following settings:
○ cycles= false
○ Avoid using the reset_by feature
Related Information
A sequence is a database object that generates an automatically incremented list of numeric values according
to the rules defined in the sequence specification. The sequence of numeric values is generated in an
ascending or descending order at a defined increment interval, and the numbers generated by a sequence can
be used by applications, for example, to identify the rows and columns of a table.
Sequences are not associated with tables; they are used by applications, which can use CURRVAL in a SQL
statement to get the current value generated by a sequence and NEXTVAL to generate the next value in the
defined sequence. Sequences provide an easy way to generate the unique values that applications use, for
example, to identify a table row or a field. In the sequence specification, you can set options that control the
start and end point of the sequence, the size of the increment size, or the minimum and maximum allowed
value. You can also specify if the sequence should recycle when it reaches the maximum value specified. The
relationship between sequences and tables is controlled by the application. Applications can reference a
sequence object and coordinate the values across multiple rows and tables.
SAP HANA Extended Application Services (SAP HANA XS) enables you to create a database sequence as a
transportable design-time file in the repository. Repository files can be read by applications that you develop.
● Generate unique, primary key values, for example, to identify the rows and columns of a table
● Coordinate keys across multiple rows or tables
The following example shows the contents of a valid sequence-definition file for a sequence called
MYSEQUENCE. Note that, in this example, no increment value is defined, so the default value of 1 (ascend by 1)
is assumed. To define a sequence that descends by 1, set the increment_by value to -1.
schema= "TEST_DUMMY";
start_with= 10;
maxvalue= 30;
nomaxvalue=false;
minvalue= 1;
nominvalue=true;
cycles= false;
reset_by= "SELECT T1.\"Column2\" FROM \"MYSCHEMA\".\"com.acme.test.tables::MY_TABLE1\" AS T1 LEFT JOIN
\"MYSCHEMA\".\"com.acme.test.tables::MY_TABLE2\" AS T2 ON T1.\"Column1\" = T2.\"Column1\"";
depends_on=["com.acme.test.tables::MY_TABLE1",
"com.acme.test.tables::MY_TABLE2"];
The sequence definition is stored in the repository with the suffix hdbsequence, for example,
MYSEQUENCE.hdbsequence.
Note
A sequence generated from an .hdbsequence artifact can also be used in the context of Core Data Services
(CDS).
If you activate a sequence-definition object in SAP HANA XS, the activation process checks if a sequence with
the same name already exists in the SAP HANA repository. If a sequence with the specified name does not
exist, the repository creates a sequence with the specified name and makes _SYS_REPO the owner of the new
sequence.
In a sequence defined using the .hdbsequence syntax, the reset_by keyword enables you to reset the
sequence using a query on any view, table, or even table function. However, any dependency must be declared
explicitly, for example, with the depends_on keyword. If the table or view specified in the dependency does
not exist, the activation of the sequence object in the repository fails.
Note
On initial activation of the sequence definition, no check is performed to establish the existence of the
target view (or table) in the dependency; such a check is only made on reactivation of the sequence
definition.
Security Considerations
It is important to bear in mind that incorrectly defined sequences can lead to security-related problems. For
example, if the sequencing process becomes corrupted, it can result in data overwrite. This can happen if the
index has a maximum value which rolls-over, or if a defined reset condition is triggered unexpectedly. A roll-
over can be achieved by an attacker forcing data to be inserted by flooding the system with requests.
Overwriting log tables is a known practice for deleting traces. To prevent unexpected data overwrite, use the
following settings:
● cycles= false
● Avoid using the reset_by feature
Related Information
Create a Sequence
Sequence Configuration Syntax [page 245]
SAP HANA Extended Application Services (SAP HANA XS) enables you to use the hdbsequence syntax to
create a database sequence as a design-time file in the repository. The design-time artifact that contains the
sequence definition must adhere to the .hdbsequence syntax specified below.
Sequence Definition
The following code illustrates a simple example of a design-time sequence definition using the .hdbsequence
syntax.
Keywords are case-sensitive, for example, maxvalue and start_with, and the schema referenced in the table
definition, for example, MYSCHEMA, must already exist.
schema= "MYSCHEMA";
start_with= 10;
maxvalue= 30;
nomaxvalue= false;
minvalue= 1;
nominvalue= true;
cycles= false;
reset_by= "SELECT T1.\"Column2\" FROM \"MYSCHEMA\".\"com.acme.test.tables::MY_TABLE1\" AS T1 LEFT JOIN
\"MYSCHEMA\".\"com.acme.test.tables::MY_TABLE2\" AS T2 ON T1.\"Column1\" = T2.\"Column1\"";
depends_on= ["com.acme.test.tables::MY_TABLE1",
"com.acme.test.tables::MY_TABLE2"];
The following example shows the configuration schema for sequences defined using the .hdbsequence
syntax. Each of the entries in the sequence-definition configuration schema is explained in more detail in a
dedicated section below:
string schema;
int32 increment_by(default=1);
int32 start_with(default=-1);
optional int32 maxvalue;
bool nomaxvalue(default=false);
optional int32 minvalue;
bool nominvalue(default=false);
optional bool cycles;
optional string reset_by;
bool public(default=false);
optional string depends_on_table;
optional string depends_on_view;
optional list<string> depends_on;
Schema Name
To use the .hdbsequence syntax to specify the name of the schema that contains the sequence you are
defining, use the schema keyword. In the sequence definition, the schema keyword must adhere to the syntax
shown in the following example.
schema= "MYSCHEMA";
To use the .hdbsequence syntax to specify that the sequence increments by a defined value, use the
increment_by keyword. increment_by specifies the amount by which the next sequence value is
incremented from the last value assigned. The default increment is 1. In the sequence definition, the
increment_by keyword must adhere to the syntax shown in the following example.
increment_by= 2;
Start Value
To use the .hdbsequence syntax to specify that the sequence starts with a specific value, use the
start_with keyword. If you do not specify a value for the start_with keyword, the value defined in
minvalue is used for ascending sequences, and the value defined in maxvalue is used for descending sequences.
In the sequence definition, the start_with keyword must adhere to the syntax shown in the following
example.
start_with= 10;
Maximum Value
To use the .hdbsequence syntax to specify that the sequence stops at a specific maximum value, for
example, 30, use the optional keyword maxvalue. In the sequence definition, the maxvalue keyword must
adhere to the syntax shown in the following example.
maxvalue= 30;
Note
The maximum value (maxvalue) a sequence can generate must be between -4611686018427387903 and
4611686018427387902.
No Maximum Value
To use the .hdbsequence syntax to specify that the sequence does not stop at any specific maximum value,
use the boolean keyword nomaxvalue. When the nomaxvalue keyword is used, the maximum value for an
ascending sequence is 4611686018427387903 and the maximum value for a descending sequence is -1. In the
sequence definition, the nomaxvalue keyword must adhere to the syntax shown in the following example.
nomaxvalue= true;
Minimum Value
To use the .hdbsequence syntax to specify that the sequence stops at a specific minimum value, for example,
1, use the minvalue keyword. In the sequence definition, the minvalue keyword must adhere to the syntax
shown in the following example.
minvalue= 1;
Note
The minimum value (minvalue) a sequence can generate must be between -4611686018427387903 and
4611686018427387902.
No Minimum Value
To use the .hdbsequence syntax to specify that the sequence does not stop at any specific minimum value,
use the boolean keyword nominvalue. When the nominvalue keyword is used, the minimum value for an
ascending sequence is 1 and the minimum value for a descending sequence is -4611686018427387903. In the
sequence definition, the nominvalue keyword must adhere to the syntax shown in the following example.
nominvalue= true;
Cycles
In a sequence defined using the .hdbsequence syntax, the optional boolean keyword cycles enables you to
specify whether the sequence number will be restarted after it reaches its maximum or minimum value. For
example, the sequence restarts with minvalue after having reached maxvalue (where increment_by is
greater than zero (0)) or restarts with maxvalue after having reached minvalue (where increment_by is less than zero (0)). In the sequence definition, the cycles keyword must adhere to the syntax shown in the following example.
cycles= false;
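Taken together, the keywords described above can be combined in a single sequence definition. The following sketch shows what a complete .hdbsequence artifact might look like; the schema name MYSCHEMA and the specific values are illustrative assumptions, not taken from this guide.

```
schema= "MYSCHEMA";
start_with= 10;
increment_by= 2;
minvalue= 1;
maxvalue= 30;
cycles= false;
```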
Reset by Query
In a sequence defined using the .hdbsequence syntax, the reset_by keyword enables you to reset the
sequence using a query on any view, table or even table function. However, any dependency must be declared
explicitly, for example, with the depends_on_view or depends_on_table keyword. If the table or view
specified in the dependency does not exist, the activation of the sequence object in the repository fails.
In the .hdbsequence definition, the reset_by keyword must adhere to the syntax shown in the following
example.
During a restart of the database, the system automatically executes the reset_by statement and the
sequence value is restarted with the value determined from the reset_by subquery.
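Based on the keyword=value pattern used throughout the .hdbsequence syntax, a reset_by definition might look like the following sketch; the SELECT statement and the table name MY_RESET_TABLE are hypothetical placeholders.

```
reset_by= "SELECT MAX(ID) FROM MY_RESET_TABLE";
```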
Note
If reset_by is not specified, the sequence value is stored persistently in the database. During the restart of
the database, the next value of the sequence is generated from the saved sequence value.
Depends on
In a sequence defined using the .hdbsequence syntax, the optional keyword depends_on enables you to
define a dependency to one or more specific tables or views, for example when using the reset_by option to
specify the query to use when resetting the sequence. In the .hdbsequence definition, the depends_on
keyword must adhere to the syntax shown in the following example.
depends_on= ["<repository.package.path>::<MY_TABLE_NAME1>", "<repository.package.path>::<MY_VIEW_NAME1>"];
Note
The depends_on keyword replaces and extends the keywords depends_on_table and
depends_on_view.
For example, to specify multiple tables and views with the depends_on keyword, use a comma-separated list
enclosed in square brackets [].
depends_on= ["com.acme.test.tables::MY_TABLE1", "com.acme.test.tables::MY_TABLE2", "com.acme.test.views::MY_VIEW1"];
Related Information
Create a Sequence
Sequences [page 244]
SAP HANA Extended Application Services (SAP HANA XS) enables you to create a local database synonym as a design-time file in the repository.
Prerequisites
Context
In SAP HANA, a design-time synonym artifact has the suffix .hdbsynonym and defines the target object by
specifying an authoring schema and an object name; its activation evaluates a system's schema mapping to
determine the physical schema in which the target table is expected, and creates a local synonym that points
to this object.
Restriction
A design-time synonym cannot refer to another synonym, and you cannot define multiple synonyms in a
single design-time synonym artifact. In addition, the target object specified in a design-time synonym must
only exist in the catalog; it is not possible to use .hdbsynonym to define a synonym for a catalog object
that originates from a design-time artifact.
Procedure
Sample Code
{ "acme.com.app1::MySynonym1" : {...}}
a. In the package structure, select the package where you want to create the new synonym-definition file and from the context menu choose New File.
b. Enter the name of the synonym in the File Name field, for example, MySynonym1.hdbsynonym, and choose Create.
3. Define the synonym.
Add the synonym-definition code to the new file, as shown in the example below, replacing object names
and paths to suit your requirements.
Sample Code
{ "acme.com.app1::MySynonym1" : {
"target" : {
"schema": "DEFAULT_SCHEMA",
"object": "MY_ERP_TABLE_1"
},
"schema": "SCHEMA_2"
}
}
Related Information
4.2.6.1 Synonyms
SAP HANA Extended Application Services (SAP HANA XS) enables you to create a design-time representation
of a local database synonym. The synonym enables you to refer to a table (for example, from a view) that only
exists as a catalog object.
In SAP HANA XS, a design-time representation of a local synonym has the suffix .hdbsynonym; you can store the artifact in the SAP HANA repository. The syntax of the design-time synonym artifact requires you to define the target object (the schema and name of the catalog object the synonym points to) and the schema in which the generated synonym is created.
Restriction
A synonym cannot refer to another synonym, and you cannot define multiple synonyms in a single design-
time synonym artifact. In addition, the target object specified in a design-time synonym must only exist in
the catalog; it is not possible to define a design-time synonym for a catalog object that originates from a
design-time artifact.
In the following example of a design-time synonym artifact, the table MY_ERP_TABLE_1 resides in the schema
DEFAULT_SCHEMA. The activation of the design-time synonym artifact illustrated in the example would
generate a local synonym ("acme.com.app1::MySynonym1") in the schema SCHEMA_2. Assuming that a
schema-mapping table exists that maps DEFAULT_SCHEMA to the schema SAP_SCHEMA, the newly
generated synonym "SCHEMA_2"."acme.com.app1::MySynonym1" points to the run-time object
"SAP_SCHEMA"."MY_ERP_TABLE_1".
Sample Code
MySynonym1.hdbsynonym
{ "acme.com.app1::MySynonym1" : {
"target" : {
"schema": "DEFAULT_SCHEMA",
"object": "MY_ERP_TABLE_1"
},
"schema": "SCHEMA_2"
}
}
Related Information
Synonym Definition
SAP HANA Extended Application Services (SAP HANA XS) enables you to use the hdbsynonym syntax to
create a database synonym as a design-time file in the repository. On activation, a local synonym is generated
in the catalog in the specified schema. The design-time artifact that contains the synonym definition must
adhere to the .hdbsynonym syntax specified below.
Note
The activation of the design-time synonym artifact illustrated in the following example generates a local
synonym ("acme.com.app1::MySynonym1") in the schema SCHEMA_2.
Sample Code
MySynonym1.hdbsynonym
{ "acme.com.app1::MySynonym1" : {
"target" : {
"schema": "DEFAULT_SCHEMA",
"object": "MY_ERP_TABLE_1"
},
"schema": "SCHEMA_2"
}
}
Synonym Location
In the first line of the synonym-definition file, you must specify the absolute repository path to the package
containing the synonym artifact (and the name of the synonym artifact) itself using the syntax illustrated in the
following example.
Code Syntax
{ "<full.path.to.package>::<MySynonym1>" : {...}}
For example, to generate a synonym called "acme.com.app1::MySynonym1", you must create a design-time
artifact called MySynonym1.hdbsynonym in the repository package acme.com.app1; the first line of the
design-time synonym artifact must be specified as illustrated in the following example.
Sample Code
{ "acme.com.app1::MySynonym1" : {...}}
target
To specify the name and location of the object for which you are defining a synonym, use the target keyword
together with the keywords schema and object. In the synonym definition, the target keyword must adhere
to the syntax shown in the following example.
Code Syntax
"target" : {
"schema": "<Name_of_schema_containing_target_object>",
"object": "<Name_of_target_object>"
},
In the context of the target keyword, the following additional keywords are required:
● schema defines the name of the schema where the target object (defined in object) is located.
● object specifies the name of the catalog object to which the synonym applies.
Restriction
The target object specified in a design-time synonym must only exist in the catalog; it is not possible to
define a design-time synonym for a catalog object that originates from a design-time artifact.
schema
To specify the catalog location of the generated synonym, use the schema keyword. In the synonym definition,
the schema keyword must adhere to the syntax shown in the following example.
Code Syntax
"schema": "<Schema_location_of_generated_synonym>"
Related Information
The table-import function is a data-provisioning tool that enables you to import data from comma-separated
values (CSV) files into SAP HANA database tables.
Context
In this tutorial, you import data from a CSV file into a table generated from a design-time definition that uses
the .hdbtable syntax. Note that the names used are just examples; where necessary, replace the names of
the schema, tables, files, and so on with your own names.
Procedure
Note
Naming conventions exist for package names, for example, a package name must not start with
either a dot (.) or a hyphen (-) and cannot contain two or more consecutive dots (..). In addition,
the name must not exceed 190 characters.
The following files are required for the table import scenario:
○ The table-import configuration file, for example, inhabitants.hdbti
Specifies the source file containing the data values to import and the target table in SAP HANA into
which the data will be inserted.
○ A CSV file, for example, inhabitants.csv
Contains the data to be imported into the SAP HANA table during the table-import operation; values in
the .csv file can be separated either by a comma (,) or a semi-colon (;).
○ A target table, for example, inhabitants.hdbtable
The target table can be either a runtime table in the catalog or a table definition, for example, a table
defined using the .hdbtable syntax (TiTable.hdbtable) or the CDS-compliant .hdbdd syntax
(TiTable.hdbdd).
In this tutorial, the target table for the table-import operation is inhabitants.hdbtable, a
design-time table defined using the .hdbtable syntax.
4. Create the schema in which the target table will reside.
a. Create the schema-definition file, for example, AMT.hdbschema, and enter the following line of text:
schema_name="AMT";
b. To grant schema privileges to yourself, select the AMT.hdbschema file and choose (Assign
execution authorization).
You are assigned the requested schema privileges.
5. Create or open the table-definition file for the target import table (inhabitants.hdbtable) and enter
the following lines of text; this example uses the .hdbtable syntax:
table.schemaName = "AMT";
table.tableType = COLUMNSTORE;
table.columns = [
  {name = "ID"; sqlType = VARCHAR; nullable = false; length = 20; comment = "";},
  {name = "surname"; sqlType = VARCHAR; nullable = true; length = 30; comment = "";},
  {name = "name"; sqlType = VARCHAR; nullable = true; length = 30; comment = "";},
  {name = "city"; sqlType = VARCHAR; nullable = true; length = 30; comment = "";}
];
table.primaryKey.pkcolumns = ["ID"];
6. Open the CSV file containing the data to import, for example, inhabitants.csv in a text editor and enter
the values shown in the following example:
0,Annan,Kofi,Accra
1,Essuman,Wiredu,Tema
2,Tetteh,Kwame,Kumasi
3,Nterful,Akye,Tarkwa
4,Acheampong,Kojo,Tamale
5,Assamoah,Adjoa,Takoradi
6,Mensah,Afua,Cape Coast
Note
You can import data from multiple .csv files in a single, table-import operation. However, each .csv
file must be specified in a separate code block ({table= ...}) in the table-import configuration file.
7. Create or open the table-import configuration file (inhabitants.hdbti) and enter the following lines of
text (make sure the paths point to the correct locations in your environment):
import = [
{
Expand the AMT Tables node, select the table mycompany.tests.TiTest::inhabitants, and
from the context menu choose Open Content.
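The content of the inhabitants.hdbti configuration is truncated above. A sketch of what such a configuration could look like, assuming the public synonym mycompany.tests.TiTest::inhabitants shown above and the comma-separated values entered in step 6:

```
import = [
    {
        hdbtable = "mycompany.tests.TiTest::inhabitants";
        file = "mycompany.tests.TiTest:inhabitants.csv";
        header = false;
        delimField = ",";
    }
];
```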
In SAP HANA XS, you create a table-import scenario by setting up a table-import configuration file and one or
more comma-separated value (CSV) files containing the content you want to import into the specified SAP
HANA table. The import-configuration file links the import operation to one or more target tables. The table
definition (for example, in the form of a .hdbdd or .hdbtable file) can either be created separately or be
included in the table-import scenario itself.
To use the SAP HANA XS table-import feature to import data into an SAP HANA table, you need to understand
the following table-import concepts:
● Table-import configuration
You define the table-import model in a configuration file that specifies the data fields to import and the
target tables for each data field.
Note
The table-import file must have the .hdbti extension, for example, myTableImport.hdbti.
The following constraints apply to the CSV file used as a source for the table-import feature in SAP HANA XS:
● The number of table columns must match the number of CSV columns.
● There must not be any incompatibilities between the data types of the table columns and the data types of
the CSV columns.
● Overlapping data in data files is not supported.
● The target table of the import must not be modified (or appended to) outside of the data-import operation.
If the table is used for storage of application data, this data may be lost during any operation to re-import
or update the data.
You can define the elements of a table-import operation in a design-time file; the configuration includes
information about source data and the target table in SAP HANA.
SAP HANA Extended Application Services (SAP HANA XS) enables you to perform data-provisioning
operations that you define in a design-time configuration file. The configuration file is transportable, which
means you can transfer the data-provisioning configuration between SAP HANA systems quickly and easily.
The table-import configuration enables you to specify how data from a comma-separated-value (.csv) file is
imported into a target table in SAP HANA. The configuration specifies the source file containing the data values
to import and the target table in SAP HANA into which the data must be inserted. As further options, you can
specify which field delimiter to use when interpreting data in the source .csv file and if keys must be used to
determine which columns in the target table to insert the imported data into.
Note
If you use multiple table import configurations to import data into a single target table, the keys keyword is
mandatory. This is to avoid problems relating to the overwriting or accidental deletion of existing data.
The following example of a table-import configuration shows how to define a simple import operation which
inserts data from the source files myData.csv and myData2.csv into the table myTable in the schema
mySchema.
import = [
{
table = "myTable";
schema = "mySchema";
file = "sap.ti2.demo:myData.csv";
header = false;
delimField = ";";
keys = [ "GROUP_TYPE" : "BW_CUBE"];
},
{
table = "sap.ti2.demo::myTable";
file = "sap.ti2.demo:myData2.csv";
header = false;
delimField = ";";
keys = [ "GROUP_TYPE" : "BW_CUBE"];
}
];
In the table import configuration, you can specify the target table using either of the following methods:
Note
Both the schema and the target table specified in the table-import operation must already exist. If either
the specified table or the schema does not exist, SAP HANA XS displays an error message during the
activation of the configuration file, for example: Table import target table cannot be found. or
Schema could not be resolved.
You can also use one table-import configuration file to import data from multiple .csv source files. However,
you must specify each import operation in a new code block introduced by the [hdb | cds]table keyword, as
illustrated in the example above.
By default, the table-import operation assumes that data values in the .csv source file are separated by a
comma (,). However, the table-import operation can also interpret files containing data values separated by a
semi-colon (;).
,,,BW_CUBE,,40000000,2,40000000,all
;;;BW_CUBE;;40000000;3;40000000;all
Note
If the activated .hdbti configuration used to import data is subsequently deleted, only the data that was
imported by the deleted .hdbti configuration is dropped from the target table. All other data including any
data imported by other .hdbti configurations remains in the table. If the target CDS entity has no key
(annotated with @nokey) all data that is not part of the CSV file is dropped from the table during each
table-import activation.
You can use the optional keyword keys to specify the key range taken from the source .csv file for import into
the target table. If keys are specified for an import in a table import configuration, multiple imports into same
target table are checked for potential data collisions.
Note
The configuration-file syntax does not support wildcards in the key definition; the full value of a selectable
column value has to be specified.
Security Considerations
In SAP HANA XS, design-time artifacts such as tables (.hdbtable or .hdbdd) and table-import
configurations (.hdbti) are not normally exposed to clients via HTTP. However, design-time artifacts
containing comma-separated values (.csv) could be considered as potential artifacts to expose to users
through HTTP. For this reason, it is essential to protect these exposed .csv artifacts by setting the appropriate access permissions.
Tip
Place all the .csv files used to import content into tables together in a single package and set the
appropriate (restrictive) application-access permissions for that package, for example, with a
dedicated .xsaccess file.
Related Information
The design-time configuration file used to define a table-import operation requires the use of a specific syntax.
The syntax comprises a series of keyword=value pairs.
If you use the table-import configuration syntax to define the details of the table-import operation, you can use
the keywords illustrated in the following code example. The resulting design-time file must have the .hdbti file
extension, for example, myTableImportCfg.hdbti.
import = [
{
table = "myTable";
schema = "mySchema";
file = "sap.ti2.demo:myData.csv";
header = false;
useHeaderNames = false;
delimField = ";";
delimEnclosing="\"";
distinguishEmptyFromNull = true;
keys = [ "GROUP_TYPE" : "BW_CUBE", "GROUP_TYPE" : "BW_DSO", "GROUP_TYPE" :
"BW_PSA"];
}
];
table
In the table-import configuration, the table, cdstable, and hdbtable keywords enable you to specify the
name of the target table into which the table-import operation must insert data. The target table you specify in
the table-import configuration can be a runtime table in the catalog or a design-time table definition, for
example, a table defined using either the .hdbtable or the .hdbdd (Core Data Services) syntax.
The target table specified in the table-import configuration must already exist. If the specified table does
not exist, SAP HANA XS displays an error message during the activation of the configuration file, for
example: Table import target table cannot be found.
Use the table keyword in the table-import configuration to specify the name of the target table using the
qualified name for a catalog table.
table = "target_table";
schema = "mySchema";
Note
You must also specify the name of the schema in which the target catalog table resides, for example, using
the schema keyword.
The hdbtable keyword in the table-import configuration enables you to specify the name of a target table using
the public synonym for a design-time table defined with the .hdbtable syntax.
hdbtable = "sap.ti2.demo::target_table";
The cdstable keyword in the table-import configuration enables you to specify the name of a target table using
the public synonym for a design-time table defined with the CDS-compliant .hdbdd syntax.
cdstable = "sap.ti2.demo::target_table";
Caution
There is no explicit check if the addressed table is created using the .hdbtable or CDS-compliant .hdbdd
syntax.
If the table specified with the cdstable or hdbtable keyword is not defined with the corresponding syntax,
SAP HANA displays an error when you try to activate the artifact, for example, Invalid combination of
table declarations found, you may only use [cdstable | hdbtable | table] .
schema
The following code example shows the syntax required to specify a schema in a table-import configuration.
schema = "TI2_TESTS";
Note
The schema specified in the table-import configuration file must already exist.
If the schema specified in a table-import configuration file does not exist, SAP HANA XS displays an error
message during the activation of the configuration file, for example:
The schema is only required if you use a table's schema-qualified catalog name to reference the target table for
an import operation, for example, table = "myTable"; schema = "mySchema";. The schema is not
required if you use a public synonym to reference a table in a table-import configuration, for example,
hdbtable = "sap.ti2.demo::target_table";.
file
Use the file keyword in the table-import configuration to specify the source file containing the data that the
table-import operation imports into the target table. The source file must be a .csv file with the data values
separated either by a comma (,) or a semi-colon (;). The file definition must also include the full package path
in the SAP HANA repository.
file = "sap.ti2.demo:myData.csv";
header
Use the header keyword in the table-import configuration to indicate if the data contained in the
specified .csv file includes a header line. The header keyword is optional, and the possible values are true or
false.
header = false;
useHeaderNames
Use the useHeaderNames keyword in the table-import configuration to indicate if the data contained in the first line of the specified .csv file must be interpreted as column names. The useHeaderNames keyword is optional; it is used in combination with the header keyword. The useHeaderNames keyword is boolean: possible values are true or false.
useHeaderNames = false;
Note
The table-import process considers the order of the columns; if the column order specified in the .csv file does not match the order used for the columns in the target table, an error occurs on activation.
delimField
Use the delimField keyword in the table-import configuration to specify which character is used to separate
the values in the data to be imported. Currently, the table-import operation supports either the comma (,) or
the semi-colon (;). The following example shows how to specify that values in the .csv source file are
separated by a semi-colon (;).
delimField = ";";
Note
By default, the table-import operation assumes that data values in the .csv source file are separated by a
comma (,). If no delimiter field is specified in the .hdbti table-import configuration file, the default setting
is assumed.
delimEnclosing
Use the delimEnclosing keyword in the table-import configuration to specify a single character that
indicates both the start and end of a set of characters to be interpreted as a single value in the .csv file, for example, "This is all one, single value". This feature enables you to include in the data values of a .csv file even the character defined as the field delimiter (in delimField), for example, a comma (,) or a semi-colon (;).
Tip
If the value used to separate the data fields in your .csv file (for example, the comma (,)) is also used
inside the data values themselves ("This, is, a, value"), you must declare and use a delimiter
enclosing character and use it to enclose all data values to be imported.
The following example shows how to use the delimEnclosing keyword to specify the quote (") as the delimiting character that indicates both the start and the end of a value in the .csv file. Everything enclosed between the delimEnclosing characters (in this example, the quote) is interpreted by the import process as one, single value.
delimEnclosing="\"";
Note
Since the hdbti syntax requires us to use quotes ("") to specify the delimiting character, and the delimiting character in this example is, itself, also a quote ("), we need to use the backslash character (\) to escape the second quote (").
In the following example of values in a .csv file, we assume that delimEnclosing="\"" and delimField=",". This means that imported values in the .csv file are enclosed in the quote character ("value") and multiple values are separated by the comma ("value1","value 2"). Any commas inside the quotes are interpreted as a comma and not as a field delimiter.
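To illustrate, with delimEnclosing="\"" and delimField=",", a line such as the following (hypothetical values) is imported as three separate values; the comma inside the second value is treated as data, not as a field delimiter:

```
"value1","value 2, with a comma","value3"
```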
distinguishEmptyFromNull
Use the distinguishEmptyFromNull keyword in combination with delimEnclosing to ensure that the table-import process correctly interprets any empty value in the .csv file that is enclosed with the value defined in the delimEnclosing keyword, for example, "". This ensures that an enclosed empty value is imported "as is" (as an empty string) into the target table; if the empty value is incorrectly interpreted, it is imported as NULL.
distinguishEmptyFromNull = true;
Note
Assuming distinguishEmptyFromNull = true, consider the following line of a .csv file:
"Value1",,"",Value2
The table-import process adds the values shown in the example .csv above into the target table as follows: Value1 and Value2 are imported unchanged, the unquoted empty field (,,) is imported as NULL, and the enclosed empty field ("") is imported as an empty string.
keys
Use the keys keyword in the table-import configuration to specify the key range to be considered when
importing the data from the .csv source file into the target table.
In the example above, all the lines in the .csv source file where the GROUP_TYPE column value matches one of
the given values (BW_CUBE, BW_DSO, or BW_PSA) are imported into the target table specified in the table-import
configuration.
;;;BW_CUBE;;40000000;3;40000000;slave
;;;BW_DSO;;40000000;3;40000000;slave
;;;BW_PSA;;2000000000;1;2000000000;slave
;;;;;40000000;2;40000000;all
During the course of the activation of the table-import configuration and the table-import operation itself, SAP
HANA checks for errors and displays the following information in a brief message.
40201 "If you import into a catalog table, please provide schema": You specified a target table with the table keyword but did not specify a schema with the schema keyword.
40202 "Schema could not be resolved": The schema specified with the schema keyword does not exist or could not be found (wrong name).
40203 "Schema resolution error": The schema specified with the schema keyword does not exist or could not be found (wrong name).
40204 "Table import target table cannot be found": The table specified with the table keyword does not exist or could not be found (wrong name or wrong schema name).
40210 "Table import syntax error": The table-import configuration file (.hdbti) contains one or more syntax errors.
40211 "Table import constraint checks failed": The same key is specified in multiple table-import configurations (.hdbti files), which leads to overlaps in the range of data to import.
40212 "Importing data into table failed": Either duplicate keys were written (due to duplicates in the .CSV source file) or …
40213 "CSV table column count mismatch": Either the number of columns in the .CSV record is higher than the number of columns in the table, or …
40214 "Column type mismatch": The .CSV file does not match the target table for either of the following reasons: …
40216 "Key does not match to table header": For some key columns of the table, no data are provided.
SQL in SAP HANA includes extensions for creating procedures, which enables you to embed data-intensive
application logic into the database, where it can be optimized for performance (since there are no large data
transfers to the application and features such as parallel execution are possible). Procedures are used when
other modeling objects, such as analytic or attribute views, are not sufficient.
Languages
● SQLScript: The language that SAP HANA provides for writing procedures.
● R: An open-source programming language for statistical computing and graphics, which can be installed
and integrated with SAP HANA.
There are additional libraries of procedures, called Business Function Library and Predictive Analysis Library,
that can be called via SQL or from within another procedure.
● CREATE TYPE: Creates a table type; table types are used to define parameters for a procedure that represent tabular results. For example:
● CREATE PROCEDURE: Creates a procedure. The LANGUAGE clause specifies the language you are using to
code the procedure. For example:
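The two statements can be sketched as follows; the type name, procedure name, columns, and schema below are illustrative assumptions, not taken from this guide.

```
-- Create a table type describing a tabular output parameter
CREATE TYPE tt_customers AS TABLE ("ID" INT, "NAME" NVARCHAR(30));

-- Create a procedure that returns a tabular result via the table type;
-- the LANGUAGE clause specifies the implementation language
CREATE PROCEDURE get_customers (OUT ex_customers tt_customers)
LANGUAGE SQLSCRIPT READS SQL DATA AS
BEGIN
  ex_customers = SELECT "ID", "NAME" FROM "MYSCHEMA"."MY_CUSTOMER_TABLE";
END;
```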
Code Completion
In the SAP HANA Web-based Development Workbench, the editor provides semantic code completion for the
hdbprocedure file type. The semantic code completion feature is a context-based search tool that lists suggested catalog objects and local variables, helping you develop accurate stored procedures in a faster and more efficient manner. You can quickly identify valid objects, reducing errors during activation. Code completion proposals take into consideration SQLScript grammar, context-specific schemas, and textual input.
● Catalog objects: such as schemas, views, table functions, procedures, scalar functions, synonyms
● Local variables: such as input and output parameters, declared scalar variables
● Database artifacts
The list of proposals contains syntactic and semantic proposals, listed in the following order:
1. Local variables
2. Catalog objects (maximum of 50 suggestions)
3. Keywords
Note
Objects selected from the proposed list might be automatically inserted as quoted identifiers based on the SQLScript language guidelines, for example, if the object contains special characters or mixed lower- and uppercase characters.
You can create and edit procedures in the SAP HANA Web-based Development Workbench Editor tool.
Prerequisites
You have the privileges granted by the role sap.hana.ide.roles::EditorDeveloper; this role is included in the
parent role sap.hana.ide.roles::Developer.
a. Select the package where you want to create the new stored procedure and from the context menu
choose New HDB Procedure.
b. Enter the required data:
○ File name: Enter the file name without the file extension. The file extension .hdbprocedure is
added automatically when the file is created.
○ Schema: Enter the name of an existing schema.
c. Choose Create.
The new procedure is listed under the package you selected.
3. Define the new stored procedure.
Begin writing your code inside your new procedure and save it. The syntax is checked and highlighted as you type. Auto-completion suggestions appear as you type or when you use the semantic code completion feature.
Note
You can only write one stored procedure per file. The file name and the procedure name must be the
same. Only SQLScript language is supported for hdbprocedure procedures.
Note
Text-based searches display the object names that begin with and contain the entered text.
Searches are asynchronous; the suggested list is updated in parallel to the user's refined textual
input.
c. Use the arrow keys to scroll through the list, press Enter to select the object, or Esc to close the code
completion window without selecting an object.
4. Save the procedure.
Your procedure is saved and activated and is now available in the catalog as a runtime object. This allows
you and other users to call the procedure and debug it.
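Once activated, the runtime object can be invoked with a standard SQL CALL statement. A sketch, in which the schema name, package path, procedure name, and parameter placeholder are illustrative assumptions:

```
CALL "MYSCHEMA"."acme.com.app1::my_procedure"(?);
```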
Related Information
You can use a table type to define parameters for a procedure; the table type represents tabular results.
Prerequisites
Context
If you define a procedure that uses data provided by input and output parameters, you can use table types to
store the parameterized data. These parameters have a type and are either based on a global table (for
example, a catalog table), a global table type, or a local (inline) table type. This task shows you two ways to use
the .hdbprocedure syntax to define a text-based design-time procedure artifact; the parameterized data for
your procedure can be stored in either of the following ways:
● Global:
In an externally defined (and globally available) table type, for example, using the Core Data Service (CDS)
syntax
● Local:
In a table type that is defined inline, for example, in the procedure itself
Procedure
1. Create a procedure that uses data provided by a local (inline) table type.
To define a text-based design-time procedure, use the .hdbprocedure syntax. The procedure in this
example stores data in a local table type defined inline, that is, in the procedure itself.
Note
If you plan to define a global table type (for example, using CDS) you can skip this step.
The table used to store the parameterized data is defined inline, in the procedure's OUT parameter.
PROCEDURE SAP_HANA_EPM_NEXT."sap.hana.democontent.epmNext.procedures::get_product_sale_price" (
  IN im_productid NVARCHAR(10),
  OUT ex_product_sale_price table (
    "PRODUCTID" nvarchar(10),
    "CATEGORY" nvarchar(40),
    "PRICE" decimal(15,2),
    "SALEPRICE" decimal(15,2) ) )
LANGUAGE SQLSCRIPT
SQL SECURITY INVOKER
DEFAULT SCHEMA SAP_HANA_EPM_NEXT
READS SQL DATA AS
BEGIN
Note
This is only required if you want to use a global table type. If you plan to define a table type inline, you
can skip this step.
namespace sap.hana.democontent.epmNext.data;

@Schema: 'SAP_HANA_EPM_NEXT'
context GlobalTypes {
    type tt_product_sale_price {
        PRODUCTID : String(10);
        CATEGORY  : String(40);
        PRICE     : Decimal(15,2);
        SALEPRICE : Decimal(15,2);
    };
};
Tip
The OUT parameter refers to the CDS type tt_product_sale_price defined in the CDS
document GlobalTypes.hdbdd.
PROCEDURE SAP_HANA_EPM_NEXT."sap.hana.democontent.epmNext.procedures::get_product_sale_price" (
    IN im_productid NVARCHAR(10),
    OUT ex_product_sale_price
        SAP_HANA_EPM_NEXT."sap.hana.democontent.epmNext.data::GlobalTypes.tt_product_sale_price" )
LANGUAGE SQLSCRIPT
SQL SECURITY INVOKER
DEFAULT SCHEMA SAP_HANA_EPM_NEXT
READS SQL DATA AS
BEGIN
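Once activated, a procedure of this kind can be invoked from the SQL console. The following call is a sketch; the product ID 'HT-1000' is an example value taken from the SHINE demonstration data:

-- invoke the procedure; '?' binds the tabular output parameter
CALL "SAP_HANA_EPM_NEXT"."sap.hana.democontent.epmNext.procedures::get_product_sale_price"
    (IM_PRODUCTID => 'HT-1000', EX_PRODUCT_SALE_PRICE => ?);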
SQLScript procedures can make use of standard SQL statements to build a query that requests data and
returns a specified result set.
Prerequisites
● You have installed the SAP HANA Interactive Education (SHINE) HCODEMOCONTENT delivery unit (DU); this
DU contains the tables and views that you want to consume with the procedure you build in this tutorial.
● You have generated data to populate the tables and views provided by the SHINE delivery unit and used in
this tutorial. You can generate the data with tools included in the SHINE delivery unit.
Note
You might have to adjust the paths in the code examples provided to suit the package hierarchy in your SAP
HANA repository, for example, to point to the underlying content (demonstration tables and services)
referenced in the tutorial.
The stored procedure you create in this tutorial uses standard SQL statements (for example, SELECT
statements) and some imperative logic constructs to determine the sale price of a product based on the
product category.
Procedure
a. In the package where you want to create the new stored procedure, create a new subpackage called
procedures, if not already available.
b. From the context menu of the procedures folder, choose New HDB Procedure .
c. Enter the required data:
○ File name: Enter the file name get_product_sales_price. The file extension .hdbprocedure
is added automatically when the file is created.
○ Schema: Enter the name of an existing schema, for example, MYSCHEMA.
d. Choose Create.
3. Define the new stored procedure.
This procedure uses standard SQL statements and some imperative logic constructs to determine the sale
price of a product based on the product category.
a. In the get_product_sales_price.hdbprocedure file, use the following code to define the details
of the stored procedure:
Sample Code
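The sample code itself is not reproduced in this extract. The following SQLScript is a minimal sketch of what such a procedure could look like; the SHINE table path, the category name 'Notebooks', and the discount rates are assumptions for illustration, not part of the original guide:

PROCEDURE "MYSCHEMA"."demo.procedures::get_product_sales_price" (
    IN im_productid NVARCHAR(10),
    OUT ex_product_sale_price TABLE (
        "PRODUCTID" NVARCHAR(10),
        "CATEGORY"  NVARCHAR(40),
        "PRICE"     DECIMAL(15,2),
        "SALEPRICE" DECIMAL(15,2) ) )
LANGUAGE SQLSCRIPT
SQL SECURITY INVOKER
READS SQL DATA AS
BEGIN
    DECLARE lv_category NVARCHAR(40);
    DECLARE lv_discount DECIMAL(15,2) := 0.10;

    -- determine the product category for the requested product
    SELECT "CATEGORY" INTO lv_category
        FROM "sap.hana.democontent.epm.data::EPM.MD.Products"
        WHERE "PRODUCTID" = :im_productid;

    -- imperative logic: apply a larger discount to one category (illustrative values)
    IF :lv_category = 'Notebooks' THEN
        lv_discount := 0.20;
    END IF;

    ex_product_sale_price =
        SELECT "PRODUCTID", "CATEGORY", "PRICE",
               "PRICE" - ( "PRICE" * :lv_discount ) AS "SALEPRICE"
            FROM "sap.hana.democontent.epm.data::EPM.MD.Products"
            WHERE "PRODUCTID" = :im_productid;
END;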
Remember
Remember to replace the schema name and fully qualified procedure name with the ones you have
used. In the example above, the schema name is MYSCHEMA and the fully qualified procedure name
is demo.procedures::get_product_sales_price.
b. Save the changes you have made to the new stored procedure.
4. Preview the data in the editor.
c. Choose (Run).
The SQL result is displayed.
5. Open the catalog and check that the new stored procedure was successfully created in the correct
location.
Example: Catalog.MYSCHEMA.Procedures.demo.procedures::get_product_sales_price
6. Test the new stored procedure in the catalog.
a. In the context menu of the file (for example, demo.procedures::get_product_sales_price),
choose Invoke Procedure.
The SQL console opens.
Sample Code
CALL "MYSCHEMA"."demo.procedures::get_product_sales_price"
(PRODUCTID => 'HT-1000', PRODUCT_SALE_PRICE => ? );
In SQL, a user-defined function (UDF) enables you to build complex logic into a single database object. A scalar
UDF is a custom function that can be called in the SELECT and WHERE clauses of an SQL statement.
Prerequisites
● You have installed the SAP HANA Interactive Education (SHINE) HCODEMOCONTENT delivery unit (DU); this
DU contains the tables and views that you want to consume with the procedure you build in this tutorial.
● You have generated data to populate the tables and views provided by the SHINE delivery unit and used in
this tutorial. You can generate the data with tools included in the SHINE delivery unit.
Note
You might have to adjust the paths in the code examples provided to suit the package hierarchy in your SAP
HANA repository, for example, to point to the underlying content (demonstration tables and services)
referenced in the tutorial.
Context
A scalar user-defined function has a list of input parameters and returns the scalar values specified in the
RETURNS <return parameter list> option defined in the SQL function, for example, decimal(15,2).
The scalar UDF named apply_discount that you create in this tutorial performs the following actions:
Procedure
FUNCTION "SAP_HANA_DEMO"."sap.hana.democontent.epm.functions::apply_discount" (
    im_price DECIMAL(15,2),
    im_discount DECIMAL(15,2) )
RETURNS result DECIMAL(15,2)
LANGUAGE SQLSCRIPT
SQL SECURITY INVOKER AS
BEGIN
    result := :im_price - ( :im_price * :im_discount );
END;
b. Save the changes you have made to the new scalar UDF.
c. Check the catalog to ensure the new UDF was successfully created in the correct location, for example:
Catalog.SAP_HANA_DEMO.Functions.sap.hana.democontent.epm.functions::apply_dis
count
4. Use the new UDF in an SQL select statement.
b. To call the new UDF, enter the following SQL statement and choose (Run). Remember to modify the
paths to point to the correct locations in your environment, for example, the schema name, the
package location of the new UDF, and the location of the demo table referenced in the code.
select PRODUCTID, CATEGORY, PRICE,
    "SAP_HANA_DEMO"."sap.hana.democontent.epm.functions::apply_discount"( PRICE, 0.33 )
        as "SalePrice"
from "sap.hana.democontent.epm.data::EPM.MD.Products";
In SQL, a user-defined function (UDF) enables you to build complex logic into a single database object that you
can call from a SELECT statement. You can use a table user-defined function (UDF) to create a parameterized,
fixed view of the data in the underlying tables.
Prerequisites
● You have installed the SAP HANA Interactive Education (SHINE) HCODEMOCONTENT delivery unit (DU); this
DU contains the tables and views that you want to consume with the procedure you build in this tutorial.
● You have generated data to populate the tables and views provided by the SHINE delivery unit and used in
this tutorial. You can generate the data with tools included in the SHINE delivery unit.
Note
You might have to adjust the paths in the code examples provided to suit the package hierarchy in your SAP
HANA repository, for example, to point to the underlying content (demonstration tables and services)
referenced in the tutorial.
Context
A table UDF has a list of input parameters and must return a table of the type specified in RETURNS <return-
type>. The table UDF named get_employees_by_name_filter that you create in this tutorial performs the
following actions:
Procedure
FUNCTION "SAP_HANA_DEMO"."sap.hana.democontent.epm.functions::get_employees_by_name_filter"
    (lastNameFilter NVARCHAR(40))
RETURNS TABLE ( EMPLOYEEID   NVARCHAR(10),
                "NAME.FIRST" NVARCHAR(40),
                "NAME.LAST"  NVARCHAR(40),
                EMAILADDRESS NVARCHAR(255),
                ADDRESSID    NVARCHAR(10),
                CITY         NVARCHAR(40),
                POSTALCODE   NVARCHAR(10),
                STREET       NVARCHAR(60) )
LANGUAGE SQLSCRIPT
SQL SECURITY INVOKER AS
BEGIN
    RETURN
        select a."EMPLOYEEID", a."NAME.FIRST", a."NAME.LAST", a."EMAILADDRESS",
               a."ADDRESSID.ADDRESSID" as "ADDRESSID", b."CITY", b."POSTALCODE", b."STREET"
        from "sap.hana.democontent.epm.data::EPM.MD.Employees" as a
        inner join "sap.hana.democontent.epm.data::EPM.MD.Addresses" as b
            on a."ADDRESSID.ADDRESSID" = b.ADDRESSID
        where contains("NAME.LAST", :lastNameFilter, FUZZY(0.9));
END;
b. Save the changes you have made to the new table UDF.
c. Check the catalog to ensure the new UDF was successfully created in the correct location, for example:
Catalog.SAP_HANA_DEMO.Functions.sap.hana.democontent.epm.functions::get_emplo
yees_by_name_filter
b. In the SQL console, enter a value for the last name filter, for example, *ll*, and choose (Run).
Sample Code
Remember to modify the paths to point to the correct locations in your environment, for example,
the schema name and the package location of the new UDF.
select * from
    "SAP_HANA_DEMO"."sap.hana.democontent.epm.functions::get_employees_by_name_filter"('*ll*');
You can develop secure procedures using SQLScript in SAP HANA by observing the following
recommendations.
Using SQLScript, you can read and modify information in the database. In some cases, depending on the
commands and parameters you choose, you can create a situation in which data leakage or data tampering
can occur. To prevent this, SAP recommends using the following practices in all procedures.
● Mark each parameter using the keywords IN or OUT. Avoid using the INOUT keyword.
● Use the INVOKER keyword when you want the procedure to run with the privileges of the calling user.
The default, DEFINER, runs the procedure with the privileges of the procedure's owner.
● Mark read-only procedures using READS SQL DATA whenever it is possible. This ensures that the data and
the structure of the database are not altered.
Tip
● Ensure that the types of parameters and variables are as specific as possible. Avoid using VARCHAR, for
example. By reducing the length of variables you can reduce the risk of injection attacks.
Dynamic SQL
In SQLScript you can create dynamic SQL using one of the following commands: EXEC and EXECUTE
IMMEDIATE. These commands allow the use of variables in SQLScript in places where they are not
otherwise supported. In these situations you risk injection attacks unless you perform input validation within the
procedure. In some cases injection attacks can occur by way of data from another database table.
To avoid potential vulnerability from injection attacks, consider using the following methods instead of dynamic
SQL:
● Use static SQL statements. For example, use a static SELECT statement instead of EXECUTE
IMMEDIATE, and pass the values in the WHERE clause.
● Use server-side JavaScript to write this procedure instead of using SQLScript.
● Perform validation on input parameters within the procedure using either SQLScript or server-side
JavaScript.
● Use APPLY_FILTER if you need a dynamic WHERE condition.
● Use the SQL injection prevention function.
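As an illustration of the first two recommendations, the following SQLScript fragment contrasts a dynamic statement with its safer static equivalent. The schema, table, and variable names are placeholders, not from the original guide:

-- Risky: user input concatenated into dynamic SQL can enable injection
EXECUTE IMMEDIATE 'SELECT * FROM "MYSCHEMA"."PRODUCTS"
                   WHERE "CATEGORY" = ''' || :im_category || '''';

-- Safer: a static statement with the value bound in the WHERE clause
ex_products = SELECT * FROM "MYSCHEMA"."PRODUCTS"
              WHERE "CATEGORY" = :im_category;

-- Alternative: APPLY_FILTER for a dynamic WHERE condition on a known table
ex_filtered = APPLY_FILTER("MYSCHEMA"."PRODUCTS", :im_filter_string);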
Escape Code
You might need to use some SQL statements that are not supported in SQLScript, for example, the GRANT
statement. In other cases you might want to use the Data Definition Language (DDL) in which some <name>
elements, but not <value> elements, come from user input or another data source. The CREATE TABLE
statement is an example of where this situation can occur. In these cases you can use dynamic SQL to create
an escape from the procedure in the code.
To avoid potential vulnerability from injection attacks, consider using the following methods instead of escape
code:
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
Related Information
The SAP HANA SQLScript debugger allows you to debug and analyze procedures.
In a debug session, your procedures are executed in serial mode, not in parallel (not optimized). The stored
procedure call stack appears in the debug view, allowing you to view the nested calls and test the correctness
of the procedure logic. Note that the debugger is not intended to be used for evaluating performance.
Related Information
To debug procedures, you require the DEBUG privilege. To debug procedures in an external session, you also
need the ATTACH DEBUGGER privilege.
DEBUG (object privilege)
Authorizes debug functionality on a specific procedure or on the procedures of a specific schema.
It allows the user to:
● Set breakpoints
● Display the procedure source in the procedure viewer
● Inspect intermediate results
Required to debug procedures that are owned by another user. Procedures created in the repository
are owned by _SYS_REPO, and the user therefore needs to be granted the debug authorization.
Procedures created directly in the catalog using the SQL console are owned by the user creating
them, so that user does not require an additional debug authorization to debug the procedure.
ATTACH DEBUGGER (system privilege)
Authorizes the debugging of a procedure called by a different user. In addition, the DEBUG
privilege for the corresponding procedure is needed. Required to debug an external session
belonging to a different user. The session owner needs to grant the ATTACH DEBUGGER authorization
to the user who is debugging.
Grant the DEBUG privilege to your user. The DEBUG privilege is required to debug procedures that are owned
by another user.
Context
Procedures created in the repository are owned by _SYS_REPO, so to debug these procedures you need to be
granted the debug authorization. You can assign the DEBUG privilege on a procedure or schema.
The steps below describe how to assign this privilege using the security tool. Note that you can also grant this
authorization from the SQL console in the catalog using either of the following statements:
CALL _SYS_REPO.GRANT_PRIVILEGE_ON_ACTIVATED_CONTENT('debug','<object>','<user>');
CALL _SYS_REPO.GRANT_SCHEMA_PRIVILEGE_ON_ACTIVATED_CONTENT('debug','<schema>','<user>');
Procedure
5. Choose the (Add) button, select the relevant schema or procedure, and choose OK. The selected SQL
object is added to the table.
6. In the table, select the schema or procedure you just added.
A list of privileges for the object is shown.
7. Select the DEBUG option in the privileges list.
If you want to allow other users to debug your schema or procedure, select Yes under Grantable to
Others.
Related Information
Grant the ATTACH DEBUGGER privilege to another user to allow them to debug procedures in your sessions.
Context
The steps below describe how to assign the ATTACH DEBUGGER privilege using the security tool. Note that you
can also grant this authorization from the SQL console in the catalog using the following statement:
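The statement itself is not reproduced in this extract; based on standard SAP HANA SQL syntax, it takes the GRANT form shown below, where the user name is a placeholder:

-- allow <user> to attach the debugger to your sessions
GRANT ATTACH DEBUGGER TO <user>;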
Note
The user also needs the object privilege DEBUG on the relevant procedure.
Procedure
5. Choose the (Add) button. Your user name is added to the table.
6. Select the ATTACH DEBUGGER option and save your changes.
You can debug and analyze SQLScript procedures in the editor or catalog of the SAP HANA Web-based
Development Workbench.
Procedure
Option URL
Editor http://<WebServerHost>:80<SAPHANAinstance>/sap/hana/ide/editor
Catalog http://<WebServerHost>:80<SAPHANAinstance>/sap/hana/ide/catalog
Note
If the debugger is bound to another browser, you can also detach it, that is, release the debugger, by
refreshing your current browser.
Tip
When you work in different browsers (Chrome and Firefox, for example), each browser debugs
separately. To invoke a procedure in your current browser and debug it in another one, get the debug
token of your current browser and apply it in the other browser. Note that you might have to refresh
your current browser before giving the debug token to the other browser.
3. Open a procedure.
Editor In the Content tree, select the procedure to open it in the code editor.
Catalog In the catalog, expand the <schema> Procedures node and select the procedure.
The source code appears in the stored procedure viewer. It is read-only, but you can set
breakpoints.
Note
For scalar types, insert a value. For table types, enter the name of a catalog table
(schema.tablename) that contains the relevant input. For example, SYS.USERS.
The debug session begins and the SQL debug browser opens on the right, showing the status of the
session. The debugger will stop at the first breakpoint and the session will be suspended until you resume
it.
You can see a list of all of the breakpoints on the Breakpoints tab. From the Breakpoints tab, you can:
○ Disable specific breakpoints or disable all of them.
○ Remove specific breakpoints or remove all of them.
○ Click a breakpoint to see which line it belongs to in the source code.
6. Add an expression.
a. Choose (Toggle Expression Editor) or choose Add Expression from the context menu of a variable
you want to investigate further.
The Expression Editor opens.
c. Press ENTER .
The expression is now listed under the Expressions node.
Option Description
You can evaluate your local scalar and table variables in the top pane under the Variables node. It shows the
values of the scalar variables and the number of rows in each table.
Under the Expressions node, you can see the values of the expressions you added. To view the content of
tables, choose Display Content from the context menu of the expression. The results are shown on the
Expression Result tab.
8. View the content of the tables listed under the Variables node.
Select a table and from the context menu choose Display Content. The results are shown on a separate tab
named according to the table.
Results
The debug session is terminated when the procedure run has finished.
You can execute procedures in the SAP HANA studio and debug and analyze them in the SAP HANA Web-
based Development Workbench using an external session.
Prerequisites
You have been assigned the DEBUG authorization for the procedure or associated schema.
Procedure
1. In the SAP HANA studio, get the details you need to set up an external session.
External sessions can be set up through a connection ID or HANA user (including, optionally, an application
user).
Option Description
Connection ID
To get the connection ID, open the SQL console and execute the following statement:
SELECT SESSION_CONTEXT('CONN_ID') FROM DUMMY;
HANA user
This is your SAP HANA database user. You can optionally set an application user, which can be used as an
additional filter attribute.
To set an application user, open the SQL console and execute the following statement, replacing
<application user name> with the appropriate value:
To check the application user has been set correctly, execute the following statement:
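The two statements referred to above are not reproduced in this extract. A sketch using the APPLICATIONUSER session variable, which is the conventional name for this attribute (an assumption; verify against your release), would be:

-- set the application user for the current session
SET SESSION 'APPLICATIONUSER' = '<application user name>';

-- check that the application user has been set correctly
SELECT SESSION_CONTEXT('APPLICATIONUSER') FROM DUMMY;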
2. In the SAP HANA Web-based Development Workbench editor or catalog, set up the external debug
session.
Option Description
Set Filter Attribute
Enter the SAP HANA user (this is your SAP HANA database user) and, optionally, an application user,
if you set an application user in step 1.
Option Description
Editor In the Content tree, select the procedure to open it in the code editor.
Catalog In the catalog, expand the <schema> Procedures node and select the procedure.
The source code appears in the stored procedure viewer. It is read-only, but you can set
breakpoints.
Results
The debug session is terminated when the procedure run has finished.
SAP HANA extended application services (SAP HANA XS) provide applications and application developers with
access to the SAP HANA database using a consumption model that is exposed via HTTP.
In addition to providing application-specific consumption models, SAP HANA XS also host system services that
are part of the SAP HANA database, for example: search services and a built-in Web server that provides
access to static content stored in the SAP HANA repository.
The consumption model provided by SAP HANA XS focuses on server-side applications written in JavaScript
and making use of a powerful set of specially developed API functions. However, you can use other methods to
provide access to the data you want to expose in SAP HANA. For example, you can set up the Web-based data
access for XS classic applications using the following services:
● OData (v2)
You can map the persistence and consumption models with the Open Data Protocol (OData), a resource-
based Web protocol for querying and updating data.
● XMLA
Use the XML for Analysis (XMLA) interface to send a Multi-dimensional Expressions (MDX) query. XMLA
uses Web-based services to enable platform-independent access to XMLA-compliant data sources for
Online Analytical Processing (OLAP).
● SAP HANA REST API
SAP HANA REST API supports the Orion protocol 1.0, which allows development tools to access the SAP
HANA Repository (XS classic) in a convenient and standards-compliant way.
Related Information
In SAP HANA Extended Application Services (SAP HANA XS), the persistence model (for example, tables,
views, and stored procedures) is mapped to the consumption model that is exposed to clients - the
applications you write to extract data from the SAP HANA database.
You can map the persistence and consumption models with the Open Data Protocol (OData), a resource-based
Web protocol for querying and updating data. An OData application running in SAP HANA XS is used to provide
the consumption model for client applications exchanging OData queries with the SAP HANA database.
SAP HANA XS currently supports OData version 2.0, which you can use to send OData queries (for
example, using the HTTP GET method). Language encoding is restricted to UTF-8.
You can use OData to enable clients to consume authorized data stored in the SAP HANA database. OData
defines operations on resources using RESTful HTTP commands (for example, GET, PUT, POST, and DELETE)
and specifies the URI syntax for identifying the resources. Data is transferred over HTTP using either the Atom
(XML) or the JSON (JavaScript) format.
Note
For modification operations, for example, CREATE, UPDATE, and DELETE, SAP HANA XS supports only the
JSON format (“content-type: application/json”).
Applications running in SAP HANA XS enable accurate control of the flow of data between the presentational
layer, for example, in the Browser, and the data-processing layer in SAP HANA itself, where the calculations are
performed, for example, in SQL or SQLScript. If you develop and deploy an OData service running in SAP HANA
XS, you can take advantage of the embedded access to SAP HANA that SAP HANA XS provides; the embedded
access greatly improves end-to-end performance.
OData is a resource-based web protocol for querying and updating data. OData defines operations on
resources using HTTP commands (for example, GET, PUT, POST, and DELETE) and specifies the uniform
resource identifier (URI) syntax to use to identify the resources.
Note
OData makes it easier for SAP, for partners, and for customers to build standards-based applications for
many different devices and on various platforms, for example, applications that are based on a lightweight
consumption of SAP and non-SAP business application data.
The main aim of OData is to define an abstract data model and a protocol which, combined, enable any client
to access data exposed by any data source. Clients might include Web browsers, mobile devices, business-
intelligence tools, and custom applications (for example, written in programming languages such as PHP or
Java); data sources can include databases, content-management systems, the Cloud, or custom applications
(for example, written in Java).
In this tutorial, you create a simple OData service that exposes an SAP HANA database table as an OData
collection so that it can be analyzed and displayed by client applications.
Prerequisites
You have the privileges granted by the role sap.hana.ide.roles::EditorDeveloper; this role is included in the
parent role sap.hana.ide.roles::Developer.
Context
SAP HANA Extended Application Services allows you to create OData services without having to write any
server-side code. To create an OData service from an existing SAP HANA table (or view), you define a service
definition file with the suffix .xsodata.
Procedure
a. Select the helloodata package and from the context menu choose New File .
b. Enter the file name HELLO_ODATA.hdbschema and choose Create.
c. Enter the following code in the HELLO_ODATA.hdbschema file:
schema_name="HELLO_ODATA";
a. Select the helloodata package and from the context menu choose New File .
b. Enter the file name otable.hdbtable and choose Create.
c. Enter the following code in the otable.hdbtable file:
table.schemaName = "HELLO_ODATA";
table.tableType = COLUMNSTORE;
table.columns = [
    {name = "Col1"; sqlType = VARCHAR;  nullable = false; length = 20; comment = "dummy comment";},
    {name = "Col2"; sqlType = INTEGER;  nullable = false;},
    {name = "Col3"; sqlType = NVARCHAR; nullable = true;  length = 20; defaultValue = "Defaultvalue";},
    {name = "Col4"; sqlType = DECIMAL;  nullable = false; precision = 12; scale = 3;}];
table.primaryKey.pkcolumns = ["Col1", "Col2"];
a. Select the helloodata package and from the context menu choose New File .
b. Enter the file name hello.xsodata and choose Create.
c. Enter the following code in the hello.xsodata file:
service {
    "helloodata::otable";
}
7. To test the new OData service, select the hello.xsodata file and choose (Run).
The root URI of the OData service is passed to a new browser tab and an HTTP request executed.
The correctly addressed URI returns the list of resources exposed by the OData service, as shown below. In
this example, an entity set otable has been created for the table defined in the hdbtable file
helloodata:otable.hdbtable. The default name of the entity set is the name of the repository object
file, here "otable":
Note
You can view the same output as above in JSON format by appending the parameter format=json to
the URL: hello.xsodata/?$format=json
8. To view the details of the data model used by the OData service, add the /$metadata parameter to the
end of the URL and refresh your browser window.
The field descriptions for all the attributes of the OData service are now shown in the generated metadata
document, as can be seen in the example below. All information about the table, such as its properties,
data types, and primary key, is obtained from the database catalog:
For example, enter the following values: Col1 = test1, Col2 = 1, Col3 = value 1, Col4 = 1.1.
d. Choose Generate Data to create a set of test data based on random and/or fixed values. Specify the
number of rows you want to create and choose Generate.
You have successfully exposed the database entity as an OData service by means of SAP HANA extended
applications services and tested it directly within a browser. You now have a functional OData service that can
be called by an application, typically from within a Web page.
An OData service exposes data stored in database tables or views as OData collections for analysis and display
by client applications. However, first of all, you need to ensure that the tables and views to expose as an OData
collection actually exist.
Procedure
call
_SYS_REPO.GRANT_SCHEMA_PRIVILEGE_ON_ACTIVATED_CONTENT('select','<SCHEMANAME>',
'<username>');
The OData service definition is a configuration file you use to specify which data (for example, views or tables)
is exposed as an OData collection for analysis and display by client applications.
Prerequisites
You have defined the data to expose with the OData application, for example, at least the following:
● A database schema
● A database table
Context
An OData service for SAP HANA XS is defined in a text file with the file extension .xsodata, for example,
OdataSrvDef.xsodata. The file resides in the package hierarchy of the OData application and must contain
at least the entry service {}, which would generate an operational OData service with an empty service catalog
and an empty metadata file.
Procedure
Note
The file containing the OData service definition must be placed in the root package of the OData
application for which the service is intended.
Note
The XS OData editor will detect syntax errors, highlight keywords, and provide code assistance.
The following example shows a simple OData service definition exposing a simple table:
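The example code itself is not reproduced in this extract; based on the description that follows, it would take approximately this form (a sketch):

service {
    "sample.odata::table" as "MyTable";
}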
This service definition exposes a table defined in the file sample.odata:table.hdbtable and creates
an EntitySet for this entity named MyTable. The specification of an alias is optional. If omitted, the default
name of the EntitySet is the name of the repository object file, in this example, table.
4. Save the OData service definition.
Tip
To run an OData service, select the OData service file and choose (Run).
Related Information
The OData service definition is the mechanism you use to define what data to expose with OData, how, and to
whom. Data exposed as an OData collection is available for analysis and display by client applications, for
example, a browser that uses functions provided by an OData client library running on the client system.
To expose information by means of OData to applications using SAP HANA XS, you must define database views
that provide the data with the required granularity. Then you create an OData service definition, which is a file
you use to specify which database views or tables are exposed as OData collections.
Note
SAP HANA XS supports OData version 2.0, which you can use to send OData queries (for example, using
the HTTP GET method). Language encoding is restricted to UTF-8.
An OData service for SAP HANA XS is defined in a text file with the file suffix .xsodata, for example,
OdataSrvDef.xsodata. The file must contain at least the entry service {}, which would generate a
completely operational OData service with an empty service catalog and an empty metadata file.
In the OData service-definition file, you can use the following ways to name the SAP HANA objects you want to
expose by OData:
Note
The syntax to use in the OData service-definition file to reference objects depends on the object type, for
example, repository (design-time) or database catalog (runtime).
● Repository objects
Expose an object using the object's repository (design-time) name in the OData service-definition file. This
method of exposing database objects using OData enables the OData service to be automatically updated
if the underlying repository object changes. Note that a design-time name can be used to reference
analytic and calculation views; it cannot be used to reference SQL views. The following example shows how
to include a reference to a table in an OData service definition using the table's design-time name.
service {
"acme.com.odata::myTable" as "myTable";
}
Note
Calculation views are accessible from within .xsodata files only by reference to the design-time name.
In general, it is recommended to use design-time names whenever possible for calculation views or
common tables. With design-time names, the cross references are recreated during activation (for
example, for where-used), which means changes are visible automatically.
● Database objects
Expose an object using the object's database catalog (runtime) name. The support for database objects is
mainly intended for existing or replicated objects that do not have a repository design-time representation.
The following example shows how to include a reference to a table in an OData service definition using the
table's runtime name.
service {
"mySchema"."myTable" as "MyTable";
}
Note
It is strongly recommended not to use catalog (runtime) names in an OData service-definition. The use
of catalog object names is only enabled in a service-definition because some objects do not have a
design-time name. If at all possible, use the design-time name to reference objects in an OData service-
definition file.
By default, all entity sets and associations in an OData service are writable; that is, they can be modified with
CREATE, UPDATE, or DELETE requests. However, you can prevent the execution of a modification request by
setting the appropriate keyword (create, update, or delete) with the forbidden option in the OData service
definition. The following example of an OData service definition for SAP HANA XS shows how to prevent any
modification to the table myTable that is exposed by the OData service. Any attempt to make a modification to
the table is rejected:
service {
    "sap.test::myTable"
        create forbidden
        update forbidden
        delete forbidden;
}
For CREATE requests, for example, to add a new entry to either a table or an SQL view exposed by an OData
service, you must specify an explicit key (not a generated key); the key must be included in the URL as part of
the CREATE request. For UPDATE and DELETE requests, you do not need to specify the key explicitly (and if
you do, it will be ignored); the key is already known, since it is essential to specify which entry in the table or
SQL view must be modified with the UPDATE or DELETE request.
Note
Since SQLScript does not support IN/OUT table parameters, it is not possible to use a sequence to
create an entry in a table or view exposed by an OData service. However, you can use XS JavaScript exits to
update a table with a generated value.
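The key-in-URL convention for UPDATE and DELETE requests described above can be sketched as follows (illustrative only, not part of SAP HANA XS; the server, port, and entity-set names are hypothetical):

```javascript
// Sketch: how an OData client addresses a single entry with a key
// predicate in the URL, as needed for UPDATE and DELETE requests.
function entryUrl(serviceRoot, entitySet, key) {
  // Numeric keys are used as-is; string keys are quoted per OData URL conventions.
  const literal = typeof key === "number" ? String(key) : "'" + key + "'";
  return serviceRoot + "/" + entitySet + "(" + literal + ")";
}

// Hypothetical service root and entity set:
const root = "http://myHANAServer:8000/odata/services/myService.xsodata";
// A PUT (UPDATE) or DELETE request would be sent to a URL like:
console.log(entryUrl(root, "MyTable", 42));
// http://myHANAServer:8000/odata/services/myService.xsodata/MyTable(42)
```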
Related Information
OData Service Definition: SQL-EDM Type Mapping (XS Advanced) [page 326]
OData Service Definitions [page 297]
During the activation of the OData service definition, SQL types defined in the service definition are mapped to
EDM types according to a mapping table. For example, the SQL type "Time" is mapped to the EDM type
"Edm.Time"; the SQL type "Decimal" is mapped to the EDM type "Edm.Decimal"; the SQL types "Real" and
"Float" are mapped to the EDM type "Edm.Single".
Note
The OData implementation in SAP HANA Extended Application Services (SAP HANA XS) does not support
all SQL types.
In the following example, the SQL types of columns in a table are mapped to the EDM types in the properties of
an entity type.
The OData service definition provides a list of keywords that you use in the OData service-definition file to
enable important features. For example, the following list illustrates the most commonly used features in
an OData service-definition and, where appropriate, indicates the keyword to use to enable the feature:
● Aggregation
The results of aggregations on columns change dynamically, depending on the grouping conditions. As a
result, aggregation cannot be done in SQL views; it needs to be specified in the OData service definition
itself. Depending on the type of object you want to expose with OData, the columns to aggregate and the
function used must be specified explicitly (explicit aggregation) or derived from metadata in the database
(derived aggregation). Note that aggregated columns cannot be used in combination with the $filter
query parameter, and aggregation is only possible with generated keys.
● Association
Define associations between entities to express relationships between entities. With associations it is
possible to reflect foreign key constraints on database tables, hierarchies and other relations between
database objects.
● Key Specification
The OData specification requires an EntityType to denote a set of properties forming a unique key. In SAP
HANA, only tables can have a unique key, the primary key. All other (mostly view) objects require you to
specify a key for the entity. The OData service definition language (OSDL) enables you to do this by
denoting a set of existing columns or by generating a local key. Bear in mind that local keys are transient;
they exist only for the duration of the current session and cannot be dereferenced.
Note
OSDL is the language used to define a service definition; the language includes a list of keywords that
you use in the OData service-definition file to enable the required features.
The OData service definition describes how data exposed in an end point can be accessed by clients using the
OData protocol.
Each of the examples listed below is explained in a separate section. The examples show how to use the OData
Service Definition Language (OSDL) in the OData service-definition file to generate an operational OData
service that enables clients to use SAP HANA XS to access the OData end point you set up.
● Empty Service
● Namespace Definition
● Object Exposure
● Property Projection
● Key Specification
● Associations
● Aggregation
● Parameter Entity Sets
● ETag Support
● Nullable Properties
An OData service for SAP HANA XS is defined by a text file containing at least the following line:
service {}
A service file with the minimal content generates an empty, completely operational OData service with an
empty service catalog and an empty metadata file:
Note
● Examples and graphics are provided for illustration purposes only; some URLs may differ from the ones
shown.
http://<myHANAServer>:<port>/odata/services/<myService>.xsodata
{
"d" : {
"EntitySets" : []
}
}
An empty service metadata document consists of one Schema containing an empty EntityContainer. The
name of the EntityContainer is the name of the .xsodata file, in this example "empty".
Every .xsodata file must define its own namespace by using the namespace keyword:
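A minimal sketch of the namespace keyword (the package name sample.odata is a hypothetical example):

```
service namespace "sample.odata" {
}
```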
The resulting service metadata document has the specified schema namespace:
Note
Examples and graphics are provided for illustration purposes only; some URLs may differ from the ones
shown.
http://<myHANAServer>:<port>/odata/services/<myService>.xsodata/$metadata
In the examples provided to illustrate object exposure, the following definition of a table applies:
Database Objects
Similar to the exposure of an object by using the repository design-time name is the exposure by the database
name:
service {
"<mySchema>"."sample.odata::table" as "MyTable";
}
The service exposes the same table by using the database catalog name of the object and the name of the
schema in which the table was created (here, the placeholder <mySchema>).
If the object you want to expose with an OData service has more columns than you actually want to expose, you
can use SQL views to restrict the number of columns selected in the SELECT.
Nevertheless, SQL views are sometimes not appropriate, for example with calculation views; for these
cases you can restrict the properties in the OData service definition in two ways: by
providing an including or an excluding list of columns.
Including Properties
You can specify the columns of an object that have to be exposed in the OData service by using the with
keyword. Key fields of tables must not be omitted.
service {
"sample.odata::table" as "MyTable" with ("ID","Text");
}
Note
Examples and graphics are provided for illustration purposes only; some URLs may differ from the ones
shown.
http://<myHANAServer>:<port>/odata/services/<myService>.xsodata/$metadata
Excluding Properties
The opposite of the with keyword is the without keyword, which enables you to specify which columns you
do NOT want to expose in the OData service:
service {
"sample.odata::table" as "MyTable" without ("Text","Time");
}
The generated EntityType then does NOT contain the properties derived from the specified columns:
http://<myHANAServer>:<port>/odata/services/<myService>.xsodata/$metadata
The OData specification requires an EntityType to denote a set of properties forming a unique key. In SAP HANA,
only tables may have a unique key, the primary key. For all other (mostly view) objects, you need to specify a key
for the entity.
In OSDL, you can specify a key for an entity/object by denoting a set of existing columns or by generating a key.
For the examples illustrating key specification, we use the following SQL view, which selects all data from the
specified table.
VIEW "sample.odata::view" as select * from "sample.odata::table"
If the object has a set of columns that form a unique key, you can specify them as the key for the entity. These
key properties are always selected from the database, even if they are omitted in the $select query
option. Therefore, explicit keys are not suitable for calculation views and analytic views, because the selection has an
impact on the result.
service {
"sample.odata::view" as "MyView" key ("ID","Text");
}
The metadata document for the exposure of the view above is almost identical to the metadata document for
repository objects. Only the key is different; it now consists of two columns:
Note
Examples and graphics are provided for illustration purposes only; some URLs may differ from the ones
shown.
http://<myHANAServer>:<port>/odata/services/<myService>.xsodata/$metadata
The OData infrastructure cannot check whether your specified keys are unique, so be careful when
choosing keys.
For objects that do not have a unique key in their results, for example, calculation views or aggregated tables,
you can generate a locally valid key. This key value numbers the results starting with 1 and is not meant for
dereferencing the entity; you cannot use this key to retrieve the entity. The key is valid only for the duration of
the current session and is used only to satisfy OData's need for a unique ID in the results. The property type of
a generated local key is Edm.String and cannot be changed.
service {
"sample.odata::view" as "MyView" key generate local "GenID";
}
http://<myHANAServer>:<port>/odata/services/<myService>.xsodata/$metadata
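The numbering behavior of a generated local key can be sketched as follows (illustrative only; this is not SAP's implementation):

```javascript
// Sketch: a generated local key such as "GenID" numbers the results of a
// response starting with 1, and its value is a string (Edm.String).
function addLocalKey(rows, keyName) {
  return rows.map((row, i) => Object.assign({ [keyName]: String(i + 1) }, row));
}

const rows = [{ Text: "a" }, { Text: "b" }];
console.log(addLocalKey(rows, "GenID"));
// [ { GenID: '1', Text: 'a' }, { GenID: '2', Text: 'b' } ]
```

The key is valid only within one response; it cannot be used to dereference an entity later.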
You can define associations between entities to express relationships between entities. With associations it is
possible to reflect foreign key constraints on database tables, hierarchies and other relations between
database objects. OSDL supports simple associations, where the information about the relationship is stored
in one of the participating entities, and complex associations, where the relationship information is stored in a
separate association table.
Associations themselves are freestanding. On top of them you can specify which of the entities participating in
the relationship can navigate over the association to the other entity by creating NavigationProperty
objects.
For the examples used to illustrate OData associations, we use the tables customer and order:
The definition of an association requires a name and references the two exposed entities whose columns hold
the relationship information. To distinguish the ends of the association, you use the
keywords principal and dependent. In addition, you must denote the multiplicity for each end of the
association.
service {
"sample.odata::customer" as "Customers";
"sample.odata::order" as "Orders";
association "Customer_Orders" with referential constraint principal
"Customers"("ID") multiplicity "1" dependent "Orders"("CustomerID") multiplicity
"*";
}
The association in the example above with the name Customer_Orders defines a relationship between the
table customer, identified by its EntitySet name Customers, on the principal end, and the table order,
identified by its entity set name Orders, on the dependent end. The columns involved at both ends are denoted
in parentheses after the name of the corresponding entity set. The multiplicity keyword on each end of the
association specifies its cardinality; in this example, one-to-many.
The with referential constraint syntax ensures that the referential constraint check is enforced at
design time, for example, when you activate the service definition in the SAP HANA repository. The referential
constraint information appears in the metadata document.
Note
SAP strongly recommends that you use the with referential constraint syntax.
The number of columns involved in the relationship must be equal for both ends of the association, and their
order in the list is important. The order specifies which column in one table is compared to which column in the
other table. In this simple example, the column customer.ID is compared to order.CustomerID in the
generated table join.
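The pairwise comparison of the listed columns can be sketched as follows (illustrative only, not SAP code):

```javascript
// Sketch: the columns at the two ends of an association are compared
// pairwise, in list order, to form the generated join condition.
function joinCondition(leftTable, leftCols, rightTable, rightCols) {
  if (leftCols.length !== rightCols.length) {
    throw new Error("both ends must list the same number of columns");
  }
  return leftCols
    .map((c, i) => leftTable + "." + c + " = " + rightTable + "." + rightCols[i])
    .join(" AND ");
}

console.log(joinCondition("customer", ["ID"], "order", ["CustomerID"]));
// customer.ID = order.CustomerID
```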
As a result of the activation of the service definition above, an AssociationSet named Customer_Orders
and an Association named Customer_OrdersType are generated:
http://<myHANAServer>:<port>/odata/services/<myService>.xsodata/$metadata
Complex Associations
For the following example of a complex association, an additional table named knows is introduced that
contains a relationship between customers.
service {
"sample.odata::customer" as "Customers";
"sample.odata::order" as "Orders";
association "Customer_Orders"
principal "Customers"("ID") multiplicity "*"
dependent "Customers"("ID") multiplicity "*"
over "sample.odata::knows" principal ("KnowingCustomerID") dependent
("KnownCustomerID");
}
With the keywords principal and dependent after over, you specify which columns from the
association table are joined with the principal and dependent columns, respectively, of the related entities. The
number of columns must match in pairs, and their order in the list is important.
The generated Association in the metadata document is similar to the one created for a simple association
except that the ReferentialConstraint is missing:
http://<myHANAServer>:<port>/odata/services/<myService>.xsodata/$metadata
Navigation Properties
Defining an association alone does not make it possible to navigate from one entity to another. Associations need to
be bound to entities by a NavigationProperty. You can create them by using the keyword navigates:
service {
"sample.odata::customer" as "Customers" navigates ("Customer_Orders" as
"HisOrders");
"sample.odata::order" as "Orders";
association "Customer_Orders" principal "Customers"("ID") multiplicity "1"
dependent "Orders"("CustomerID") multiplicity "*";
}
The example above shows that it is possible to navigate from Customers over the association Customer_Orders
via the NavigationProperty named "HisOrders". In the following example, the additional keyword from
principal specifies the association end from which the navigation starts; this is necessary for the self-
association Customer_Recruit, in which the entity Customers appears at both ends:
service {
"sample.odata::customer" as "Customers"
navigates ("Customer_Orders" as "HisOrders","Customer_Recruit" as
"Recruit" from principal);
"sample.odata::order" as "Orders";
association "Customer_Orders" principal "Customers"("ID") multiplicity "1"
dependent "Orders"("CustomerID") multiplicity "*";
association "Customer_Recruit" principal "Customers"("ID") multiplicity
"1" dependent "Customers"("RecruitID") multiplicity "*";
}
http://<myHANAServer>:<port>/odata/services/<myService>.xsodata/$metadata
The results of aggregations on columns change dynamically depending on the grouping conditions. This means
that aggregation cannot be performed in SQL views; it needs to be specified in the OData service definition
itself. Depending on the type of object to expose, you either explicitly specify the columns to aggregate and
the functions to use, or derive them from metadata in the database.
In general, aggregations have no consequences for the metadata document; they only affect the semantics of
the corresponding properties at runtime. The grouping condition for the aggregation contains all selected non-
aggregated properties. Furthermore, aggregated columns cannot be used in $filter, and aggregation is only
possible with generated keys.
Derived Aggregation
The simplest way to define aggregations of columns in an object is to derive this information from metadata in
the database. The only objects with this information are calculation views and analytic views; for all other
objects, you must specify the aggregation explicitly. The following service definition derives the aggregation for
a calculation view:
service {
"sample.odata::calc" as "CalcView"
keys generate local "ID"
aggregates always;
}
Explicit Aggregation
The example for the explicit aggregation is based on the following table definition:
sample.odata:revenues.hdbtable
For objects that do not have the metadata necessary to derive the aggregation, you can aggregate columns
by explicitly denoting the column names and the functions to use, as illustrated in the following
example of a service definition: sample.odata:aggrexpl.xsodata
service {
"sample.odata::revenues" as "Revenues"
keys generate local "ID"
aggregates always (SUM of "Amount");
}
The results of the entity set Revenues always contain the aggregated value of the column Amount. To extract
the aggregated revenue amount per year, add $select=Year,Amount to your requested URI.
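The effect of the explicit aggregation combined with $select=Year,Amount can be sketched as follows (illustrative only, not SAP code; the helper name and sample data are hypothetical):

```javascript
// Sketch: the selected non-aggregated columns (here Year) form the
// grouping condition, and the aggregated column (Amount) is summed.
function sumByGroup(rows, groupCol, aggCol) {
  const totals = new Map();
  for (const row of rows) {
    const key = row[groupCol];
    totals.set(key, (totals.get(key) || 0) + row[aggCol]);
  }
  return [...totals].map(([g, sum]) => ({ [groupCol]: g, [aggCol]: sum }));
}

const revenues = [
  { Year: 2013, Amount: 10 },
  { Year: 2013, Amount: 5 },
  { Year: 2014, Amount: 7 },
];
console.log(sumByGroup(revenues, "Year", "Amount"));
// [ { Year: 2013, Amount: 15 }, { Year: 2014, Amount: 7 } ]
```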
Parameter entity sets can be generated for calculation views by adding the parameters via entity clause to
the entity, as illustrated in the following service-definition example:
service {
"sample.odata::calc" as "CalcView"
keys generate local "ID"
parameters via entity;
}
You can also specify explicit names for the parameter entity set and for the navigation property that points to
the results:
service {
"sample.odata::calc" as "CalcView"
keys generate local "ID"
parameters via entity "CVParams" results property "Execute";
}
With the definition above, the name of the parameter entity set is CVParams, and the name of the
NavigationProperty for the results is Execute.
In an OData service definition, you can enable navigation between an entity and a parameterized entity. This
feature is particularly useful if you need to have access to individual entries in a parameterized entity set, for
example, a calculation view with parameters. If you need to access individual entries in an entity set that has
parameters, you must expose the parameters as keys. If you do not need to have access to individual entries in
an entity set, you can use the key generate local option to generate a pseudo key.
To enable navigation between an entity and a parameterized entity, you must perform the following steps:
Enabling navigation between an entity and a parameterized entity is only possible if the parameters are part of
the entity-type key in the OData service definition file. To make the parameters part of the key of the target
entity, use the via key syntax, as illustrated in the following example:
service {
"sap.test::calcview" key ("theKeyColumns") parameters via key and entity;
}
You also have to define an association between the source and target entities, for example, with additional
entries introduced by the via parameters keyword, as illustrated in the following example:
service {
"sap.test::table" as "Tab" navigates ("avp" as "ViewNav");
"sap.test::calcview" as "View" key ("theKeyColumns") parameters via key and
entity;
}
Note
The parameters you define in the dependent end of the association must be the first properties in the list. In
addition, the parameters specified must be given in the same order as they are specified in the view, as
illustrated in the following example:
In the example immediately above, the principal “Tab” has three columns that contain the information that is
required to navigate to the dependent “View” in the association.
● “col1”
The value of “col1” should be set for “parameter1”
● “col2”
The value of “col2” should be set for “parameter2”
● “col3”
The column “col3” contains additional information that is not passed as an input parameter, but as
part of a WHERE condition.
Note
This navigation property cannot be used in combination with the OData query options $expand, $filter
and $orderby.
This feature allows a service to define the fields that are to be included in the concurrency check.
You can now use entity tags (ETags) for optimistic concurrency control. If you choose to use this feature, then
you must enable it per entity in the .xsodata file. Enabling this feature per entity allows for the concurrency
control to be applied to multiple fields. The following code example provides information about how to do this.
Sample Code
service
{ entity "sap.test.odata.db.views::Etag" as "EtagAll"
key ("KEY_00") concurrencytoken;
entity "sap.test.odata.db.views::Etag" as "EtagNvarchar"
key ("KEY_00") concurrencytoken ("NVARCHAR_01","INTEGER_03");
}
If you specify concurrencytoken only, then all properties, except the key properties, are used to calculate the
ETag value. If you provide specific properties, then only those properties are used for the calculation.
Note
You cannot specify concurrencytoken on aggregated properties that use the AVG (average) aggregation
method.
During the 'Create' phase, the XSODATA layer generates all entity properties automatically. Since the
properties are not nullable, consumers of the code are forced to pass dummy values into them. However,
OData supports $filter and $orderby conditions on the null value. This means that it is now possible to
treat null as a value, if you enable it. You can enable this behavior for the entire service only, not per entity.
The following code example provides information about how you can do this.
Sample Code
service {
…
}
settings {
support null;
content cache-control "no-store";
metadata cache-control "max-age=86401,must-revalidate";
hints
"NO_CALC_VIEW_UNFOLDING";
limits
max_records = 10,
max_expanded_records = 30;
}
If you enable support for null, then $filter requests, such as $filter=NVARCHAR_01 eq null, are
possible. Otherwise null is rejected with an exception. If you do not enable the null support, then the default
behavior applies.
Note
null values are “ignored” in comparisons: “ignored” in the sense that if you compare a column with null
and the column by definition contains no null values, no record passes the filter.
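The null-comparison semantics can be sketched as follows (illustrative only, not SAP code):

```javascript
// Sketch: with "support null" enabled, $filter=COL eq null matches entries
// whose column is null; comparing against a concrete value never matches
// null entries (SQL-style semantics).
function filterEq(rows, col, value) {
  return rows.filter((r) =>
    value === null ? r[col] === null : r[col] !== null && r[col] === value
  );
}

const rows = [{ A: "x" }, { A: null }];
console.log(filterEq(rows, "A", null).length); // 1
console.log(filterEq(rows, "A", "x").length);  // 1
```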
Related Information
You can create a service to configure the cache settings for the $metadata request to optimize performance.
When OData services are called, repeated requests are made for the $metadata document. Since
changes to the underlying entity definitions occur rarely, SAP has enabled the option to configure caching for
the $metadata response. The following code example provides information about how you can do this.
Sample Code
service {
...
}
settings {
support null;
content cache-control "no-store";
metadata cache-control "max-age=86401,must-revalidate";
hints
"NO_CALC_VIEW_UNFOLDING";
limits
max_records = 10,
max_expanded_records = 30;
}
content cache-control
You can use the content cache-control parameter to set the HTTP header value that is used for cache
control in the data responses, for example, "no-store" in the code above.
The value you specify must be enclosed in double quotes (for example, "<value>"), and multiple parameters
must be separated by a comma.
Tip
You can include any value supported by the HTTP specification for cache-control. "no-store" indicates
that the cache should not store any details of the client request or server response.
metadata cache-control
You can use the metadata cache-control parameter to set the HTTP header value that is used for
cache control in the metadata response, for example, "max-age=86401,must-revalidate" in the code above.
The value you specify for metadata cache-control must be enclosed in double quotes (for example,
"<value>"), and multiple elements must be separated by a comma, as illustrated in the example above.
Tip
You can include any value supported by the HTTP specification for cache-control. In the example above,
"must-revalidate" indicates that the cache must verify the status of stale resources and must not serve
them without successful revalidation.
Related Information
If you provide a custom exit for an OData write request, the code has to be provided in the form of an SQLScript
procedure with a signature that follows specific conventions. The following types of write exit are supported for
OData write requests in SAP HANA XS:
● Validation Exits
These exits are for validation of input data and data consistency checks. They can be registered for create,
update, and delete events and executed before or after the change operation, or before or after the commit
operation. You can specify a maximum of four validation exits per change operation; the exit is registered
for the corresponding event with the respective keyword: “before”, “after”, “precommit” or “postcommit”.
● Modification Exits
You can define custom logic to create, update, or delete an entry in an entity set. If a modification exit is
specified, it is executed instead of the generic actions provided by the OData infrastructure. You use the
using keyword to register the exit.
If registered, the scripts for the exits are executed in the order shown in the following table:
Batch Insert: before(1), using(1), after(1), before(2), using(2), after(2), …, precommit(1), precommit(2),
postcommit(1), postcommit(2)
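The execution order for a batch insert can be reproduced with a small sketch (illustrative only, not SAP code):

```javascript
// Sketch: for a batch insert of n entries, before/using/after run per
// entry, then all precommit exits, then all postcommit exits.
function batchInsertOrder(n) {
  const order = [];
  for (let i = 1; i <= n; i++) {
    order.push("before(" + i + ")", "using(" + i + ")", "after(" + i + ")");
  }
  for (let i = 1; i <= n; i++) order.push("precommit(" + i + ")");
  for (let i = 1; i <= n; i++) order.push("postcommit(" + i + ")");
  return order;
}

console.log(batchInsertOrder(2).join(", "));
// before(1), using(1), after(1), before(2), using(2), after(2), precommit(1), precommit(2), postcommit(1), postcommit(2)
```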
The signature of a registered script has to follow specific rules, depending on whether it is registered for entity
or link write operations and depending on the operation itself. The signature must also have table-typed
parameters for both input and output:
For entity write operations, the methods registered for the CREATE operation are passed a table containing the
new entry that must be inserted into the target table; the UPDATE operation receives the entity both before and
after the modification; the DELETE operation receives the entry that must be deleted. The table type of the
parameters (specified with the EntityType keyword in the table below) corresponds to the types of the exposed
entity.
The parameter signatures, by operation, are as follows (event keywords: before, after, precommit, using):
● CREATE: IN new EntityType, OUT error ErrorType
● UPDATE: IN new EntityType, IN old EntityType, OUT error ErrorType
● DELETE: IN old EntityType, OUT error ErrorType
For link write operations, all exits that are executed before the commit operation take two table-typed input
parameters and one table-typed output parameter. The first parameter must correspond to the structure of
the entity type at the principal end of the association; the second parameter must correspond to the
dependent entity type.
before, after, precommit, using: IN principal PrincipalEntityType, IN dependent DependentEntityType, OUT
error ErrorType
Note
Parameter types (IN, OUT) are checked during activation; the data types of table type columns are not
checked.
The OUT parameter enables you to return error information. The first row in the OUT table is then serialized as
inner error in the error message. If no error occurs, the OUT table must remain empty. The structure of the
table type ErrorType is not restricted. Columns with the special names HTTP_STATUS_CODE and
ERROR_MESSAGE are mapped to common information in the OData error response. The content of columns with
other names is serialized into the inner error part of the error message, which allows the return of custom
error information.
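How the first row of the OUT table could map to an OData error response can be sketched as follows (illustrative only; the exact response layout is an assumption, not SAP's documented format):

```javascript
// Sketch: HTTP_STATUS_CODE and ERROR_MESSAGE feed the common part of the
// OData error response; all other columns go into the inner error.
function toODataError(errorRows) {
  if (errorRows.length === 0) return null; // empty OUT table: no error
  const { HTTP_STATUS_CODE, ERROR_MESSAGE, ...rest } = errorRows[0];
  return {
    status: HTTP_STATUS_CODE,
    error: { message: { value: ERROR_MESSAGE }, innererror: rest },
  };
}

const out = [{ HTTP_STATUS_CODE: 400, ERROR_MESSAGE: "invalid ID", DETAIL: "value must be >= 1000" }];
console.log(toODataError(out).status);                  // 400
console.log(toODataError(out).error.innererror.DETAIL); // value must be >= 1000
```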
Note
If the SQLScript procedure throws an exception or writes an error message to the OUT parameter table,
the OData write operation is aborted. If more than one error message is added, only the content of the first
row is returned in the resulting error message. Any scripts registered for the postcommit event must not
have an OUT parameter as the write operation cannot be aborted at such a late stage, even in the event of
an error.
"sample.odata::error" AS TABLE (
"HTTP_STATUS_CODE" INTEGER,
"ERROR_MESSAGE" NVARCHAR(100),
"DETAIL" NVARCHAR(100)
)
The following example shows how information is extracted from the error table if an error occurs during the
execution of a create procedure for an OData write operation:
Use SQLScript to create a custom validation exit which runs server-side verification and data-consistency
checks for an OData update operation.
Prerequisites
Context
In this tutorial, you see how to register an SQL script for an OData update operation; the script verifies, before
the execution of the update operation, that the updated value is larger than the previous one. In the example
shown in this tutorial, you define the table to be updated and a table type for the error output parameter of the
exit procedure.
The table to expose is defined in sample.odata:table.hdbtable, which should look like the following
example:
2. Create a table type for the error output parameter of the exit procedure.
The error type file sample.odata:error.hdbtabletype should look like the following example:
"sample.odata::error" AS TABLE (
"HTTP_STATUS_CODE" INTEGER,
"ERROR_MESSAGE" NVARCHAR(100),
"DETAIL" NVARCHAR(100)
)
procedure "sample.odata::beforeupdate"
(IN new "sample.odata::table", IN old "sample.odata::table", OUT error
"sample.odata::error")
language sqlscript
sql security invoker as
idnew INT;
idold INT;
begin
select ID into idnew from :new;
select ID into idold from :old;
if :idnew <= :idold then
error = select 400 as http_status_code,
'invalid ID' error_message,
'the new value must be larger than the previous' detail from dummy;
end if;
end;
service {
"sample.odata::table"
update events (before "sample.odata::beforeupdate");
}
Register an SQL script as a modification exit for an OData create operation for an entity.
Prerequisites
● A table to expose for the OData create operation, for example, sample.odata::table.hdbtable
● An error type, for example, sample.odata::error.hdbstructure
Note
Context
SAP HANA XS enables you to register custom code that handles the OData write operation for non-trivial
cases. In this tutorial, you see how to register a modification exit for an OData CREATE operation for an entity.
The procedure you register verifies the data to insert, refuses the insertion request if the specified ID is less
than 1000, and in the event of an error, inserts a row with error information into the output table.
Procedure
The table you create in this step is used in the procedure you create later in the tutorial. The table to expose
is defined in sample.odata:table.hdbtable, which should look like the following example:
table.schemaName = "ODATASAMPLES";
table.columns = [{name = "ID"; sqlType = INTEGER; nullable = false;}];
table.primaryKey.pkcolumns = ["ID"];
2. Create a table type for the error output parameter of the exit procedure.
The error type you create in this step is used in the procedure you create later in the tutorial. The error type
file sample.odata:error.hdbstructure should look like the following example:
table.schemaName = "ODATASAMPLES";
table.columns = [
{name = "HTTP_STATUS_CODE"; sqlType = INTEGER;},
{name = "ERROR_MESSAGE"; sqlType = NVARCHAR; length = 100;},
{name = "DETAIL"; sqlType = NVARCHAR; length = 100;}
];
procedure "ODATA_TEST"."sample.odata::createmethod"
(IN new "sample.odata::table", OUT error "sample.odata::error")
language sqlscript
sql security invoker as
id INT;
begin
select ID into id from :new;
if :id < 1000 then
error = select 400 as http_status_code,
'invalid ID' error_message,
'value must be >= 1000' detail from dummy;
else
insert into "sample.odata::table" values (:id);
end if;
end;
service {
"sample.odata::table"
create using "sample.odata::createmethod";
}
You can use server-side JavaScript to write a script which you register as a modification exit for an OData
update operation for an entity.
Prerequisites
● A table to expose for the OData create operation, for example, sample.odata::table.hdbtable
Context
SAP HANA XS enables you to register custom code that handles the OData write operation. In this tutorial, you
see how to use server-side JavaScript (XSJS) to write a script which you register as a modification exit for
OData UPDATE operations for an entity. The script you register verifies the data to insert and throws an error
if the check fails.
To register an XS JavaScript function as an OData modification exit, perform the following steps:
Procedure
1. Create a table definition file, for example, using the .hdbtable syntax.
The table you create in this step is used in the XS JavaScript function you create later in the tutorial. The
table to expose is defined in sample.odata:table.hdbtable, which should look like the following
example:
2. Create the XS JavaScript function that you want to register for OData modification events.
Note
The XS JavaScript function that you want to register for OData modification events must be created in
the form of an XSJS library, for example, with the file extension .xsjslib; the XS JavaScript function
cannot be an .xsjs file.
The function you register has one parameter, which can have the properties listed in the following table:
beforeTableName String The name of a temporary table with the single entry before
the operation (UPDATE and DELETE events only)
afterTableName String The name of a temporary table with the single entry after
the operation (CREATE and UPDATE events only)
The XS JavaScript function jsexit.xsjslib could look like the following example:
function update_instead(param) {
$.trace.debug("entered function");
let before = param.beforeTableName;
let after = param.afterTableName;
let pStmt = param.connection.prepareStatement('select * from "' + after + '"');
// ...
if (ok) {
// update
} else {
throw "an error occurred; check access privileges";
}
}
3. Bind the XS JavaScript function to the entity specified in the OData service definition.
service {
"sample.odata::table" as "Table" update using
"sap.test:jsexit.xsjslib::update_instead";
}
The OData Service Definition Language (OSDL) provides a set of keywords that enable you to set up an OData
service-definition file that specifies what data to expose, in what way, and to whom.
The following list shows the syntax of the OData Service Definition Language (OSDL) in an EBNF-like format;
conditions that apply for usage are listed after the table.
Note
Support for OData annotations is currently not available in SAP HANA XS Advanced.
Conditions
1. If the namespace is not specified, the schema namespace in the EDMX metadata document will be the repository package of the service definition file concatenated with the repository object name. For example, if the repository design-time name of the .xsodata file is sap.hana.xs.doc/hello.xsodata, the namespace will implicitly be sap.hana.xs.doc.hello.
2. keyslist must not be specified for objects of type 'table'; it must only be applied to objects referring to a view type. keygenerated, in turn, can be applied to table objects.
3. If the entityset is not specified in an entity, the EntitySet for this object is named after the repository
object name or the catalogobjectname. For example, if object is "sap.hana.xs.doc/odata_docu",
then the entitysetname is implicitly set to odata_docu, which then can also be referenced in
associations.
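To illustrate conditions 2 and 3, the following hedged sketch (the package, object, and entity-set names are assumptions) shows a view-based entity with an explicit keys list and a table-based entity with a generated key:

```
service {
    "sample.odata::someView" as "ViewEntity" keys ("ID");
    "sample.odata::table" as "TableEntity" keys generate local "GenID";
}
```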
Related Information
During the activation of the OData service definition, the SAP HANA SQL types are mapped to the required
OData EDM types according to the rules specified in a mapping table.
The following mapping table lists how SAP HANA SQL types are mapped to OData EDM types during the
activation of an OData service definition.
Note
The OData implementation in SAP HANA XS supports only those SQL types listed in the following table.
SAP HANA SQL Type    OData EDM Type
Time                 Edm.Time
Date                 Edm.DateTime
SecondDate           Edm.DateTime
LongDate             Edm.DateTime
Timestamp            Edm.DateTime
TinyInt              Edm.Byte
SmallInt             Edm.Int16
Integer              Edm.Int32
BigInt               Edm.Int64
SmallDecimal         Edm.Decimal
Decimal              Edm.Decimal
Real                 Edm.Single
Float                Edm.Single
Double               Edm.Double
Varchar              Edm.String
NVarchar             Edm.String
Char                 Edm.String
NChar                Edm.String
Binary               Edm.Binary
Varbinary            Edm.Binary
The following example shows how the SAP HANA SQL types of columns in a table (for example, Integer and NVarchar) are mapped to the OData EDM types in the properties of an entity type.
The following example illustrates how the SAP HANA SQL types illustrated in the previous example are mapped
to EDM types:
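As a hedged sketch of this mapping (the entity and property names are illustrative assumptions), a table with an Integer column ID and an NVarchar(30) column NAME would surface in the EDMX metadata document as follows:

```
<EntityType Name="tableType">
    <Key>
        <PropertyRef Name="ID"/>
    </Key>
    <Property Name="ID" Type="Edm.Int32" Nullable="false"/>
    <Property Name="NAME" Type="Edm.String" MaxLength="30"/>
</EntityType>
```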
Enabling access to data by means of OData can create some security-related issues that you need to consider
and address, for example, the data you want to expose, who can start the OData service, and so on.
If you want to use OData to expose data to users and clients in SAP HANA Extended Application Services (SAP HANA XS), you need to bear in mind the security considerations described in the following list:
● Data Access
Restrict user select authorization for tables/views exposed by the OData service
● OData Service
Restrict authorization rights to start the OData service
● OData Statistical content
Restrict access to the URL/Path used to expose OData content in the Web browser
The OData standard allows the collection of multiple individual HTTP requests into one single batched HTTP
request.
Clients using a defined OData service to consume exposed data can collect multiple, individual HTTP requests,
for example, retrieve, create, update and delete (GET, POST, PUT, DELETE), in a single “batch” and send the
batched request to the OData service as a single HTTP request. You can compile the batch request manually
(by creating the individual requests in the batch document by hand) or automatically, for example, with an
AJAX call that adds requests to a queue and loops through the queue to build the batch request. In both
cases, the OData standard specifies the syntax required for the header and body elements of a valid batch
request document.
SAP HANA XS supports the OData $batch feature out-of-the-box; there is nothing to configure in SAP HANA
XS to use $batch to perform operations in SAP HANA using an OData service. To understand how the $batch
feature works, you need to look at the following phases of the operation:
● Batch Request
● Batch Response
A batch request is split into two parts: the request header and the request body. The body of a batch request
consists of a list of operations in a specific order where each operation either retrieves data (for example, using
the HTTP GET command) or requests a change. A change request involves one or more insert, update or delete
operations using the POST, PUT, or DELETE commands.
Note
A change request must not contain either a retrieve request or any nested change requests.
The batch request must contain a Content-Type header specifying the value “multipart/mixed” and a boundary ID boundary=batch_#; the batch boundary ID is then used to indicate the start of each batch part. Within the batch request, a changeset is defined by another boundary ID (for example, boundary=changeset_123), which is then used to indicate the start and end of the change requests. The batch request must be closed, too.
Note
In the following example of a simple OData batch request, some content has been removed to emphasize the structure and layout.
--changeset_a4e3-a738                  // Changeset 2 start
Content-Type: application/http
Content-Transfer-Encoding: binary

[POST...]
--changeset_a4e3-a738--                // Changeset (all) end
--batch_8219-6895                      // Batch part 2 start
Content-Type: application/http
Content-Transfer-Encoding: binary

[GET...]
--batch_8219-6895--                    // Batch (all) end
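A batch body with this structure can also be assembled programmatically. The following JavaScript sketch is not part of any SAP API; the boundary IDs and request payloads are illustrative assumptions:

```javascript
// Sketch: assembling the multipart/mixed body of an OData $batch request.
// Boundary IDs and payloads are illustrative assumptions.
function buildBatchBody(batchBoundary, changesetBoundary, changeRequests, readRequests) {
    const lines = [];
    // One changeset part wraps all change requests (POST, PUT, DELETE)
    lines.push('--' + batchBoundary);
    lines.push('Content-Type: multipart/mixed; boundary=' + changesetBoundary);
    lines.push('');
    changeRequests.forEach(function (req) {
        lines.push('--' + changesetBoundary);
        lines.push('Content-Type: application/http');
        lines.push('Content-Transfer-Encoding: binary');
        lines.push('');
        lines.push(req);
        lines.push('');
    });
    lines.push('--' + changesetBoundary + '--'); // close the changeset
    // Each retrieve request (GET) is its own batch part
    readRequests.forEach(function (req) {
        lines.push('--' + batchBoundary);
        lines.push('Content-Type: application/http');
        lines.push('Content-Transfer-Encoding: binary');
        lines.push('');
        lines.push(req);
        lines.push('');
    });
    lines.push('--' + batchBoundary + '--'); // close the batch
    return lines.join('\r\n');
}

const body = buildBatchBody(
    'batch_8219-6895', 'changeset_a4e3-a738',
    ['POST Table HTTP/1.1\r\nContent-Type: application/json\r\n\r\n{"ID": 1}'],
    ['GET Table HTTP/1.1']
);
```

Note how the changeset closes with its own terminating boundary before the batch itself is closed, mirroring the structure shown above.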
● The response to a retrieve request is always formatted in the same way regardless of whether it is sent
individually or as part of a batch.
● The body of the collected response to a set of change-requests is one of the following:
○ A response for all the successfully processed change requests within the change set, in the correct
order and formatted exactly as it would have appeared outside of a batch
○ A single response indicating the failure of the entire change set
The following example shows the form and syntax of the OData batch response to the request illustrated above.
If you are developing a UI client using SAPUI5, you can make use of the ODataModel tools to ensure that the
data requests generated by the various UI controls bound to an OData service are collected and sent in
batches. The SAPUI5 ODataModel toolset includes a large selection of tools you can use to configure the use of
the OData batch feature, for example:
● setUseBatch
Enable or disable batch processing for all requests (read and change)
● addBatchChangeOperations
Appends the change operations to the end of the batch stack, which is sent with the submitBatch
function
● addBatchReadOperations
Appends the read operations to the end of the batch stack, which is sent with the submitBatch function
● submitBatch
Submits the collected changes in the batch which were collected via addBatchReadOperations or
addBatchChangeOperations.
Related Information
In SAP HANA Extended Application Services (SAP HANA XS), the persistence model (for example, tables, views, and stored procedures) is mapped to the consumption model that is exposed to clients: the applications you write to extract data from the SAP HANA database.
You can map the persistence and consumption models with XML for Analysis (XMLA). With XMLA, you write multidimensional expressions (MDX) queries wrapped in an XMLA document. An XML for Analysis (XMLA) application running in SAP HANA Extended Application Services (SAP HANA XS) is used to provide the consumption model for client applications exchanging MDX queries (wrapped in XMLA documents) with the SAP HANA database.
XMLA uses Web-based services to enable platform-independent access to XMLA-compliant data sources for
Online Analytical Processing (OLAP). XMLA enables the exchange of analytical data between a client
application and a multi-dimensional data provider working over the Web, using a Simple Object Access
Protocol (SOAP)-based XML communication application-programming interface (API).
Applications running in SAP HANA XS enable you to control the flow of data between the presentational layer,
for example, in the Web browser, and the data-processing layer in SAP HANA itself, where the calculations are
performed, for example in SQL or SqlScript. If you develop and deploy an XMLA service running in SAP HANA
XS, you can take advantage of the access to SAP HANA that SAP HANA XS provides to improve end-to-end
performance.
The XS advanced application xmla must not be installed in the SAP space.
If you are using multiple tenant databases, you must install the xmla application in a space (or spaces) other
than the default space SAP, for example, DEV or PROD. In addition, the target space must already be mapped to
a tenant database before you deploy the xmla application. You can map an XS advanced space to a tenant
database using the SAP HANA Service Broker Configuration tool that is included in the XS Advanced
Administration tools.
Tip
The XMLA interface for XS advanced is available either on the SAP HANA media or for download from SAP Service Marketplace for users with the required S-User ID.
Related Information
XML for Analysis (XMLA) uses Web-based services to enable platform-independent access to XMLA-compliant
data sources for Online Analytical Processing (OLAP).
XMLA enables the exchange of analytical data between a client application and a multi-dimensional data
provider working over the Web, using a Simple Object Access Protocol (SOAP)-based XML communication
application-programming interface (API).
Implementing XMLA in SAP HANA enables third-party reporting tools that are connected to the SAP HANA
database to communicate directly with the MDX interface. The XMLA API provides universal data access to a
particular source over the Internet, without the client having to set up a special component. XML for Analysis is
optimized for the Internet in the following ways:
● Query performance
Time spent on queries to the server is kept to a minimum
● Query type
Client queries are stateless by default; after the client has received the requested data, the client is
disconnected from the Web server.
In this way, tolerance to errors and the scalability of a source (the maximum permitted number of users) are maximized.
The specification defined in XML for Analysis Version 1.1 from Microsoft forms the basis for the implementation
of XML for Analysis in SAP HANA.
The following list describes the methods that determine the specification for a stateless data request and
provides a brief explanation of the method's scope:
● Discover
Use this method to query metadata and master data; the result of the discover method is a rowset. You
can specify options, for example, to define the query type, any data-filtering restrictions, and any required
XMLA properties for data formatting.
● Execute
Use this method to execute MDX commands and receive the corresponding result set; the result of the
Execute command could be a multi-dimensional dataset or a tabular rowset. You can set options to
specify any required XMLA properties, for example, to define the format of the returned result set or any
local properties to use to determine how to format the returned data.
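The Execute method can be sketched in JavaScript as follows; the MDX statement and catalog name are illustrative assumptions, and the envelope elements follow the XML for Analysis 1.1 specification rather than any SAP-specific API:

```javascript
// Sketch: wrapping an MDX statement in the SOAP envelope used by the
// XMLA Execute method. Statement and catalog are assumptions.
function buildExecuteEnvelope(mdx, catalog) {
    return '<SOAP-ENV:Envelope ' +
        'xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">' +
        '<SOAP-ENV:Body>' +
        '<Execute xmlns="urn:schemas-microsoft-com:xml-analysis">' +
        '<Command><Statement>' + mdx + '</Statement></Command>' +
        '<Properties><PropertyList>' +
        '<Catalog>' + catalog + '</Catalog>' +
        '<Format>Multidimensional</Format>' +
        '</PropertyList></Properties>' +
        '</Execute>' +
        '</SOAP-ENV:Body></SOAP-ENV:Envelope>';
}

const envelope = buildExecuteEnvelope(
    'SELECT FROM [SALES_DATA] WHERE [Measures].[SALES]', 'HELLO_XMLA');
```

A client would POST this envelope to the XMLA service endpoint and receive the result set in the requested format.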
Related Information
You can use the XML for Analysis (XMLA) interface included in SAP HANA Extended Application Services (SAP
HANA XS) to provide a service that enables XMLA-capable clients to query multidimensional cubes in SAP
HANA.
Prerequisites
● A multidimensional data cube is available in SAP HANA, for example, in the form of a calculation view, an
analytic view, or an attribute view
● An XMLA client is available
Procedure
{
"exposed" : true
}
4. Create an XMLA service-definition file and place it in your root XMLA package helloxmla.
a. Select your root XMLA package, for example, helloxmla, and from the context menu choose New File.
Note
The file containing the XMLA service definition must be placed in the root package of the XMLA
application for which the service is intended.
b. Enter a file name with the file extension .xsxmla, for example, hello.xsxmla, and choose Create.
c. Enter the following content in the hello.xsxmla XMLA service-definition file:
service {*}
http://<hana.server.name>:80<HANA_instance_number>/helloxmla/hello.xsxmla
Note
You have successfully completed this step if you see a 404 Error page; the page indicates that the SAP
HANA XS Web server has responded.
7. Connect your XMLA client application to the inbuilt XMLA interface in SAP HANA XS.
To connect an XMLA-capable client (for example, Microsoft Excel) with the XMLA interface in SAP HANA XS, you need a product (for example, a plug-in for Microsoft Excel) that can transfer XMLA messages in a form that the SAP HANA XS XMLA interface can understand.
8. Configure your client to send an XMLA query to SAP HANA.
Prerequisites
If you already have a data model containing tables or views that can be exposed, you do not need to create
additional elements. You can use the tables and views that are already available.
Context
An XMLA service exposes data stored in database tables for analysis and display by client applications.
However, first of all, you need to ensure that the tables and views to expose as an XMLA service actually exist
and are accessible.
To define the data to expose using an XMLA service, you must perform at least the following tasks:
Procedure
Related Information
The XMLA service definition is a file you use to specify which data is exposed as XMLA/MDX collections for
analysis and display by client applications.
Prerequisites
Context
An XMLA service for SAP HANA XS is defined in a text file with the file extension .xsxmla, for example,
XMLASrvDef.xsxmla. The file resides in the package hierarchy of the XMLA application and must contain only
the entry {*}, which generates a completely operational XMLA service.
Procedure
Note
The file containing the XMLA service definition must be placed in the root package of the XMLA
application for which the service is intended.
b. Enter a file name with the file extension .xsxmla, for example, XMLASrvDef.xsxmla, and choose
Create.
3. Create the XMLA service definition.
The XMLA service definition is a configuration file that you use to specify which data is to be exposed as an
XMLA collection.
The following code is an example of a valid XMLA service definition, which exposes all authorized data to
XMLA requests:
service{*}
Note
Currently, the XMLA service-definition file enables you to specify only that all authorized data is exposed to XMLA requests.
Related Information
The XMLA service definition is a file you use to specify which data is exposed as XMLA collections. Exposed
data is available for analysis and display by client applications, for example, a browser that uses functions
provided either by the XMLA service running in SAP HANA XS or by an XMLA client library running on the client
system.
To expose information via XMLA to applications using SAP HANA Extended Application Services (SAP HANA
XS), you define database views that provide the data with the required granularity and you use the XMLA
service definition to control access to the exposed data.
Note
SAP HANA XS supports XMLA version 1.1, which you can use to send MDX queries.
An XMLA service for SAP HANA XS is defined in a text file with the file suffix .xsxmla, for example,
XMLASrvDef.xsxmla. The file must contain only the entry {*}, which would generate a completely
operational XMLA service.
Currently, the XMLA service-definition file enables you to specify only that all authorized data is exposed to
XMLA requests, as illustrated in the following example:
service {*}
Enabling access to data by means of XMLA opens up some security considerations that you need to address,
for example, the data you want to expose, who can start the XMLA service, and so on.
If you want to use XMLA to expose data to users and clients in SAP HANA XS, you need to bear in mind the
security considerations described in the following list:
● Data Access
Restrict user select authorization for data exposed by the XMLA service
Multidimensional Expressions (MDX) is a language for querying multidimensional data that is stored in OLAP
cubes.
MDX uses a multidimensional data model to enable navigation in multiple dimensions, levels, and up and down
a hierarchy. With MDX, you can access pre-computed aggregates at specified positions (levels or members) in
a hierarchy.
Note
MDX is an open standard. However, SAP has developed extensions to MDX to enable faster and more
efficient access to multidimensional data; for example, to serve specific SAP HANA application
requirements and to optimize the result set for SAP HANA clients.
MDX is implicitly a hierarchy-based paradigm. All members of all dimensions must belong to a hierarchy. Even
if you do not explicitly create hierarchies in your SAP HANA data model, the SAP HANA modeler implicitly
generates default hierarchies for each dimension. All identifiers that are used to uniquely identify hierarchies,
levels and members in MDX statements (and metadata requests) embed the hierarchy name within the
identifier.
In SAP HANA, the standard use of MDX is to access SAP HANA models (for example, analytical and attribute
views) that have been designed, validated and activated in the modeler in the SAP HANA studio. The studio
provides a graphical design environment that enables detailed control over all aspects of the model and its
language-context-sensitive runtime representation to users.
MDX in SAP HANA uses a runtime cube model, which usually consists of an analytical (or calculation) view that
represents data in which dimensions are modeled as attribute views. You can use the analytical view to specify
whether a given attribute is intended for display purposes only or for aggregation. The attributes of attribute
views are linked to private attributes in an analytic view in order to connect the entities. One benefit of MDX in
SAP HANA is the native support of hierarchies defined for attribute views.
Note
MDX in SAP HANA includes native support of hierarchies defined for attribute views. SAP HANA supports
level-based and parent-child hierarchies and both types of hierarchies are accessible with MDX.
SAP HANA supports the use of variables in MDX queries; the variables are an SAP-specific enhancement to
standard MDX syntax. You can specify values for all mandatory variables that are defined in SAP HANA studio
to various modeling entities. The following example illustrates how to declare SAP HANA variables and their
values:
MDX
Select
From [SALES_DATA_VAR]
Where [Measures].[M2_1_M3_CONV]
SAP VARIABLES [VAR_VAT] including 10,
[VAR_K2] including 112
Aggregate
Ancestor
Ancestors
Ascendants
Avg
BottomCount
Children
ClosingPeriod
Count
Cousin
Crossjoin
CurrentMember
DefaultMember
Descendants
Dimension
Dimensions
Distinct
DistinctCount
DrillDownLevel
DrillDownLevelBottom
DrillDownLevelTop
DrillDownMember
DrillDownMemberBottom
DrillDownMemberTop
DrillUpLevel
DrillUpMember
Except
Filter
FirstChild
FirstSibling
Generate
Head
Hierarchize
Hierarchy
Instr
Intersect
For more information about these functions, see Microsoft's Multidimensional Expressions (MDX) Reference.
SAP HANA supports several extensions to the MDX language, including additional predefined functions and
support for variables.
The object Member includes a property called Sibling_Ordinal, which is equal to the 0-based position of the member among its siblings.
Example
WITH
MEMBER [Measures].[Termination Rate] AS
[Measures].[NET_SALES] / [Measures].[BILLED_QUANTITY]
SELECT
{
[Measures].[NET_SALES],
[Measures].[BILLED_QUANTITY],
[Measures].[Termination Rate]
} ON COLUMNS,
Descendants
(
[DISTRIBUTION_CHANNEL].[DISTRIBUTION_CHANNEL].[All].[(all)],
1,
SELF_AND_BEFORE
)
DIMENSION PROPERTIES SIBLING_ORDINAL ON ROWS
FROM SALES_DATA
SAP HANA includes the MembersAscendantsDescendants function that enables you to get, for example, all
ascendants and descendants of a specific member.
This function improves on the standard MDX functions Ascendants and Descendants.
<flag> Indicates which related members to return, and can be one of the following:
● MEMBERS_AND_ASCENDANTS_AND_DESCENDANTS
● MEMBERS_AND_ASCENDANTS
● MEMBERS_AND_DESCENDANTS
● ASCENDANTS_AND_DESCENDANTS
● ONLY_ASCENDANTS
● ONLY_DESCENDANTS
Example
SELECT
{ [Measures].[SALES] }
ON COLUMNS,
NON EMPTY
{ Hierarchize( MembersAscendantsDescendants([SALES_DATA_TIME].[TimeHier].
[QUARTER].[3]:[SALES_DATA_TIME].[TimeHier].[QUARTER].[4],
MEMBERS_AND_ASCENDANTS_AND_DESCENDANTS )) }
ON ROWS
FROM [SALES_DATA]
SELECT
{ [Measures].[SALES] }
ON COLUMNS,
NON EMPTY
{ Hierarchize( MembersAscendantsDescendants([SALES_DATA_TIME].[TimeHier].
[QUARTER].[3]:[SALES_DATA_TIME].[TimeHier].[QUARTER].[4], ONLY_ASCENDANTS )) }
ON ROWS
FROM [SALES_DATA]
An MDX SELECT statement in SAP HANA enables you to send values for variables defined within modeling
views.
Analytic and calculation views can contain variables that can be bound to specific attributes. When calling the
view, you can send values for those variables. These variables can be used, for example, to filter the results.
SAP HANA supports an extension to MDX whereby you can pass values for variables defined in views by adding an SAP VARIABLES clause to your SELECT statement. Here is the syntax for a SELECT statement:
<select_statement>:
[WITH <formula_specification>]
SELECT [<axis_specification>[,<axis_specification>...]]
FROM <cube_specification>
[WHERE <slicer_specification>]
[SAP VARIABLES <variable_value_definition>[,<variable_value_definition>...]]
SAP HANA includes the following set of tables that contain information about the variables defined for views:
● BIMC_VARIABLE
● BIMC_VARIABLE_ASSIGNMENT
● BIMC_VARIABLE_VALUE
The tables enable, for example, an application to retrieve the variables defined for a view and create a user
interface so the user can enter values.
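As a hedged sketch of such an application (the column names queried from BIMC_VARIABLE are assumptions, and the mock connection merely stands in for an XSJS database connection), a function that retrieves the variables defined for a view could look like this:

```javascript
// Sketch: query the BIMC_VARIABLE table for the variables defined for a
// view, so a UI can prompt the user for values. Column names are assumptions.
function getViewVariables(connection, cubeName) {
    return connection.executeQuery(
        'SELECT "VARIABLE_NAME", "MANDATORY" FROM "_SYS_BI"."BIMC_VARIABLE" ' +
        'WHERE "CUBE_NAME" = ?', cubeName);
}

// Mock connection standing in for a real XSJS database connection
const mockConnection = {
    executeQuery: function (sql, param) {
        return { sql: sql, param: param };
    }
};
const result = getViewVariables(mockConnection, 'SALES_DATA_VAR');
```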
Example
The following statement specifies a single value for variables VAR_VAT, VAR_K2, and VAR_TARGET_CURRENCY.
SELECT
FROM [SALES_DATA_VAR]
WHERE [Measures].[M2_1_M3_CONV]
SAP VARIABLES [VAR_VAT] including 10,
[VAR_K2] including 112,
[VAR_TARGET_CURRENCY] including 'EUR'
The SAP HANA REST Application Programming Interface (REST API) is based on and extends the Orion server
and client APIs.
SAP HANA REST API supports the Orion protocol 1.0 which allows development tools to access the SAP HANA
Repository in a convenient and standards-compliant way. This not only makes access to the Repository easier
for SAP HANA tools, but it also enables the use of Orion-based external tools with the SAP HANA Repository.
For SAP tools, the Orion server protocol has been extended with the following SAP HANA-specific features:
● File: Enables access to services that enable you to browse and manipulate files and directories via HTTP
● Workspace: Enables you to create and manipulate workspaces and projects via HTTP
● Metadata: Enables access to services that support search and auto-completion scenarios, for example, to retrieve metadata from runtime, design-time, and other metadata locations
● Change Tracking: Enables the use of specific lifecycle-management features included with the SAP HANA Repository via HTTP
● Info: Enables access to information about the current version of the SAP HANA REST API
The SAP HANA REST API uses an additional parameter called SapBackPack to send request parameters that
are specific to SAP HANA; the SapBackPack parameter is added to the HTTP header. The value of the
SapBackPack parameter is a JSON object with the attributes and values of the additional SAP-specific
parameters. For example, when you create or update the content of a design-time artifact, you can use the SapBackPack value {"Activate":true} to request that the new version of the file is immediately activated in the SAP HANA Repository. If you only want to create an inactive version of a design-time artifact, you can use the “workspace” attribute to specify the name of the Repository workspace where the inactive version is to be stored.
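The SapBackPack header value is simply a JSON string, which can be sketched as follows; the helper function and the workspace name are illustrative assumptions, not part of the REST API itself:

```javascript
// Sketch: building the SapBackPack HTTP header that carries SAP-specific
// request parameters as a JSON string. The workspace name is an assumption.
function sapBackPack(options) {
    return { SapBackPack: JSON.stringify(options) };
}

// Save and immediately activate the new version of a design-time artifact
const activateHeader = sapBackPack({ Activate: true });

// Create only an inactive version in a named workspace
const inactiveHeader = sapBackPack({ workspace: 'MyWorkspace' });
```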
Related Information
The SAP HANA REST API includes an Info API that can be used to display information about the current version
of the REST API.
GET /sap/hana/xs/dt/base/info
Orion-Version: 1.0
The information displayed by the Info API includes a description of the current version of the delivery unit and
the number of commands (API entry points) that are currently supported by the REST API.
HTTP/1.1 200 OK
{
"DeliveryUnit":{
"name":"HANA_DT_BASE",
"version":"1",
"responsible":"x###007,x###077",
"vendor":"sap.com",
"version_sp":"0",
"version_patch":"8",
"ppmsID":"",
"caption":"",
"lastUpdate":1386163749544,
"sp_PPMS_ID":"",
"ach":""
},
"Commands":[
"/sap/hana/xs/dt/base/file",
"/sap/hana/xs/dt/base/workspace",
"/sap/hana/xs/dt/base/xfer/import",
"/sap/hana/xs/dt/base/metadata",
"/sap/hana/xs/dt/base/change",
"/sap/hana/xs/dt/base/info"
]
}
Related Information
The SAP HANA REST API includes a File API which uses the basic HTTP methods GET, PUT, and POST to send
requests. JSON is used as the default representation format.
● Actions on files [page 346]: Get, set, or change file content and metadata
● Actions on directories [page 347]: Get and change directory metadata and list directory contents
● Create files and directories [page 348]: Create files and directories with or without content
● Copy and move files [page 348]: Copy, move, or delete files and directories
● Mass transfer actions [page 349]: Get multiple files (or file metadata) from a list, repository package, or a workspace
● Change tracking [page 350]: Selectively activate the latest approved versions of repository objects
Actions on Files
GET /sap/hana/xs/dt/base/file/MyProj/myfile.txt
Orion-Version: 1.0
If-Match: "358768768767"
SapBackPack: "{'Workspace': 'ABC', 'Version': 12}"
The REST File API enables you to retrieve the content of a specific file, for example, myfile.txt.
Note
In the request illustrated in the example above, the parameters Version, If-Match, and SapBackPack are optional.
HTTP/1.1 200 OK
Content-Type: text/plain
Content-Length: 22
This is the content
The REST File API enables you to retrieve the metadata associated with a specific file, for example,
myfile.txt.
Note
In the request illustrated in the example below, the parameters Version, If-Match, and SapBackPack are optional.
GET /sap/hana/xs/dt/base/file/MyProj/myfile.txt?parts=meta
Orion-Version: 1.0
SapBackPack: "{'History': 'false', 'Version': 12}"
If-Match: "35987989879"
The response to the retrieval request for metadata is displayed in the following example:
{
"Name": "myfile.txt",
"Location": "/sap/hana/xs/dt/base/file/MyProj/myfile.txt",
"RunLocation": "/MyProj/myfile.txt",
Actions on Directories
You can use the REST File API to retrieve and change directory (repository package) metadata as well as list the
contents of a directory. The following example shows how to list the contents of a single directory, for example,
myfolder.
Tip
To list all files from a directory recursively, use depth=infinity or -1. For security reasons, the depth is limited to 1000.
GET /sap/hana/xs/dt/base/file/MyProj/myfolder?depth=1
The following example shows the response to the directory listing request:
HTTP/1.1 200 OK
Content-Type: application/json
Content-Length: 132
{
"Name": "myfolder",
"Location": "/sap/hana/xs/dt/base/file/MyProj/myfolder",
"ContentLocation": "/MyProj/myfolder",
"LocalTimeStamp": 1234345009837,
"Directory": true,
"Attributes": {
"ReadOnly": false,
"Executable": false
},
"Children": [
{
"Name": "myfile.txt",
"Location": "/sap/hana/xs/dt/base/file/MyProj/myfolder/myfile.txt",
"RunLocation": "/MyProj/myfolder/myfile.txt",
"Directory": false
} ] }
You can use the REST File API to create files and directories (repository packages) with or without content. The
following example shows how to create a new directory, for example, myfolder.
Note
If a parent directory (in which the new directory is created) is already assigned to a delivery unit, the
created directory will be assigned automatically to the same delivery unit.
POST /sap/hana/xs/dt/base/file/MyProj/
Content-Type: application/json
X-CSRF-Token: "65ABA3082325A3408FBE71C87929102B"
Slug: myfolder
{
"Name": "myfolder",
"Directory": "true"
}
The following example shows the response to the directory creation request:
HTTP/1.1 201 Created
{
"Name": "myfolder",
"Location": "/sap/hana/xs/dt/base/file/MyProj/myfolder",
"ContentLocation": "/MyProj/myfolder",
"ETag": "35fd43td3",
"LocalTimeStamp": 1234345009837,
"Directory": true,
"Attributes": {
"ReadOnly": false,
"Executable": false
}
}
You can use the REST File API to copy, move, or delete files and directories (repository packages). You can also
use the File API to delete the workspace that contains files and directories used for development work. The
following example shows how to delete a directory, for example, myfolder.
DELETE /sap/hana/xs/dt/base/file/MyProj/myfile.txt
Orion-Version = 1.0
X-CSRF-Token: "65ABA3082325A3408FBE71C87929102B"
If-Match: "35" (optional)
Note
To delete a workspace, you need to include the parameters ProcessWorkspace=true and Workspace in the SapBackPack parameter.
DELETE /sap/hana/xs/dt/base/file/MyProj/myfile.txt
HTTP/1.1 204 No Content
Mass transfer with the REST File API enables you to apply GET and PUT operations to multiple files in a single
HTTP request.
Note
The mass-transfer feature is not a part of the Orion specification; it was developed to optimize the
performance of GET and PUT requests when dealing with a large number of files.
There are different ways of specifying the file paths. One way is to point the request's URL to the root of the file
repository, as illustrated in the request example below. In this case, you must specify the complete path from
the root of the repository for each file. Another possibility is to point the request's URL to a specified sub-
package in the Repository, which is then considered to be the root package for the files to be retrieved in the
request. To request a file's meta-data, use the parameter parts=meta; the response contains a list of file
metadata formatted as a JSON string. If the request does not contain the parameter parts=meta, a multipart
response is returned.
GET /sap/hana/xs/dt/base/file?parts=meta
Orion-Version = 1.0
SapBackPack: '{"MassTransfer":true, "MassTransferData": [
{"Pkg":"MyProj/myfolder","Name":"destination1.txt","Dir":false}, ...]}'
HTTP/1.1 200 OK
Content-Type: application/json
[
{
"Name": "destination1.txt",
"Location": "/sap/hana/xs/dt/base/file/MyProj/myfolder/destination1.txt",
"RunLocation": "/MyProj/myfolder/destination1.txt",
"ETag": "351234567",
"LocalTimeStamp": 1234345009837,
"Directory": false,
"Attributes": { "ReadOnly": false, "Executable": true, "SapBackPack":
{"Activated": true} }
},
{
"Name": "destination2.txt",
"Location": "/sap/hana/xs/dt/base/file/MyProj/myfolder/destination2.txt",
"RunLocation": "/MyProj/myfolder/destination2.txt",
"ETag": "251237891",
"LocalTimeStamp": 1234345009837,
"Directory": false,
"Attributes": { "ReadOnly": false, "Executable": true, "SapBackPack":
{"Activated": true} }
}
]
Change Tracking
You can use the REST File API to perform change-tracking operations. Change tracking enables you to selectively activate the latest approved versions of objects.
Note
This feature of the REST File API assumes that the change-tracking feature is enabled in the SAP HANA
repository.
PUT /sap/hana/xs/dt/base/file/PATH?parts=meta
Orion-Version = 1.0
X-CSRF-Token: securityToken
SapBackPack: '{"MassTransfer":true, "Activate":true, "ChangeId": "ABC//11111",
"ChangeIdList": [{"Path": "PATH/file1.txt", "ChangeId": "ABC//12345"},
{"Path": "PATH/file2.txt", "ChangeId": "ABC//12345"}]}'
If an object (or a set of objects) is activated using the default change-tracking handling (for example, without
setting SapBackPack.ChangeTrackingMode or by setting SapBackPack.ChangeTrackingMode explicitly
to 0), a dynamic change list is created, and the file(s) are activated in the SAP HANA Repository using the
generated change list.
With explicit handling of change tracking, the user is allowed to activate files that are already assigned to a change list. Files can also be activated using an explicitly provided change list ID. In the change-tracking
request above, the files PATH/file1.txt and PATH/file2.txt are assigned to the change list ABC//12345.
All other files will be activated using the change list ABC//11111.
The response to the change-tracking request would look like the following example:
HTTP/1.1 200 OK
Content-Type: application/json
...
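The SapBackPack header in the change-tracking request above is simply a JSON string. As an illustration only (the helper function is not part of the API), the header value could be assembled in plain JavaScript like this, using the sample change-list IDs from the request above:

```javascript
// Build the SapBackPack header value for a mass-transfer activation.
// The default change list applies to all files not listed explicitly.
function buildSapBackPack(defaultChangeId, fileAssignments) {
  return JSON.stringify({
    MassTransfer: true,
    Activate: true,
    ChangeId: defaultChangeId,   // used for all files without an explicit entry
    ChangeIdList: fileAssignments // explicit per-file change-list assignments
  });
}

var header = buildSapBackPack("ABC//11111", [
  { Path: "PATH/file1.txt", ChangeId: "ABC//12345" },
  { Path: "PATH/file2.txt", ChangeId: "ABC//12345" }
]);
console.log(header);
```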
Related Information
The SAP HANA REST API includes a Change Tracking API which enables you to make use of specific lifecycle-
management features that are included with the SAP HANA Repository via HTTP.
Change Tracking is integrated with the SAP HANA XS Repository transport tool set; with change tracking
enabled, you can ensure that an export operation (to build a delivery unit) includes only the latest approved
versions of repository objects.
Note
To use the Change-Tracking API, change tracking must be enabled in the SAP HANA system whose repository
you are accessing.
To obtain the current status of change tracking in the system, for example, enabled or disabled, you can send a
GET request to the change entry point of the REST API.
GET /sap/hana/xs/dt/change
If the change-tracking feature is enabled in the target system, the resulting response is true. If change
tracking is disabled in the target system or not supported by the system, the response to the GET status
request is false.
HTTP/1.1 200 OK
{
"ChangeTrackingStatus": true
}
You can also use the REST Change-Tracking API to manage change lists and track changes made to repository
objects. For example, to display all change lists for which a specified user (“XYZ”) is a contributor:
GET /sap/hana/xs/dt/base/change
SapBackPack: '{"User": "XYZ", "Status": 1}'
HTTP/1.1 200 OK
[
{
"changeID":"ABC//1234",
"status":1,
"description":"",
"createdOn":"2014-04-09T13:26:58.868Z",
"createdBy":"XYZ"
},
{
"changeID":"ABC//1235",
"status":1,
"description":"",
"createdOn":"2014-04-09T14:08:53.024Z",
"createdBy":"XYZ"
}
]
To display the change status of a single file SomeFile.txt, use the following command:
GET /sap/hana/xs/dt/base/change/MyProj/SomeFile.txt
HTTP/1.1 200 OK
{
"ChangeId":"ABC//1234",
"User":"XYZ"
}
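On the client side, such a change-status response can be evaluated with a few lines of plain JavaScript; the response body below is the sample shown above:

```javascript
// Sample response body from the single-file change-status request above.
var responseBody = '{ "ChangeId": "ABC//1234", "User": "XYZ" }';

// Parse the body and read the change-list assignment of the file.
var status = JSON.parse(responseBody);
console.log("File is assigned to change list " + status.ChangeId +
            " by user " + status.User);
```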
Related Information
The SAP HANA REST API includes a Metadata API which provides services to support search and
autocompletion scenarios.
The REST-based Metadata API enables you to retrieve metadata from runtime and design-time objects as well
as other metadata locations. The typical location of runtime metadata is the SAP HANA database catalog. It is
possible to retrieve metadata for tables, views, procedures, functions, sequences, and schemas. The design-
time location for metadata is the SAP HANA Repository. Also accessible is the metadata location used by Core
Data Services (CDS).
The following services are provided with the Metadata API in the default location /sap/hana/xs/dt/base/
metadata; the services are called by setting the HTTP parameter Service-Name to the appropriate value:
● checkMetadataExistence
Checks for the existence of a provided set of entities and returns an array of entries which indicates if a
specified entity exists or not.
● checkMetadataExistence URI
Checks for the existence of a specific resource (entity) uniquely expressed as an HTTP universal resource
indicator (URI). checkMetadataExistence URI returns an array of entries which indicates if a given
entity exists or not.
● getMetadataSuggestion
Note
The following example shows how to use checkMetadataExistence URI to check for the existence of a
specific URI resource.
The response is a List<metadata> structure, where each entry contains the following elements:
● localName
● isExist
● List<exist>, where each entry contains:
○ namespace
○ separator
○ baseLocalName
○ baseType
○ type
○ mode
○ desc
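The services above are selected by setting the Service-Name HTTP parameter on the default entry point. As a sketch (the helper function and the use of a URL query parameter for Service-Name are illustrative assumptions), a request URL could be built like this:

```javascript
// Default entry point of the Metadata API, as given above.
var METADATA_BASE = "/sap/hana/xs/dt/base/metadata";

// Build a request URL that selects one of the metadata services
// via the Service-Name HTTP parameter (illustrative helper).
function metadataServiceUrl(serviceName) {
  return METADATA_BASE + "?Service-Name=" + encodeURIComponent(serviceName);
}

console.log(metadataServiceUrl("checkMetadataExistence"));
```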
Related Information
The SAP HANA REST Transfer API is used to import and export packages and files.
You can use the Transfer API to perform both import and export operations:
Importing Files
The following example shows how to use the Transfer API to start an operation to upload files to the SAP HANA
Repository. The request URL uses the POST command to perform the action and must indicate the target
location of the uploaded file when the upload operation is complete. The request must also indicate the total
size of the file the server should expect to receive during the upload operation.
POST /sap/hana/xs/dt/base/xfer/import/MyProj/SomeFile.jpg
Orion-Version: 1.0
X-CSRF-Token: "65ABA3082325A3408FBE71C87929102B"
Slug: MyFile.jpg
X-Xfer-Content-Length: 901024
X-Xfer-Options: raw
HTTP/1.1 200 OK
Location: /sap/hana/xs/dt/base/xfer/import/fks3kjd7hf
ContentLocation: /xfer/fks3kjd7hf
After initiating the transfer, uploads are performed as many times as required using PUT actions.
PUT /sap/hana/xs/dt/base/xfer/import/fks3kjd7hf
Orion-Version: 1.0
X-CSRF-Token: "65ABA3082325A3408FBE71C87929102B"
Content-Length: 32768
Content-Type: image/jpeg
Content-Range: bytes 0-32767/901024
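The Content-Range header advances by one chunk per PUT until the total announced in X-Xfer-Content-Length is reached. A small sketch (an illustrative helper, not part of the API) that computes the header values for the chunk size and file size used in the example above:

```javascript
// Compute the Content-Range header values for a chunked upload.
function contentRanges(totalSize, chunkSize) {
  var ranges = [];
  for (var start = 0; start < totalSize; start += chunkSize) {
    // Byte ranges are inclusive, hence the -1 on the end offset.
    var end = Math.min(start + chunkSize, totalSize) - 1;
    ranges.push("bytes " + start + "-" + end + "/" + totalSize);
  }
  return ranges;
}

var ranges = contentRanges(901024, 32768);
console.log(ranges[0]);                 // first chunk, as in the PUT above
console.log(ranges[ranges.length - 1]); // final, shorter chunk
```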
For each successful upload operation, you should see the following response:
Exporting Files
You can use the REST Transfer API to export (download) files and packages to a designated client in a zip
archive, as illustrated in the following example:
GET /sap/hana/xs/dt/base/xfer/export/MyProj/SomeFolder.zip
Orion-Version: 1.0
For each successful download operation, you should see the following response:
HTTP/1.1 200 OK
Content-Type: application/zip
File contents.
Related Information
The Workspace API enables you to create and manipulate Repository workspaces and projects via HTTP.
With the Workspace API, you can perform the following types of operation on workspaces and projects:
● Workspaces
List available workspaces, create or delete a workspace, and display or change workspace metadata
Workspace Actions
You can use the REST Workspace API to create a new workspace called “My Dev Workspace”, as illustrated in
the following example:
POST /sap/hana/xs/dt/base/workspace
EclipseWeb-Version: 1.0
X-CSRF-Token: "65ABA3082325A3408FBE71C87929102B"
Slug: My Dev Workspace
The response to the workspace-creation request should look like the following example:
Projects
You can also use the REST Workspace API to create a new SAP HANA XS project (“My Project”) and add it to an
existing workspace (“My Dev Workspace”), as illustrated in the following example. The Workspace API creates
the new project as an SAP HANA XS subpackage in the specified workspace package. The new project is
assigned to the list of projects in the specified workspace's metadata.
Note
POST /sap/hana/xs/dt/base/workspace/SAM_My_Dev_workspace_0
X-CSRF-Token: "65ABA3082325A3408FBE71C87929102B"
EclipseWeb-Version: 1.0
Slug: "My Project"
The response to the project-creation request should look like the following example:
{
"Id": "SAM_My_Dev_Workspace_0_My_Project_0",
"Location": "http://localhost:8080/sap/hana/xs/dt/base/file/sap/hana/xs/dt/base/
content/workspace/SAM_My_Dev_Workspace_0/My Project",
"ContentLocation": "http://localhost:8080/sap/hana/xs/dt/base/file/sap/
hana/xs/dt/base/content/workspace/SAM_My_Dev_Workspace_0/My Project",
Related Information
SAP HANA Extended Application Services (SAP HANA XS) provides applications and application developers
with access to the SAP HANA database using a consumption model that is exposed via HTTP.
In addition to providing application-specific consumption models, SAP HANA XS also hosts system services that
are part of the SAP HANA database, for example: search services and a built-in Web server that provides
access to static content stored in the SAP HANA repository.
The consumption model provided by SAP HANA XS focuses on server-side applications written in JavaScript.
Applications written in server-side JavaScript can make use of a powerful set of specially developed API
functions, for example, to enable access to the current request session or the database. This section describes
how to write server-side JavaScript code that enables you to expose data, for example, using a Web Browser or
any other HTTP client.
In SAP HANA Extended Application Services, the persistence model (for example, tables, views and stored
procedures) is mapped to the consumption model that is exposed via HTTP to clients - the applications you
write to extract data from SAP HANA.
You can map the persistence and consumption models in the following way:
● Application-specific code
Write code that runs in SAP HANA application services. Application-specific code (for example, server-side
JavaScript) is used in SAP HANA application services to provide the consumption model for client
applications.
Applications running in SAP HANA XS enable you to accurately control the flow of data between the
presentational layer, for example, in the Browser, and the data-processing layer in SAP HANA itself, where the
calculations are performed, for example in SQL or SQLScript. If you develop and deploy a server-side
JavaScript application running in SAP HANA XS, you can take advantage of the embedded access to SAP
HANA that SAP HANA XS provides; the embedded access greatly improves end-to-end performance.
SAP HANA application services (XS server) supports server-side application programming in JavaScript. The
server-side application you develop can use a collection of JavaScript APIs to expose authorized data to client
requests, for example, to be consumed by a client GUI such as a Web browser or any other HTTP client.
The functions provided by the JavaScript APIs enable server-side JavaScript applications not only to expose
data but to update, insert, and delete data, too. You can use the JavaScript APIs to perform the following
actions:
JavaScript programs are stored in the repository along with all the other development resources. When the
programs are activated, the code is stored in the repository as a runtime object.
Tip
To enable the Web Browser to display more helpful information if your JavaScript code causes an HTTP
500 exception on the SAP HANA XS Web server, ask someone with administrator privileges to start the
SAP HANA studio's Administration Console perspective and add the parameter developer_mode to the
xsengine.ini httpserver section of the Configuration tab and set it to true.
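In the xsengine.ini configuration this corresponds to a setting like the following (a sketch; the exact layout shown on the Configuration tab may differ):

```
[httpserver]
developer_mode = true
```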
Related Information
SAP HANA Extended Application Services (SAP HANA XS) supports server-side application programming in
JavaScript. The server-side application you develop uses JavaScript APIs to expose authorized data to client
requests, for example, for consumption by a client GUI such as a Web browser, SAPUI5 applications, or mobile
clients.
Prerequisites
You have the privileges granted by the role sap.hana.ide.roles::EditorDeveloper; this role is included in the
parent role sap.hana.ide.roles::Developer.
In this introductory tutorial, you create a simple application that consists of the following files:
Procedure
{
"exposed" : true,
"authentication" : { "method": "Form" }
"authorization": // Optional: see xsprivileges file
[
"helloxsjs::Execute",
"helloxsjs::Admin"
]
}
The application-privileges file does not have a name; it only has the file extension .xsprivileges. The
contents of the .xsprivileges file must be formatted according to JavaScript Object Notation (JSON)
rules. The privileges defined in a .xsprivileges file are bound to the package to which the file belongs
and can only be applied to this package and its subpackages.
Note
The .xsprivileges file lists the authorization levels available for granting to an application package;
the .xsaccess file defines which authorization level is assigned to which application package.
a. Select the helloxsjs package and from the context menu choose New File .
b. Enter the file name .xsprivileges and choose Create.
{
"privileges" :
[
{ "name" : "Execute", "description" : "Basic execution
privilege" },
{ "name" : "Admin", "description" : "Administration privilege" }
]
}
call
"_SYS_REPO"."GRANT_APPLICATION_PRIVILEGE"('"helloxsjs::Execute"','<UserName>')
10. Create the server-side JavaScript (XSJS) files that contain the application logic.
Server-side JavaScript files have the file suffix .xsjs, for example, hello.xsjs and contain the code that
is executed when SAP HANA XS handles a URL request.
a. Select the helloxsjs package and from the context menu choose New File .
b. Enter the file name hello.xsjs and choose Create.
c. Enter the following content in the .xsjs file:
$.response.contentType = "text/plain";
$.response.setBody( "Hello, World!");
.
\
helloxsjs
\
.xsapp
.xsaccess
.xsprivileges // optional
hello.xsjs
12. To view the results, select the hello.xsjs file and choose (Run) in the toolbar.
You can write server-side JavaScript using the JavaScript editor, which provides syntax validation and code
highlighting.
Code Validation
The JavaScript editor in the SAP HANA Web-based Development Workbench includes the ESLint open-source
library, which helps to validate JavaScript code. All JavaScript files (.js, .xsjs, and .xsjslib) are checked.
The editor highlights any code that does not conform to ESLint standards and flags each issue detected
according to its severity as follows: error (red); warning (yellow); information (blue).
By hovering over a flag, you can display a tooltip that describes one or more detected issues with the following
information:
● Category: Classifies the issue, for example, possible error, best practice, stylistic issue, and others.
● Rule ID: Defines the issue. For example, the rule ID semi enforces the use of semicolons.
● Message: Details the issue.
Example
The ESLint code check is enabled by default. In the Editor settings, you can configure the severity level of the
code check, disable the code check entirely, or configure it so that it is not triggered every time you make a
change to the code.
Quick Fixes
ESLint quick fixes allow you to correct errors quickly and are supported for the following ESLint rules:
● http://eslint.org/docs/rules/eqeqeq
● http://eslint.org/docs/rules/quotes
● http://eslint.org/docs/rules/no-comma-dangle
● http://eslint.org/docs/rules/no-extra-semi
● http://eslint.org/docs/rules/no-space-before-semi
● http://eslint.org/docs/rules/semi
● http://eslint.org/docs/rules/space-infix-ops
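The eqeqeq rule in the list above exists because JavaScript's loose equality operator coerces operand types, which can hide bugs; the following snippet shows the behavior that the corresponding quick fix guards against:

```javascript
// Loose equality (==) coerces operand types before comparing,
// which is why ESLint's eqeqeq rule flags it.
var loose = (0 == "0");    // true: "0" is coerced to the number 0
var strict = (0 === "0");  // false: different types, no coercion

console.log('0 == "0"  ->', loose);
console.log('0 === "0" ->', strict);
```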
Beautify
Beautify JavaScript to make the source code more readable. Once applied, the code is reformatted by parsing
the JavaScript source code into components, (for example, statements, blocks, and so on) and applying the
correct style and format to the code. The beautifier, for example:
You can beautify any open JavaScript file by choosing Beautify from the context menu.
Related Information
ESLint Rules
Configuring ESLint
The function flow is a tool that allows you to view an outline list and code flow visualization of the functions
called within a JavaScript file. It also lets you display code previews of function definitions as well as navigate
directly to the function definitions in the source code.
Context
The function flow provides a graphical and list view of the functions in a file.
● Function graph: Shows the calls being made from and to the currently active function (based on your
cursor position):
○ Red node: the currently active function
○ Blue nodes: functions defined in the displayed file
○ Gray nodes: functions defined elsewhere
○ Dotted arcs: calls to functions defined in the same file
○ Solid arcs: calls to functions defined elsewhere
● Function list: Shows an outline of the functions and objects (which in turn contain functions) present in the
code. The active function shows, when expanded, the calls being made from and to it:
Procedure
1. Open a JavaScript file and choose (Show function flow) in the toolbar.
The Outline panel opens on the right. The caller-callee relationships of the function or object in which the
cursor is positioned are shown in both the graph and list.
Example:
2. Choose an option.
The cursor focus changes in both the list and graph. The graph is now centered on
the newly selected function.
Note
The function flow reacts only to cursor position changes. A double-click causes
the cursor position to change, placing it inside the selected entity and making
the function flow (both graph and list) react and update accordingly. Similarly,
whenever the cursor is moved inside the code, the information in the Outline
panel is updated automatically.
● Navigate backward and forward in the graph: Choose the back arrow button located in the function graph
toolbar to navigate back to the previously selected function. Navigate forward again by choosing the
forward arrow.
● Highlight a function in the code: Click the function node in the graph.
● Move the cursor freely within the file while retaining the current state of the graph: Choose the (Lock)
button in the function graph toolbar. The graph is now locked and will not respond to cursor changes until
you release the lock.
● Highlight a function call in the code: Click the function call in the list. The section of code is
highlighted where the corresponding call is made.
● Display a function definition as hover text: In the source code, press CTRL and hover over the function
call for a couple of seconds. Note that the code preview only works across files if the files are located
within the same package.
● Go to a function definition: In the source code, press CTRL + CLICK on the function call. Note that code
navigation only works across files if the files are located within the same package.
You can use an immediate feedback session to get a better understanding of how your XSJS function works
and performs.
Prerequisites
1. Start the SAP HANA studio and open the Administration perspective.
2. On the Configuration tab, add a section called xsengine.ini debugger (if it does not exist) and add (or
set) the following parameter: enabled=true
Context
An immediate feedback session is similar to a debug session. It allows you to evaluate an XSJS function by
stepping through code execution, inspecting variables, and modifying statements. The details of the function
execution are displayed in the Immediate Feedback panel:
● Function Evaluation
The time slider indicates which steps of the function have been executed. The variable list shown just
below the time slider displays the values of the variables in each step.
● Called Function List
If other functions are called by the selected function, they are listed here.
● Database Performance
Procedure
1. In an XSJS JavaScript file, position the cursor inside a JavaScript function declaration on the top level and
Note
The function must be an XSJS JavaScript function that is defined as a function declaration, for
example, in the following form:
Other types of function declarations are not supported. In addition, the function to be investigated
must be on the top level in the XSJS file, that is, it must not be nested.
The Immediate Feedback panel opens on the right and shows how the function is evaluated when it is
called without arguments. The execution step shown on the slider reflects your current cursor position in
the source code.
2. Choose an option.
● Step through the execution: In the Function Evaluation panel, use the time slider to move step execution
forward or backward. The variable list shows the values of the variables as evaluated in each step. Each
variable newly assigned in a step is marked with an asterisk (*).
● Step into called functions: In the Called Function List pane, select the checkboxes of the functions that
you want to step into and evaluate in the next immediate feedback run.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) external attacks such as cross-site scripting and forgery, and insufficient
authentication.
The following list illustrates the areas where special attention is required to avoid security-related problems
when writing server-side JavaScript. Each of the problems highlighted in the list is described in detail in its own
dedicated section:
● SSL/HTTPS
Enable secure HTTP (HTTPS) for inbound communication required by an SAP HANA application.
● Injection flaws
In the context of SAP HANA Extended Application Services (SAP HANA XS) injection flaws concern SQL
injection that modifies the URL to expand the scope of the original request.
● Cross-site scripting (XSS)
Web-based vulnerability that involves an attacker injecting JavaScript into a link with the intention of
running the injected code on the target computer.
● Broken authentication and session management
Leaks or flaws in the authentication or session management functions allow attackers to impersonate
users and gain access to unauthorized systems and data.
● Insecure direct object references
An application lacks the proper authentication mechanism for target objects.
● Cross-site request forgery (XSRF)
Exploits the trust boundaries that exist between different Web sites running in the same web browser
session.
● Incorrect security configuration
Attacks against the security configuration in place, for example, authentication mechanisms and
authorization processes.
● Insecure cryptographic storage
Sensitive information such as logon credentials is not securely stored, for example, with encryption tools.
● Missing restrictions on URL Access
Sensitive information such as logon credentials is exposed.
● Insufficient transport layer protection
Network traffic can be monitored, and attackers can steal sensitive information such as logon credentials
or credit-card data.
● Invalid redirects and forwards
Web applications redirect users to other pages or use internal forwards in a similar manner.
● XML processing issues
Potential security issues related to processing XML as input or to generating XML as output.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) external attacks such as cross-site scripting and forgery, and insufficient
authentication. You can set up SAP HANA to use secure HTTP (HTTPS).
SSL/HTTPS Problem
Incoming requests for data from client applications use secure HTTP (HTTPS), but the SAP HANA system is
not configured to accept the HTTPS requests.
SSL/HTTPS Recommendation
Ensure the SAP Web Dispatcher is configured to accept incoming HTTPS requests. For more information, see
the SAP HANA Security Guide.
Note
The HTTPS requests are forwarded internally from the SAP Web Dispatcher to SAP HANA XS as HTTP
(clear text).
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) injection flaws. Typically, injection flaws concern SQL injection and involve modifying
the URL to expand the scope of the original request.
The XS JavaScript API provides a number of different ways to interact with the SAP HANA database by using
SQL commands. By default, these APIs allow you to read data, but they can also be used to update or delete
data, and even to grant (or revoke) access rights at runtime. As a general rule, it is recommended to write a
query which is either a call to an SQLScript procedure or a prepared statement where all parameters specified
in the procedure or statement are escaped by using either setString or setInt, as illustrated in the
examples provided in this section. Avoid using dynamic SQL commands with parameters that are not escaped.
In the context of SAP HANA XS, injection flaws mostly concern SQL injection, which can occur in the SAP
HANA XS JavaScript API or SQL script itself (both standard and dynamic). For example, the URL http://
xsengine/customer.xsjs?id=3 runs the code in the JavaScript file customer.xsjs shown below:
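The customer.xsjs listing is not reproduced here; the core flaw, however, can be illustrated in plain JavaScript. Concatenating the unchecked id parameter directly into the SQL text lets a caller widen the query (the query text below is a hypothetical example, not taken from the guide):

```javascript
// Hypothetical vulnerable pattern: the request parameter is pasted
// directly into the SQL text instead of being bound as a parameter.
function buildQuery(id) {
  return "SELECT * FROM customer WHERE id = " + id;
}

// Expected use: .../customer.xsjs?id=3
console.log(buildQuery("3"));

// Attack: the caller sends id=3 OR 1=1, and the WHERE clause
// now matches every row in the table.
console.log(buildQuery("3 OR 1=1"));
```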
Note
SAP HANA XS applications rely on the authorization provided by the underlying SAP HANA database.
Users accessing an SAP HANA XS based application require the appropriate privileges on the database objects
to execute database queries. The SAP HANA authorization system will enforce the appropriate authorizations.
This means that in those cases, even if the user can manipulate a query, they will not gain more access than is
assigned to them through roles or privileges. Definer-mode SQLScript procedures are an exception to this rule
that you need to take into consideration.
To prevent injection flaws in the JavaScript API, use prepared statements to create a query and place-holders
to fill with results of function calls to the prepared-statement object; to prevent injection flaws in standard
SQLScript, use stored procedures that run in caller mode; in caller mode, the stored procedures are executed with
the credentials of the logged-on HANA user. Avoid using dynamic SQL if possible. For example, to guard against
the SQL-injection attack illustrated in the problem example, you could use the following code:
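As a sketch of this recommendation (the table and column names are hypothetical; the code runs only inside the SAP HANA XS runtime, using the XSJS database API):

```javascript
// Sketch only: assumes the XSJS runtime ($.db, $.request, $.response)
// and a hypothetical "customer" table.
var id = $.request.parameters.get("id");

var conn = $.db.getConnection();
// The placeholder (?) is filled via setInt, so the value is escaped
// by the API and cannot alter the SQL text itself.
var pstmt = conn.prepareStatement("SELECT name FROM customer WHERE id = ?");
pstmt.setInt(1, parseInt(id, 10));
var rs = pstmt.executeQuery();

var names = [];
while (rs.next()) {
  names.push(rs.getString(1));
}
rs.close();
pstmt.close();
conn.close();

$.response.contentType = "application/json";
$.response.setBody(JSON.stringify(names));
```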
Prepared statements enable you to create the actual query you want to run and then create several
placeholders for the query parameters. The placeholders are replaced with the proper function calls to the
prepared statement object. The calls are specific for each type in such a way that the SAP HANA XS JavaScript
API is able to properly escape the input data. For example, to escape a string, you can use the setString
function.
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide and the SAP HANA
SQL System Views and Reference.
If you use server-side JavaScript to write your application code, bear in mind the potential for (and risk of)
cross-site scripting (XSS) attacks. Cross-site scripting is a Web-based vulnerability that involves an attacker
injecting JavaScript into a link with the intention of running the injected code on the target computer. There
are two types of cross-site scripting attacks:
● Reflected (non-persistent)
Code affects individual users in their local Web browser
● Stored (persistent)
Code is stored on a server and affects all users who visit the served page
A successful cross-site scripting attack could result in a user obtaining elevated privileges or access to
information that should not be exposed.
Since there are currently no libraries in the standard SAP HANA XS JavaScript API that provide proper
escaping, we recommend that you do not write custom interfaces but instead rely on well-tested technologies
supplied by SAP, for example, OData or JSON together with SAPUI5 libraries.
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) attack against authentication infrastructure. Leaks or flaws in the authentication or
session management functions allow attackers to impersonate users and gain access to unauthorized systems
and data.
Authentication Problem
Leaks or flaws in the authentication or session management functions allow attackers to impersonate users;
the attackers can be external as well as users with their own accounts to obtain the privileges of those users
they impersonate.
Authentication Recommendation
Use the built-in SAP HANA XS authentication mechanism and session management (cookies). For example,
use the "authentication" keyword to enable an authentication method and set it according to the
authentication method you want to implement, for example: SAP logon ticket, form-based, or basic (user name
and password) in the application's .xsaccess file, which ensures that all objects in the application path are
available only to authenticated users.
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) attacks using insecure references to objects.
An SAP HANA XS application is vulnerable to insecure direct object reference if the application lacks the proper
authentication mechanism for target objects.
Make sure that only authenticated users are allowed to access a particular object. In the context of SAP HANA
XS, use the "authentication" keyword to enable an authentication method and set it according to the
authentication method you implement, for example: SAP logon ticket, form-based, or basic (user name and
password) in the application's .xsaccess file, which ensures that all objects in the application path are
available only to authenticated users.
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) cross-site request forgery (XSRF). Cross-site request forgery is a web-based
vulnerability
that exploits the trust boundaries that exist between different websites running in the same web browser
session.
Since there are no clear trust boundaries between different Web sites running in the same Web-browser
session, an attacker can trick users (for example, by luring them to a popular Web site that is under the
attacker's control) into clicking a specific hyperlink. The hyperlink displays a Web site that performs actions on
the visitor's behalf, for example, in a hidden iframe. If the targeted end user is logged in and browsing using an
account with elevated privileges, the XSRF attack can compromise the entire Web application.
SAP HANA XS provides a way to include a random token in the POST submission which is validated on the
server side. Only if this token is unpredictable for attackers can cross-site request-forgery attacks be
prevented. The easiest way to prevent cross-site request-forgery attacks is to use the standard SAP HANA XS
cookie. This cookie is randomly and securely generated and provides a good random token which is
unpredictable by an attacker ($.session.getSecurityToken()).
To protect SAP HANA XS applications from cross-site request-forgery (XSRF) attacks, make sure you always
set the prevent_xsrf keyword in the application-access (.xsaccess) file to true, as illustrated in the following
example:
{
"prevent_xsrf" : true
}
Note
The default setting is false, which means there is no automatic prevention of XSRF attacks. If no value is
assigned to the prevent_xsrf keyword, the default setting (false) applies.
The following client-side JavaScript code snippet shows how to use the HTTP request header to fetch, check,
and apply the XSRF security token required to protect against XSRF attacks.
<html>
<head>
<title>Example</title>
<script id="sap-ui-bootstrap" type="text/javascript"
src="/sap/ui5/1/resources/sap-ui-core.js"
data-sap-ui-language="en"
data-sap-ui-theme="sap_goldreflection"
data-sap-ui-libs="sap.ui.core,sap.ui.commons,sap.ui.ux3,sap.ui.table">
</script>
<script type="text/javascript" src="/sap/ui5/1/resources/jquery-sap.js"></script>
<script>
function doSomething() {
$.ajax({
url: "logic.xsjs",
type: "GET",
beforeSend: function(xhr) {
xhr.setRequestHeader("X-CSRF-Token", "Fetch");
},
success: function(data, textStatus, XMLHttpRequest) {
var token = XMLHttpRequest.getResponseHeader('X-CSRF-Token');
var data = "somePayLoad";
$.ajax({
url: "logic.xsjs",
type: "POST",
data: data,
beforeSend: function(xhr) {
xhr.setRequestHeader("X-CSRF-Token", token);
},
success: function() {
alert("works");
},
error: function() {
alert("works not");
}
});
}
});
}
</script>
</head>
<body>
<div>
<a href="#" onClick="doSomething();">Do something</a>
</div>
</body>
</html>
For more information about security in SAP HANA, see the SAP HANA Security Guide.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) attacks against the security configuration in place, for example, authentication
mechanisms and authorization processes.
Applications should have proper authentication in place, for example, by using SAP HANA built-in
authentication mechanisms and, in addition, the SAP HANA XS cookie and session handling features.
Application developers must also consider and control which paths are exposed by HTTP to the outside world
and which of these paths require authentication.
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) attacks that exploit insecure storage or a lack of encryption of data assets.
Storage-Encryption Problem
To prevent unauthorized access, for example, in the event of a system break-in, data such as user logon
credentials must be stored in an encrypted state.
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) unauthorized access to URLs.
Unauthenticated users have access to URLs that expose confidential (unauthorized) data.
Make sure you have addressed the issues described in "Broken Authentication and Session Management" and
"Insecure Direct Object References". In addition, check if a user is allowed to access a specific URL before
actually executing the code behind that requested URL. Consider putting an authentication check in place for
each JavaScript file before continuing to send any data back to the client's Web browser.
Tip
For more information about Security in SAP HANA, see the SAP HANA Security Guide.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) insufficient protection of the transport layer.
Without transport-layer protection, the user's network traffic can be monitored, and attackers can steal
sensitive information such as logon credentials or credit-card data.
Turn on transport-layer protection in SAP HANA XS; the procedure is described in the SAP HANA security
guide.
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
If you use server-side JavaScript to write your application code, bear in mind the potential for (and risk of)
redirection and internal forwarding from the requested Web page.
Web applications frequently redirect users to other pages or use internal forwards in a similar manner.
Sometimes the target page is specified in an unvalidated parameter. This enables an attacker to
choose the destination page, opening the possibility of phishing attacks or the spamming of search engines.
To prevent unvalidated redirects or forwards, application developers should validate the requested destination
before forwarding, for example, by checking if the destination is present in a white list. If the destination URL
specified in the redirection request is not present in the white list, the redirection is refused.
Tip
Alternatively, you can avoid using direct user input as the redirection target; instead, use the input only as a
key that determines the final destination for the redirection, as illustrated in the following example:
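A minimal sketch of this approach (the lookup table, key names, and the "target" parameter are illustrative assumptions, not taken from this guide): the client supplies a short key, the server maps it to a fixed destination, and anything not in the table is refused.

```javascript
// Hypothetical whitelist: the client supplies only one of these keys,
// never a raw URL. Unknown keys yield null, i.e. the redirect is refused.
var REDIRECT_TABLE = {
    "home": "/sap/myapp/index.html",
    "help": "/sap/myapp/help.html"
};

function resolveRedirect(key) {
    return REDIRECT_TABLE.hasOwnProperty(key) ? REDIRECT_TABLE[key] : null;
}

// XSJS usage sketch:
// var target = resolveRedirect($.request.parameters.get("target"));
// if (target !== null) {
//     $.response.status = $.net.http.MOVED_TEMPORARILY;
//     $.response.headers.set("Location", target);
// } else {
//     $.response.status = $.net.http.BAD_REQUEST;
// }
```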
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
If you choose to use server-side JavaScript to write your application code, you need to bear in mind the
potential for (and risk of) attacks aimed at the process used to parse XML input and generate the XML output.
There are several potential security issues related to processing XML as input or to generating XML as output.
In addition, problems with related technologies (for example, XSL Transformations or XSLT) can enable the
inclusion of other (unwanted) files.
Bear in mind the following rules and suggestions when processing XML input or generating XML output:
● When processing XML that originates from an untrusted source, disable DTD processing and entity
expansion unless strictly required. This helps prevent "Billion Laughs" attacks (a denial-of-service attack
based on nested entity expansion), which can bring down the processing code and, depending on the
configuration of the machine, an entire server.
● To prevent the inclusion (insertion) of unwanted and unauthorized files, restrict the ability to open files or
URLs even in requests included in XML input that comes from a trusted source. In this way, you prevent the
disclosure of internal file paths and internal machines.
● Ensure proper limits are in place on the maximum amount of memory that the XML processing engine can
use, the number of nested entities that the XML code can have, and the maximum length of entity names,
attribute names, and so on. This practice helps prevent the triggering of potential issues.
Tip
For more information about security in SAP HANA, see the SAP HANA Security Guide.
The elements defined in normal server-side JavaScript programs cannot be accessed from other JavaScript
programs. To enable the reuse of program elements, SAP HANA Extended Application Services (SAP HANA XS)
supports server-side JavaScript libraries.
Server-side JavaScript libraries are a special type of JavaScript program that can be imported and called in
other JavaScript programs. You can use JavaScript libraries to perform simple, repetitive tasks, for example, to
handle forms and form data, to manipulate date and time strings, to parse URLs, and so on.
The following example shows how to import a JavaScript mathematics library using the import function:
● Package name
Full name of the package containing the library object you want to import, for example, sap.myapp.lib
● Library name
Name of the library object you want to import, for example, math
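Using the placeholder names above (package sap.myapp.lib, library math), the import might look as follows; the add() function is a hypothetical export used only for illustration.

```javascript
// Import the library object "math" from the package "sap.myapp.lib".
$.import("sap.myapp.lib", "math");

// After the import, the library is addressable as $.<package path>.<library>:
var sum = $.sap.myapp.lib.math.add(2, 3);  // add() is a hypothetical export
```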
Note
Restrictions apply to the characters you can use in the names of JavaScript libraries and application
packages. Permitted characters are: upper- and lower-case letters (Aa-Zz), digits 0-9, and the dollar sign
($).
The standard JavaScript limitations apply to the characters you can use in either the name of the XSJS library
you create or the name of the package where the library is deployed. For example, you cannot use the hyphen
(-) in the name of an XSJS library or, if you are referencing the library, the name of a package in the application
package path. To prevent problems with activation of the object in the SAP HANA repository, you must follow
the standard rules for accessing JavaScript property objects by name. The following example shows how to
use square brackets and quotes (["<STRING>"]) to access an object whose name uses non-permitted
characters such as a hyphen (-):
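For instance, if a package segment in the path were named my-lib (the hyphen makes dot notation invalid; all names here are illustrative), the library could be addressed like this:

```javascript
$.import("acme.my-lib", "math");

// "$.acme.my-lib.math" is not valid JavaScript, so address the hyphenated
// package segment with bracket notation (["<STRING>"]) instead:
var math = $.acme["my-lib"].math;
var sum = math.add(2, 3);  // add() is a hypothetical export
```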
Related Information
Server-side JavaScript libraries are a special type of JavaScript program that can be imported and called in
other JavaScript programs. You can use JavaScript libraries to perform simple, repetitive tasks, for example:
handle forms and form data, manipulate date and time strings, parse URLs, and so on.
Context
JavaScript libraries are internally developed extensions for SAP HANA. The libraries exist in the context of a
package, which is referenced when you import the library. The following example of a JavaScript library
displays the word "Hello" along with a name and an exclamation mark as a suffix.
Note
This procedure uses the illustrated example JavaScript library to explain what happens when you import a
JavaScript library, for example, which objects are created, when, and where. If you have your own library to
import, substitute the library names and paths shown in the steps below as required.
To import a JavaScript library for use in your server-side JavaScript application, perform the following tasks:
Procedure
$.import("<path.to.your.library>", "greetLib");
var greeting = $.<path.to.your.library>.greetLib.greet("World");
$.response.setBody(greeting);
The import creates the following objects:
$.<path.to.your.library>.greetLib
$.<path.to.your.library>.greetLib.greet()
○ Pre-import checks:
○ It is not possible to import the referenced library if the import operation would override any
predefined runtime objects.
○ Do not import the referenced library if it is already present in the package.
○ Library context
Imported libraries exist in the context defined by their repository location.
Server-side JavaScript libraries are a special type of JavaScript program that can be imported and called in
other JavaScript programs. You can use JavaScript libraries to perform simple, repetitive tasks, for example, to
handle forms and form data, to manipulate date and time strings, to parse URLs, and so on.
Context
JavaScript libraries are internally developed extensions for SAP HANA. However, you can write your own
libraries, too. JavaScript libraries exist in the context of a package, which is referenced when you import the
library. To write a JavaScript library to use in your server-side JavaScript application, perform the following
steps:
Procedure
1. Create the file that contains the JavaScript library you want to add to the package and make available for
import.
In SAP HANA XS, server-side JavaScript libraries have the file extension .xsjslib, for example
greetLib.xsjslib.
a. Select the package where you want to create the new JavaScript library file and from the context menu
choose New File .
b. Enter a file name, for example, greetLib.xsjslib, and choose Create.
c. Enter the following content in the greetLib.xsjslib file.
This example creates a simple library that displays the word “Hello” along with a supplied name and
adds an exclamation point (!) as a suffix.
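A library body consistent with that description might look like this; the exact greeting format is an assumption.

```javascript
// greetLib.xsjslib -- returns "Hello <name>!" for the supplied name.
function greet(name) {
    return "Hello " + name + "!";
}
```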
SAP HANA Extended Application Services (SAP HANA XS) provides a set of server-side JavaScript application
programming interfaces (API) that enable you to configure your applications to interact with SAP HANA.
The SAP HANA XS JavaScript Reference lists all the functions that are available for use when programming
interaction between your application and SAP HANA. For example, you can use the database API to invoke SQL
statements from inside your application, or access details of the current HTTP request for SAP HANA data with
the request-processing API. SAP HANA XS includes the following set of server-side JavaScript APIs:
● Database
Enables access to the SAP HANA database by means of SQL statements. For example, you can open a
connection to commit or roll back changes in SAP HANA, to prepare stored procedures (or SQL
statements) for execution, or to return details of a result set or a result set's metadata.
● Outbound connectivity
Enables outbound access to a defined HTTP destination that provides services which an application can
use. For example, you can read the connection details for an HTTP destination, request data, and set
details of the response body. You can also set up an SMTP connection for use by outgoing multipart
e-mails.
● Request processing
Enables access to the context of the current HTTP request, for example, to read requests and write
responses. You can use the functions provided by this API to manipulate the content of the request and
the response.
● Session
Enables access to the SAP HANA XS session, for example, to determine the language used in the session
or whether a user has the privileges required to run an application.
● Job Schedule
Enables access to the job-scheduling interface, which allows you to define and trigger recurring tasks
that run in the background. The XS jobs API allows you to add and remove schedules from jobs.
● Security
Enables access to the $.security.crypto namespace and the classes AntiVirus and Store, which provide
tools that allow you to configure a secure store, set up anti-virus scans, and generate hashes.
● Trace
Enables access to the various trace levels you can use to generate and log information about application
activity. You can view trace files on the Diagnosis Files tab of the SAP HANA studio's Administration
perspective.
● Utilities
Enables access to utilities that you can use to parse XML and manipulate Zip archives, for example, to zip
and unzip files, add and remove entries from Zip archives, and encrypt Zip archives with password
protection.
● XS Data Services
Provides access to a library of JavaScript utilities that enable server-side JavaScript applications to
consume data models that are defined using Core Data Services.
● XS Procedures
Provides access to a library of JavaScript utilities that enable server-side JavaScript applications to call
SAP HANA stored procedures as if the procedures were JavaScript functions.
Restriction
XSProc is intended only for use with database connections made with the old $.db API. It
is not recommended to use XSProc with $.hdb connections. For $.hdb connections, use
$hdb.loadProcedure instead.
Database API
The SAP HANA XS Database API ($.hdb) provides tools that enable simple and convenient access to the
database.
Caution
The $.hdb namespace is intended as a replacement for the older $.db namespace. Since different
database connections are used for the $.hdb and $.db APIs, avoid using both APIs in a single http-
request, for example, to update the same tables as this can lead to problems, including deadlocks.
You can use the Database API for the following operations:
● $.hdb.Connection
Establish a connection to the SAP HANA database
● $.hdb.ProcedureResult
Represents the result of a stored procedure call to the SAP HANA database
● $.hdb.ResultSet
Represents the result of a database query
The following example shows how to use the database API to connect to the SAP HANA database, commit
some changes, and end the current transaction.
Note
By default, auto-commit mode is disabled, which means that all database changes must be explicitly
committed.
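A sketch of this pattern; the table "MYSCHEMA"."MYTABLE" is an invented example.

```javascript
// Open a connection; auto-commit is disabled by default.
var connection = $.hdb.getConnection();
try {
    connection.executeUpdate(
        'INSERT INTO "MYSCHEMA"."MYTABLE" VALUES (?, ?)', 1, "first row");
    // Explicitly commit the change and end the current transaction.
    connection.commit();
} catch (e) {
    // Undo the change if anything went wrong.
    connection.rollback();
    throw e;
} finally {
    connection.close();
}
```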
The following example of usage of the SAP HANA XS database API shows how to establish a connection with
SAP HANA and return a result set from the specified procedure call. The example code assumes that a
procedure exists with the following signature:
PROCEDURE 'DB_EXAMPLE'.icecream.shop::sell(
IN flavor VARCHAR,
Note that the result can be accessed as if it were a JSON object with a structure similar to the following
example: {change: 1.50, $resultSets:[....]} .
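A call consistent with the fragment above might be sketched as follows; only the flavor parameter shown in the signature is used here, and any further parameters of the (truncated) signature are omitted.

```javascript
var connection = $.hdb.getConnection();

// Load the stored procedure as a callable JavaScript function.
var sell = connection.loadProcedure("DB_EXAMPLE", "icecream.shop::sell");

// Parameters are passed by name; further parameters omitted.
var result = sell({flavor: "vanilla"});

// Access the result like a JSON object, for example result.change
// or result.$resultSets[0].
connection.close();
```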
Outbound API
The Outbound API ($.net) provides tools that you can use to perform the following actions:
● $.net.SMTPConnection
For sending $.net.Mail objects by means of an SMTP connection
● $.net.Mail
For constructing and sending multipart e-mails
● $.net.http
HTTP(s) client (and request) classes for outbound connectivity and an HTTP(s) destination class that hold
metadata, for example: host, port, useSSL.
The following example shows how to use the $.net.SMTPConnection class to send e-mail objects
($.net.Mail) by means of an SMTP connection object:
Note
If mandatory information is missing or an error occurs during the send operation, the mail.send() call
fails and returns an error.
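A sketch of this pattern; all addresses are placeholders, and the SMTP configuration itself is assumed to be maintained for the XS engine.

```javascript
// Construct a multipart e-mail.
var mail = new $.net.Mail({
    sender:  {address: "sender@example.com"},
    to:      [{address: "receiver@example.com"}],
    subject: "XSJS mail demo",
    parts:   [new $.net.Mail.Part({
        type: $.net.Mail.Part.TYPE_TEXT,
        text: "Hello from SAP HANA XS.",
        contentType: "text/plain"
    })]
});

// Send the mail object over an SMTP connection object.
var smtpConnection = new $.net.SMTPConnection();
smtpConnection.send(mail);
smtpConnection.close();
```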
The following example of server-side JavaScript shows how to use the outbound API to get (read) an HTTP
destination. You can also set the contents of the response, for example, to include details of the header, body,
and any cookies. For HTTPS connections, you need to maintain a certificate (a CA or an explicit server
certificate) in a trust store; the connection is validated against this certificate.
Tip
You define the HTTP destination in a text file using keyword=value pairs. You must activate the HTTP
destination in the SAP HANA repository. After activation, you can view details of the HTTP destination in
the SAP HANA XS Administration tool.
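A minimal sketch, assuming an HTTP destination named yahoo in the package testApp (the names used in the tutorial later in this guide):

```javascript
// Read the metadata of the activated HTTP destination.
var dest = $.net.http.readDestination("testApp", "yahoo");

// Create a client and issue a GET request against the destination.
var client = new $.net.http.Client();
var request = new $.net.http.Request($.net.http.GET, "/");
client.request(request, dest);
var response = client.getResponse();

// Forward the body of the remote response to the caller.
$.response.contentType = "text/plain";
$.response.setBody(response.body.asString());
```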
The Request-Processing API ($.web) provides access to the body of HTTP request and response entities. For
example, you can use the following classes:
● $.web.Body
Represents the body of an HTTP request entity and provides access to the data included in the body of the
HTTP request entity
● $.web.EntityList
Represents a list of request or response entities; the EntityList holds WebEntityRequest or
WebEntityResponse objects.
● $.web.TupelList
Represents a list of name-value pairs. The TupelList is a container that provides tuples for cookies,
headers, and parameters. A “tuple” is a JavaScript object with the properties “name” and “value”.
● $.web.WebRequest
Enables access to the client HTTP request currently being processed
● $.web.WebResponse
Enables access to the client HTTP response currently being processed for the corresponding request
object
● $.web.WebEntityRequest
Represents an HTTP request entity and provides access to the entity's metadata and (body) content.
● $.web.WebEntityResponse
Represents the HTTP response currently being populated
The following example shows how to use the request-processing API to display the message “Hello World” in a
browser.
$.response.contentType = "text/plain";
$.response.setBody( "Hello, World !");
In the following example, you can see how to use the request-processing API to get the value of parameters
describing the name and vendor ID of a delivery unit (DU) and return the result set in JSON-compliant form.
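A sketch of this pattern; the parameter names and the use of the _SYS_REPO.DELIVERY_UNITS view are assumptions made for illustration.

```javascript
// Read the URL parameters, e.g. ?delivery_unit=...&vendor=...
var name   = $.request.parameters.get("delivery_unit");
var vendor = $.request.parameters.get("vendor");

var connection = $.db.getConnection();
var statement = connection.prepareStatement(
    'SELECT DELIVERY_UNIT, VENDOR FROM "_SYS_REPO"."DELIVERY_UNITS"' +
    ' WHERE DELIVERY_UNIT = ? AND VENDOR = ?');
statement.setString(1, name);
statement.setString(2, vendor);
var resultSet = statement.executeQuery();

// Collect the rows and return them in JSON-compliant form.
var rows = [];
while (resultSet.next()) {
    rows.push({
        deliveryUnit: resultSet.getString(1),
        vendor: resultSet.getString(2)
    });
}
statement.close();
connection.close();

$.response.contentType = "application/json";
$.response.setBody(JSON.stringify({entries: rows}));
```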
In the following example of the request-processing API, we show how to access the request's metadata and
body and, in addition, how to set and send the response.
// send response
$.response.contentType = "text/plain";
$.response.setBody("result: " + result);
Session API
Enables access to the SAP HANA XS session, for example, to determine the language used in the session or
check if a user has the privileges required to run an application.
You can use the XS JavaScript $.session API to request and check information about the currently open
sessions. For example, you can find out the name of a user who is currently logged on to the database or get
the session-specific security token. The $.session API also enables you to check if a user has sufficient
privileges to call an application. The following example checks if the user has the execute privilege that is
required to run an application. If the check reveals that the user does not have the required privilege, an error
message is generated indicating the name of the missing privilege.
if (!$.session.hasAppPrivilege("sap.xse.test::Execute")) {
$.response.setBody("Privilege sap.xse.test::Execute is missing");
$.response.status = $.net.http.INTERNAL_SERVER_ERROR;
}
In SAP HANA XS, a scheduled job is created by means of an .xsjob file, a design-time file you commit to (and
activate in) the SAP HANA repository. The .xsjob file can be used to define recurring tasks that run in the
background; the Job Schedule API allows developers to add and remove schedules from such jobs.
● Job
$.jobs.Job represents a scheduled XS job
● JobLog
$.jobs.JobLog provides access to the log entries of a scheduled job
● JobSchedules
$.jobs.JobSchedules enables control of an XS job's schedules.
Note
It is not possible to call the $.request and $.response objects as part of an XS job.
The XS jobs API $.jobs.Job enables you to add schedules to (and remove schedules from) jobs defined in
an .xsjob file.
The following example of server-side JavaScript shows how to use the Job Schedule API to add a schedule to an
existing job and delete a schedule from an existing job.
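A sketch of the add/delete pattern, using the file names mentioned in the notes below; the xscron expression and the schedule parameters are illustrative.

```javascript
// Reference the job defined in myJob.xsjob; the SQL connection
// configuration is used to modify (not execute) the job.
var myJob = new $.jobs.Job({
    uri:   "myJob.xsjob",
    sqlcc: "sqlcc/otheruser.xssqlcc"
});

// Add a schedule: run at second 59 of every minute (xscron syntax).
var id = myJob.schedules.add({
    description: "Schedule added at runtime",
    xscron: "* * * * * * 59",
    parameter: {greeting: "hello"}
});

// Remove the schedule again using the returned ID.
myJob.schedules.delete({id: id});
```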
If the XS job file referred to in the URI is not in the same package as the XS JavaScript or SQLScript function
being called, you must add the full package path to the XS job file specified in the URI illustrated in line 1 of the
example above, for example, </path/to/package.>MyXSjob.xsjob.
Note
In addition, the SQL connection defined in sqlcc/otheruser.xssqlcc is used to modify the job; it is not
used to execute the job specified in myJob.xsjob.
To understand the cron-like syntax required by the xscron job scheduler, use the following examples:
● 2013 * * fri 12 0 0
Run the job every Friday in 2013 at 12:00.
● * * 3:-2 * 12:14 0 0
Run every hour between 12:00 and 14:00 every day between the third and second-to-last day of the month.
● * * * -1.sun 9 0 0
Run the job on the last Sunday of every month at 09:00.
Security API
The SAP HANA XS JavaScript security API $.security includes the $.security.crypto namespace and
the following classes:
● $.security.AntiVirus
Scan data with a supported external anti-virus engine
● $.security.Store
Store data securely in name-value form
The $.security.crypto namespace includes methods (for example, md5(), sha1(), and sha256()) that
enable you to compute an MD5 or SHA1/256 hash (or HMAC-MD5, HMAC-SHA1, and HMAC-SHA256).
The AntiVirus class includes a method scan() that enables you to set up a scan instance using one of the
supported anti-virus engines. The Store class enables you to set up a secure store for an SAP HANA XS
application; the secure store can be used to store sensitive information either at the application level
(store()) or per user (storeForUser()).
The following code example shows how to use the SAP HANA XS virus-scan interface (VSI) to scan a specific
object type: a Microsoft Word document.
For more information about which antivirus engines SAP HANA supports, see SAP Note 786179.
The following code example shows how to set up a simple scan for data uploads using the SAP HANA XS virus-
scan interface.
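A sketch of such a scan, assuming the data to check arrives in the request body and a default scan engine is configured for the system.

```javascript
try {
    // Uses the system's default engine; an engine name could be
    // passed to the constructor instead.
    var av = new $.security.AntiVirus();
    var data = $.request.body.asArrayBuffer();
    // scan() throws if an infection is found (or the engine fails).
    av.scan(data);
    $.response.setBody("upload is clean");
} catch (e) {
    $.response.status = $.net.http.BAD_REQUEST;
    $.response.setBody("upload rejected: " + e.message);
}
```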
The SAP HANA XS $.security.Store API can be used to store data safely and securely in name-value form.
The security API enables you to define a secure store (in a design-time artifact) for each application and refer
to this design time object in the application coding.
Note
The design-time secure store is a file with the file extension “.xssecurestore”, for example,
localStore.xssecurestore; the secure-store file must include only the following mandatory content:
{}.
SAP HANA XS looks after the encryption and decryption of data and also ensures the persistency of the data.
For the stored data, you can choose between the following visibility options:
● Visible only at the application level (store())
● Visible only for a specific user (storeForUser())
function store() {
    var config = {
        name: "foo",
        value: "bar"
    };
    var aStore = new $.security.Store("localStore.xssecurestore");
    aStore.store(config);
}

function read() {
    var config = {
        name: "foo"
    };
    try {
        var store = new $.security.Store("localStore.xssecurestore");
        var value = store.read(config);
    } catch (ex) {
        // do some error handling
    }
}
Trace API
Enables access to the various trace levels you can use to generate and log information about application
activity. The specified error message is written to the appropriate trace file.
● $.trace.debug(message)
Writes the string defined in (message) to the application trace with debug level
● $.trace.error(message)
Writes the string defined in (message) to the application trace with error level
● $.trace.fatal(message)
Writes the string defined in (message) to the application trace with fatal level
● $.trace.info(message)
Writes the string defined in (message) to the application trace with info level
● $.trace.warning(message)
Writes the string defined in (message) to the application trace with warning level
Note
If tracing is enabled, messages generated by the $.trace API are logged in the SAP HANA trace file
xsengine_<host>_<Instance>_<#>.trc on the SAP HANA server, for example, in
<installation_path>/<SID>/HDB<nn>/<hostname>/trace. Trace messages with severity status
“warning”, “error” and “fatal” are also written to a similarly named alert file, for example,
xsengine_alert_<host>.trc.
Utilities API
The SAP HANA XS JavaScript Utilities API includes the $.util namespace, which contains the following
classes:
● $.util.SAXParser
Tools for parsing XML content (for example, strings, array buffers, and the content of Web response body
objects)
● $.util.Zip
Compression tools for building, modifying, extracting, and encrypting archives
Note
You can stop, reset, and resume a parsing operation. If the content to be parsed does not contain XML, the
parser throws an error.
The following code snippet shows how to use the $.util.SAXParser tools to parse the content of a
$.web.Body object.
The parser supports the following input encodings:
● UTF-8 (default)
● UTF-16
● US-ASCII
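A sketch of the parsing pattern, assuming the body of the current request contains XML:

```javascript
var parser = new $.util.SAXParser();

// Collect the element names encountered while parsing.
var elements = [];
parser.startElementHandler = function (name, attributes) {
    elements.push(name);
};
parser.characterDataHandler = function (text) {
    // process text content here
};

// Parse the request body; throws an error if the content is not XML.
parser.parse($.request.body);
```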
The SAP HANA XS JavaScript Utilities API also includes the $.util.Zip tool, which enables you to perform a
series of actions on Zip archives, for example:
● Compress files into (zip) and extract files from (unzip) a Zip archive
● Add new entries to, update existing entries in, and remove entries from a Zip archive
● Encrypt Zip archives with password protection
The following code snippets show how to use the $.util.Zip tools to work with Zip file content, for example,
by adding, updating, extracting, and deleting entries. When modeling folder hierarchies, the Zip object behaves
like an associative array; the entry names are the keys (the full paths to the indicated files). In the following
example, we add an entry to a Zip file:
In the following example, we extract an entry from a Zip file: if the entry does not exist, this returns undefined.
In the following example, we delete an entry from a Zip file: if the entry does not exist, nothing happens.
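The add, extract, and delete operations described above might be sketched as follows; the entry name and content are illustrative.

```javascript
var zip = new $.util.Zip();

// Add (or update) an entry: the key is the full path inside the archive.
zip["folder/greeting.txt"] = "Hello";

// Extract an entry; this yields undefined if the entry does not exist.
var content = zip["folder/greeting.txt"];

// Delete an entry; deleting a missing entry does nothing.
delete zip["folder/greeting.txt"];
```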
Note
There is a restriction on the amount of uncompressed data that can be extracted from a Zip archive using
the XS JS utilities API.
When using the XS JS utilities API to extract data from a Zip archive, the maximum amount of uncompressed
data allowed during the extraction process is defined with the parameter
max_uncompressed_size_in_bytes, which you can set in the zip section of the xsengine.ini
configuration file for a given SAP HANA system. If the zip section does not already exist, you must create it
and add the parameter to it, for example, using the SAP HANA Administration Console in SAP HANA studio. If
the parameter max_uncompressed_size_in_bytes is not set, a default value is assumed. The default value
is the value assigned to the property max_runtime_bytes in the jsvm section of the xsengine.ini file.
You can deactivate the global check on the amount of uncompressed data. If the global system
parameter max_uncompressed_size_in_bytes is set to -1, no check is performed on the amount of
uncompressed data generated by an extraction process using the Utilities API, unless there is a specific user
limitation in the XS JavaScript code, for example, with the maxUncompressedSizeInBytes parameter.
With the $.util.Zip class or the $.util.compression namespace, you can use the property
maxUncompressedSizeInBytes to override the global setting and reduce the amount of uncompressed data
allowed.
Note that the parameter max_uncompressed_size_in_bytes cannot be used to increase the amount of
uncompressed data allowed beyond the value specified in the global setting.
SAP HANA XS Data Services (XSDS) is a collection of tools that includes a native client for Core Data Services
(CDS) and a query builder for SAP HANA Extended Application Services (SAP HANA XS) JavaScript. The XSDS
API provides a high-level abstraction of the database API ($.db, $.hdb) and gives access to SAP HANA
artifacts such as CDS entities or stored procedures. XSDS enables server-side JavaScript applications to
consume data models that are defined using Core Data Services more efficiently.
The following example shows how to import a CDS entity and how to update a given entity instance in XSDS
managed mode.
The following example shows how to query the database using CDS model data in XSDS unmanaged mode.
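A sketch of both modes, importing a CDS entity from the SHINE demo content; the entity, key, and field names are assumptions.

```javascript
// Import the XSDS library from its delivery-unit package.
var XSDS = $.import("sap.hana.xs.libs.dbutils", "xsds");

// Import a CDS entity (package and entity names are illustrative).
var SalesOrder = XSDS.$importEntity("sap.hana.democontent.epm.data", "EPM.SO.Header");

// Managed mode: load an instance by key, change it, persist the change.
var header = SalesOrder.$get({SALESORDERID: "0500000000"});
header.GROSSAMOUNT = 100;
header.$save();

// Unmanaged mode: build and execute a query against the CDS model.
var results = SalesOrder.$query()
    .$where(SalesOrder.CURRENCY.$eq("EUR"))
    .$execute();
```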
SAP HANA XS Procedures is a library of JavaScript tools which enable you to call SAP HANA stored procedures
from server-side JavaScript (XS JS) as if the stored procedures were native JavaScript functions.
Restriction
XSProc is intended only for use with database connections made with the old $.db API. It is not
recommended to use XSProc with $.hdb connections. For $.hdb connections, use
$hdb.loadProcedure instead.
The following example shows how to consume a stored procedure using the XS Procedures API.
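A sketch of the consumption pattern, using the SHINE procedure referenced in the tutorial below; the input parameter name is an assumption.

```javascript
// Import the XS procedures library and set the temporary-table schema.
var XSProc = $.import("sap.hana.xs.libs.dbutils", "xsproc");
XSProc.setTempSchema($.session.getUsername().toUpperCase());

// Declare the stored procedure as a JavaScript function.
var getSalesPrice = XSProc.proc(
    "sap.hana.democontent.epm.Procedures::get_product_sales_price");

// Call it with a JavaScript object; table-valued results are returned
// as arrays of JavaScript objects.
var result = getSalesPrice({im_productid: "HT-1000"});
```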
Related Information
The application package you create in this tutorial includes all the artifacts you need to enable your server-side
JavaScript application to use the Outbound Connectivity API to request and obtain data via HTTP from a
service running on a remote host.
Prerequisites
● You have the privileges granted by the role sap.hana.ide.roles::EditorDeveloper; this role is included in the
parent role sap.hana.ide.roles::Developer.
● You have been assigned the HTTPDestViewer or HTTPDestAdministrator user role.
SAP HANA Extended Application Services (SAP HANA XS) includes a server-side JavaScript API that enables
outbound access to a defined HTTP destination. The HTTP destination provides services which an application
can use, for example, to read live data. In this tutorial, you create a JavaScript application that queries financial
services to display the latest stock values. The financial services are available on a remote server, whose details
are specified in an HTTP destination configuration.
Procedure
Caution
You must place the HTTP destination configuration and the XSJS application that uses it in the same
application package. An application cannot reference an HTTP destination configuration that is located
in another application package.
a. From the context menu of the testApp folder, choose New File .
b. Enter a file name, for example, yahoo.xshttpdest, and choose Create.
c. Enter the following code in the new file yahoo.xshttpdest.
host = "download.finance.yahoo.com";
port = 80;
description = "my stock-price checker";
useSSL = false;
pathPrefix = "/d/quotes.csv?f=a";
authType = none;
proxyType = none;
proxyHost = "";
proxyPort = 0;
timeout = 0;
d. If necessary, set proxyType to http and enter your proxy host and port number.
e. Save the file.
4. View the activated HTTP destination.
Note
To make changes to the HTTP Destination configuration, you must use a text editor, save the changes
and reactivate the file.
To start the SAP HANA XS Administration Tool, select the yahoo.xshttpdest file and choose
(Maintain Credentials) in the toolbar. The details of the HTTP destination are displayed.
5. Create a server-side JavaScript application that uses the HTTP destination you have defined.
The XSJS file must have the file extension .xsjs, for example, sapStock.xsjs.
Caution
You must place the XSJS application and the HTTP destination configuration it references in the same
application package. An application cannot use an HTTP destination configuration that is located in
another application package.
a. From the context menu of the testApp folder, choose New File .
b. Enter the file name sapStock.xsjs and choose Create.
c. Enter the following code in the new file sapStock.xsjs.
In this example, you define the following:
○ A variable (<stock>) that defines the name of the stock whose value you want to check, for
example SAP.DE
○ A variable (<amount>) that defines the number of stocks you want to check, for example, 100
○ A variable (<dest>) that retrieves metadata defined for the specified HTTP(S) destination, for
example: host, port, useSSL
○ A variable (<client>) that creates the client for the outbound connection
○ A variable (<req>) that enables you to add details to the request URL
○ A variable (<res>) that calculates the value of the stock/amount
○ The format and content of the response body displayed in the browser
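Given the variables listed above, sapStock.xsjs might be sketched as follows; the query syntax expected by the remote service and its plain-text response format are assumptions.

```javascript
var stock  = $.request.parameters.get("stock");   // e.g. SAP.DE
var amount = $.request.parameters.get("amount");  // e.g. 100

// Metadata of the HTTP destination defined in yahoo.xshttpdest.
var dest = $.net.http.readDestination("testApp", "yahoo");

// Client for the outbound connection.
var client = new $.net.http.Client();

// Append the stock symbol to the request URL; the destination's
// pathPrefix already ends in "?f=a", so "&s=<stock>" completes the query.
var req = new $.net.http.Request($.net.http.GET, "&s=" + stock);
client.request(req, dest);
var response = client.getResponse();

// The service is assumed to return the price as plain text.
var price = parseFloat(response.body.asString());
var res = price * parseInt(amount, 10);

$.response.contentType = "text/plain";
$.response.setBody("Value of " + amount + " " + stock + " stocks: " + res);
```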
6. Call the application in your Web browser, appending the parameters to the URL, for example:
/testApp/sapStock.xsjs?amount=100&stock=SAP.DE
The current value of your specified number of stocks is shown, for example:
7. Enter different values for the parameters amount and stock in the URL:
○ amount=250
Change the number of stocks from 100 to 250.
○ stock=MCRO.L
Change the name of the stock to check from SAP.DE to MCRO.L.
Prerequisites
● The delivery unit HANA_XS_DBUTILS contains the XS procedures library. The content is available in the
package sap.hana.xs.libs.dbutils.
● Create a new (or use an existing) stored procedure.
This tutorial refers to the stored procedure get_product_sales_price, which is included in the
demonstration content provided with the SAP HANA Interactive Education (SHINE) delivery unit (DU). The
SHINE DU is available for download in the SAP Software Download Center.
Note
Access to the SAP Software Download Center is only available to SAP customers and requires logon
credentials.
Context
You can call stored procedures by using the contents of the XS Procedures library as if they were JavaScript
functions. For example, the library allows you to pass arguments as a JavaScript object to a stored procedure.
Procedure
2. Specify a schema where temporary tables can be created and filled with the values that are passed as
arguments to the stored procedure.
XS procedures use temporary tables to pass table-valued parameters. As a user of XS procedures you
must specify the name of a schema where these temporary tables reside, for example, a user's own
schema.
Note
The application code using XS procedures must ensure that the necessary privileges have been
granted to enable the creation and update of (and selection from) temporary tables in the specified
schema.
XSProc.setTempSchema($.session.getUsername().toUpperCase());
Example
Call stored SAP HANA procedures from XS server-side JavaScript (XSJS) and process the results of the calls in
JavaScript.
XS procedures provide a convenient way to call stored procedures in SAP HANA from XS server-side JavaScript
(XSJS) and process the results of the calls in JavaScript. The XS procedures library extends the features
already available with the SAP HANA XS JavaScript database API. Using XS procedures, SAP HANA stored
procedures can be considered as simple XS JavaScript functions for anyone developing XS JavaScript services.
For example, where an SAP HANA stored procedure uses a table as input parameter and a table as output, XS
Procedures use JavaScript objects (or an array of objects) which can be passed to the procedure. Similarly, the
result of the procedure call is provided as an array of JavaScript objects. You declare a stored procedure as an
XS JavaScript function and then call the stored procedure as if it were a JavaScript function delivering a
JavaScript object.
To use a stored procedure as an XS JavaScript function, the following steps are required:
Restriction
XSProc is intended only for use with database connections made with the old $.db API. It is not recommended to use XSProc with $.hdb connections. For $.hdb connections, use $hdb.loadProcedure instead.
2. Specify a schema for temporary tables
Temporary tables are used to store the JavaScript arguments provided for the function.
3. Import the procedure
Create the XS JavaScript functions, which can later be used to call the stored SAP HANA procedure. You can define functions which map your call arguments to the parameters of the stored procedure.
4. Call the procedure
Use the imported procedure in the same way as any normal JavaScript function, for example, using JavaScript object argument lists.
Restriction
XSProc is intended only for use with database connections made with the old $.db API. For $.hdb connections, use $hdb.loadProcedure instead.
Use Arguments that Reference an Existing Table [page 399]
(Optional) Write the results of a procedure call into a physical table and pass the table as an argument rather than a JavaScript object.
Use Table-Valued Arguments [page 400]
(Optional) Call a procedure with arguments stored as values in a table.
If you want to pass a table as an argument rather than a JavaScript object, you must specify the name of the
table (as a string) in the call statement as well as the name of the schema where the table is located. The
following example shows how to reference the table rating_table.
getRating('schema.rating_table', 3);
The SAP HANA database enables you to materialize the results of a procedure call; that is, to write the results
into a physical table using the WITH OVERVIEW expression. In the WITH OVERVIEW expression, you pass a
string value to the output parameter position that contains the result you want to materialize. The value
The WITH OVERVIEW expression also allows you to write the results of a procedure into a global temporary
table; that is, a table that is truncated at session close. To use XS Procedures to write the results of a procedure
into a global temporary table, you do not specify a name for the result table; you include an empty string (''),
as illustrated in the following example:
The returned reference points to a global temporary table which can be queried for the procedure results with
the same connection.
Note
To ensure access to the global temporary table, it is necessary to specify the connection object conn.
XS Procedures enables you to call procedures with arguments stored as values in a table, as illustrated in the
following example. Table-valued input arguments are passed using a JavaScript array that corresponds to the
rows of the table to pass. These row objects must contain properties that correspond to the name of the
columns. Skipped columns are filled with NULL, and properties that do not correspond to an identically named
column are ignored.
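The column-mapping behavior described above can be illustrated with a small standalone function. This is a sketch of the described semantics, not the XS Procedures library's actual code; rowsToTable is a hypothetical helper:

```javascript
// Sketch of how table-valued arguments map row objects to table columns,
// mirroring the behavior described for the XS Procedures library:
// skipped columns become null, and properties without an identically
// named column are ignored. rowsToTable is a hypothetical helper.
function rowsToTable(columns, rows) {
    return rows.map(function (row) {
        return columns.map(function (col) {
            return row.hasOwnProperty(col) ? row[col] : null;
        });
    });
}

var table = rowsToTable(
    ["ID", "NAME", "PRICE"],
    [
        { ID: 1, NAME: "Notebook" },          // PRICE skipped -> null
        { ID: 2, PRICE: 9.99, COLOR: "red" }  // COLOR has no column -> ignored
    ]
);
// table: [[1, "Notebook", null], [2, null, 9.99]]
```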
You can use the SAP HANA XS Data Services (XSDS) library to query CDS entities as if they were JavaScript
objects.
Prerequisites
This tutorial refers to CDS models that are included in the demonstration content provided with the SAP HANA
Interactive Education (SHINE) delivery unit (DU). The SHINE DU is available for download in the SAP Software
Download Center.
Note
Access to the SAP Software Download Center is only available to SAP customers and requires logon
credentials.
Context
XS Data Service queries are used to build incrementally advanced queries against data models that are defined
with Core Data Services (CDS). Query results are arrays of nested JSON objects that correspond to instances of CDS
entities and their associations.
Procedure
In addition to the basic CDS definition, the code in the example above shows how to extend the definition of
soHeader by an explicit association called items. This is done by using the keyword $association
together with the referenced entity (soItem) and the type of the association. In this case, $viaBacklink is used as the type; that is, the items of soHeader stored in soItem have a foreign key SALESORDERID referencing the key of the soHeader table.
3. Add a query.
A general query related to an entity is built by calling the $query() method of the entity constructor.
qOrders = qOrders.$limit(3);
result contains an array of unmanaged values, each of which represents a row of the Post entity.
Note
Refining a query does not execute it; you must call $execute to send the query to the database.
The list of projected fields is a JavaScript object, where desired fields are marked by either true or a String
literal such as "TotalNet" denoting an alias name. The query illustrated in the example above would
return the following result.
[{
"SALESORDERID": "0500000236",
"TotalNet": 273.9,
"items": {
"NETAMOUNT": 29.9
}
}, {
The actual database query automatically JOINs all required tables based on the associations involved. In
the example above, the generated SQL looks like the following:
Note
In the following code example, the table names are abbreviated to improve readability.
SELECT "t0"."SALESORDERID" AS
"t0.SALESORDERID",
"t0"."NETAMOUNT" AS "t0.NETAMOUNT",
"t0.items"."NETAMOUNT" AS "t0.items.NETAMOUNT"
FROM "Header" "t0"
LEFT OUTER JOIN "Item" "t0.items"
ON "t0"."SALESORDERID"="t0.items"."SALESORDERID"
LIMIT 10
var qSelectedOrders = qOrderAndItemTitles.$where(
    soHeader.items.NETAMOUNT.$div(soHeader.NETAMOUNT).$gt(0.5));
References to fields and associations such as items are available as properties of the entity constructor
function, for example, soHeader.items. As in the case with projections, XSDS generates all required
JOINs for associations referenced by the conditions automatically, even if they are not part of the current
projection. To build more complex expressions in $where, see the SAP HANA XS Data Services JavaScript
API Reference.
8. Refine the query conditions to a specific matching pattern.
With the $matching() method you can specify conditional expressions using the JSON-like syntax of the
$find() and $findAll() methods. The following code example shows how to further refine the selection returned by the result set, for example, to accept only those items with the currency EUR and a quantity greater than 2.
qSelectedOrders = qSelectedOrders.$matching({
items: {
CURRENCY: 'EUR',
QUANTITY: {
$gt: 2
}
}
});
Unlike $findAll(), $matching() returns an unmanaged plain value and ignores all unpersisted changes to any entity instances.
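The kind of predicate that $matching() evaluates can be sketched as a small standalone matcher. The matches function below is a hypothetical illustration of the JSON-like condition syntax, not XSDS library code; it supports only literal equality and the $gt operator used in the example above:

```javascript
// Hypothetical sketch of the JSON-like condition syntax used by
// $matching(), $find(), and $findAll(). Supports literal equality,
// the $gt operator, and nested objects for associations. This is an
// illustration only, not the XSDS implementation.
function matches(row, pattern) {
    return Object.keys(pattern).every(function (key) {
        var cond = pattern[key];
        if (cond !== null && typeof cond === "object") {
            if (cond.hasOwnProperty("$gt")) {
                return row[key] > cond.$gt;   // comparison operator
            }
            return matches(row[key], cond);   // nested association
        }
        return row[key] === cond;             // literal equality
    });
}

var item = { CURRENCY: "EUR", QUANTITY: 3 };
matches(item, { CURRENCY: "EUR", QUANTITY: { $gt: 2 } }); // true
matches(item, { CURRENCY: "USD" });                       // false
```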
qSelectedOrders = qSelectedOrders.$addFields({
    "DaysAgo": soHeader.items.DELIVERYDATE.$prefixOp("DAYS_BETWEEN", new Date())
});
Note
This query refers to the SQL function DAYS_BETWEEN, which is not a pre-defined function in XSDS.
Instead, you can use the generic operator $prefixOp, which can be used for any SQL function f, for
example, with the syntax f(arg1, … argN).
Tip
In SQL terms, the $aggregate() operator creates a GROUP BY expression for the specified paths and
automatically projects the result.
If you need to use a more restrictive projection, you can replace true with false in the $aggregate call,
as illustrated in the following example, which removes the sales order IDs for the result set.
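As a plain-JavaScript illustration of what the generated GROUP BY does, the hypothetical helper below sums a value field per grouping key; it is not XSDS code, only a sketch of the aggregation the database performs for $aggregate():

```javascript
// Sketch of GROUP BY with a SUM aggregate, mirroring what $aggregate()
// asks the database to do. groupSum is a hypothetical helper, not part
// of the XSDS library.
function groupSum(rows, keyField, valueField) {
    var groups = {};
    rows.forEach(function (row) {
        var key = row[keyField];
        groups[key] = (groups[key] || 0) + row[valueField];
    });
    return groups;
}

var items = [
    { SALESORDERID: "A", NETAMOUNT: 10 },
    { SALESORDERID: "A", NETAMOUNT: 5 },
    { SALESORDERID: "B", NETAMOUNT: 7 }
];
groupSum(items, "SALESORDERID", "NETAMOUNT"); // { A: 15, B: 7 }
```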
You can use the XS Data Services (XSDS) library to update CDS entities as if they were JavaScript objects.
Prerequisites
This tutorial refers to CDS models that are included in the demonstration content provided with the SAP HANA
Interactive Education (SHINE) delivery unit (DU). The SHINE DU is available for download in the SAP Software
Download Center.
Note
Access to the SAP Software Download Center is only available to SAP customers and requires logon
credentials.
Context
For read-write scenarios, SAP HANA XS Data Services (XSDS) offer a managed mode with automatic entity
management and additional consistency guarantees. Managed mode shares CDS imports and transaction
handling with unmanaged mode but uses a different set of methods provided by the entity constructors.
Procedure
1. Import the XSDS library and the CDS entities into your application.
In your entity import, specify an SAP HANA sequence that is used to generate the required keys.
order.CURRENCY = "USD";
order.HISTORY.CHANGEDAT = new Date();
order.$save();
order.$discard();
To control how associations are being followed, declare “lazy” associations during the import operation, as
shown in the following example:
// disable auto-commit
XSDS.Transaction.$setAutoCommit(false);
var order = SOHeader.$get({ SALESORDERID: "0500000236" });
order.CURRENCY = "JPY";
order.$save(); // persist update
XSDS.Transaction.$commit(); // commit change
order.CURRENCY = "EUR";
order.$save(); // persist update
order.HISTORY.CHANGEDAT = new Date();
order.$save(); // persist update
XSDS.Transaction.$rollback(); // database rollback
// order #0500000236 now has currency JPY again
In SAP HANA Extended Application Services (SAP HANA XS), you use the SQL-connection configuration file to
configure a connection to the database; the connection enables the execution of SQL statements from inside a
server-side JavaScript application with credentials that are different to the credentials of the requesting user.
In cases where it is necessary to execute SQL statements from inside your server-side JavaScript application
with credentials that are different to the credentials of the requesting user, SAP HANA XS enables you to define
and use a specific configuration for individual SQL connections. Each connection configuration has a unique
name, for example, Registration or AdminConn, which is generated from the name of the corresponding
connection-configuration file (Registration.xssqlcc or AdminConn.xssqlcc) on activation in the
repository. The administrator can assign specific, individual database users to this configuration, and you can
use the configuration name to reference the unique SQL connection configuration from inside your JavaScript
application code.
The following code example shows how to use the XS SQL connection AdminConn.xssqlcc.
function test() {
var body;
var conn;
$.response.status = $.net.http.OK;
try {
conn = $.db.getConnection("sap.hana.sqlcon::AdminConn");
var pStmt = conn.prepareStatement("select CURRENT_USER from dummy");
var rs = pStmt.executeQuery();
if (rs.next()) {
body = rs.getNString(1);
}
rs.close();
pStmt.close();
} catch (e) {
body = "Error: exception caught";
$.response.status = $.net.http.BAD_REQUEST;
}
if (conn) {
conn.close();
}
$.response.setBody( body );
}
test();
To use the SQL connection from your application during runtime, you must bind the SQL connection
configuration to a registered database user and assign the user the appropriate permissions, for example, by
assigning a pre-defined role to the user. To maintain this user mapping, SAP HANA XS provides the Web-based
SAP HANA XS Administration Tool. When the run-time status of the XSSQLCC artifact is set to active, SAP HANA generates a new auto user (with the name XSSQLCC_AUTO_USER_[...]). The new user is granted the permissions specified in a role, which can be assigned using the parameter role_for_auto_user, either in the design-time artifact or the run-time configuration.
Note
Access to the tools provided by the XS Administration Tool requires the privileges granted by one or more
specific user roles.
● sap.hana.xs.admin.roles::SQLCCViewer
Required to display the available SQL Connections and the current user mapping
● sap.hana.xs.admin.roles::SQLCCAdministrator
Required to modify details of the user mapping; the SQLCCAdministrator role includes the role
SQLCCViewer.
Troubleshooting Tips
If you are having problems implementing the XS SQL connection feature using an .xssqlcc configuration,
check the following points:
● User permissions
Make sure that you grant the necessary user the activated role (for example,
sap.hana.xs.admin.roles::SQLCCAdministrator). You can use the developer tools to grant roles (or
privileges), as follows:
Note
The granting user must have the object privilege EXECUTE on the procedure
GRANT_ACTIVATED_ROLE.
Note
If you have to authorize libxsauthenticator, you might also need to refresh the Web page in your
browser the next time you want to access .xssqlcc to display the logon dialog again.
The .xssqlcc file enables you to establish a database connection that you can use to execute SQL statements
from inside your server-side JavaScript application with credentials that are different to the credentials of the
requesting user.
Prerequisites
You have been assigned the following SAP HANA user roles:
● sap.hana.xs.admin.roles::SQLCCViewer
● sap.hana.xs.admin.roles::SQLCCAdministrator
Note
This tutorial combines tasks that are typically performed by two different roles: the application developer
and the database administrator. The developer would not normally require the privileges of the SAP HANA
administrator or those granted by the SQLCCAdministrator user role.
Context
In this tutorial, you learn how to configure an SQL connection that enables you to execute SQL statements from
inside your server-side JavaScript application with credentials that are different to the credentials of the user
requesting the XSJS service.
Procedure
{
"exposed" : true,
"authentication" : { "method" : "Form"},
"prevent_xsrf" : true
}
a. From the context menu of the testApp folder, choose New File.
b. Enter a file name, for example, AdminConn.xssqlcc, and choose Create.
Note
The SQL connection configuration file (.xssqlcc) you create must be located in the same
package as the application that references it.
4. Configure the details of the SQL connection that the XS JavaScript service will use.
a. Define the required connection details.
{
"description" : "Admin SQL connection",
"role_for_auto_user" : "com.acme.roles::JobAdministrator"
}
Tip
Replace the package path (com.acme.roles) and role name (JobAdministrator) with the
suitable ones for your case.
function test() {
var body;
var conn;
$.response.status = $.net.http.OK;
try {
conn = $.db.getConnection("testApp::AdminConn");
var pStmt = conn.prepareStatement("select CURRENT_USER from dummy");
var rs = pStmt.executeQuery();
if (rs.next()) {
body = rs.getNString(1);
}
rs.close();
pStmt.close();
} catch (e) {
body = "Error: exception caught";
$.response.status = $.net.http.BAD_REQUEST;
}
if (conn) {
conn.close();
}
$.response.setBody( body );
}
To start the SAP HANA XS Administration Tool, select the AdminConn.xssqlcc file and choose
(Maintain Details) in the toolbar.
The details of the XS SQL configuration connection are displayed.
8. Set the runtime status of the XS SQL connection configuration.
You must change the runtime status of the XS SQL connection configuration to Active. This runtime status
can only be changed by an administrator. When the runtime status of the XSSQL connection configuration
is set to active, SAP HANA automatically generates a new user (XSSQLCC_AUTO_USER_[...]) for the
XSSQL connection configuration object and assigns the role defined in role_for_auto_user to the new
auto-generated user.
Related Information
The SQL-connection configuration file specifies the details of a connection to the database that enables the
execution of SQL statements from inside a server-side (XS) JavaScript application with credentials that are
different to the credentials of the requesting user.
If you want to create an SQL connection configuration, you must create the configuration as a flat file and save
the file with the suffix .xssqlcc, for example, MYSQLconnection.xssqlcc. The new configuration file must
be located in the same package as the application that references it.
Note
An SQL connection configuration can only be accessed from an SAP HANA XS JavaScript application
(.xsjs) file that is in the same package as the SQL connection configuration itself. Neither subpackages
nor sibling packages are allowed to access an SQL connection configuration.
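The package-scope rule in the note above (same package only, neither subpackages nor sibling packages) can be expressed as a small check. canAccessSqlcc is purely illustrative and not part of the SAP HANA XS runtime:

```javascript
// Illustrative check of the rule that an .xsjs file may only reference
// an .xssqlcc artifact located in its own package. canAccessSqlcc is a
// hypothetical helper, not part of the SAP HANA XS runtime.
function canAccessSqlcc(xsjsPackage, sqlccPackage) {
    return xsjsPackage === sqlccPackage; // exact package match only
}

canAccessSqlcc("acme.app", "acme.app");     // true: same package
canAccessSqlcc("acme.app.sub", "acme.app"); // false: subpackage
canAccessSqlcc("acme.other", "acme.app");   // false: sibling package
```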
The following example shows the composition and structure of a configuration file AdminConn.xssqlcc for an
SAP HANA XS SQL connection called AdminConn. On activation of the SQL connection configuration file
AdminConn.xssqlcc (for example, in the package sap.hana.sqlcon), an SQL connection configuration
with the name sap.hana.sqlcon::AdminConn is created, which can be referenced in your JavaScript
application. In the xssqlcc artifact, you can set the following values:
● description
A short description of the scope of the XS SQL connection configuration
sap.hana.sqlcon::AdminConn.xssqlcc
{
"description" : "Admin SQL connection",
"role_for_auto_user" : "com.acme.roles::JobAdministrator"
}
The run-time status of an XSSQL connection configuration is inactive by default; the run-time status can only
be activated by an SAP HANA user with administrator privileges, for example, using the SAP HANA XS
Administration Tools. When the run-time status of the XSSQLCC artifact is set to active, SAP HANA generates a
new auto user (with the name XSSQLCC_AUTO_USER_[...]) and assigns the role defined in
role_for_auto_user to the new auto-generated user.
Tip
In the SAP HANA XS Administration Tools, it is possible to view and edit both the user's parameters and the role's definition.
To create a preconfigured SQL connection using the configuration object AdminConn, for example, from inside
your JavaScript application code, you reference the object using the object name and full package path, as
illustrated in the following code example.
{
conn = $.db.getConnection("sap.hana.sqlcon::AdminConn");
}
Related Information
The XS SQL connection-configuration file .xssqlcc uses pairs of keywords and values to define the SQL
connection.
Example:
The XS SQL Connection Configuration (.xssqlcc) File
Code Syntax
{
"description" : "Admin SQL connection",
"role_for_auto_user" : "com.acme.roles::JobAdministrator"
}
description
Sample Code
role_for_auto_user
The name of (and package path to) the role to be assigned to the new user that is automatically generated on activation of the XSSQL connection-configuration artifact.
Sample Code
"role_for_auto_user" : "com.acme.roles::JobAdministrator"
Activating the design-time XSSQL connection configuration generates a run-time object whose status is
“inactive” by default; the run-time status must be set to active by an SAP HANA user with administrator
privileges, for example, using the SAP HANA XS Administration Tools. When the run-time status of the
XSSQLCC artifact is set to active, SAP HANA generates a new auto user (with the name
XSSQLCC_AUTO_USER_[...]) and assigns the role defined in role_for_auto_user to the new auto-
generated user.
HTTP requests can define the language used for communication in the HTTP header Accept-Language. This
header contains a prioritized list of languages (defined in the browser) that a user is willing to accept. SAP
HANA XS uses the language with the highest priority to set the language for the requested connection. The
language setting is passed to the database as the language to be used for the database connection, too.
In server-side JavaScript, the session object's language property enables you to define the language an
application should use for a requested connection. For example, your client JavaScript code could include the
following string:
Note
Use the language-code format specified in BCP 47 to set the session language, for example: “en-US” (US
English), “de-AT” (Austrian German), “fr-CA” (Canadian French).
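The priority handling described above can be sketched with a small parser for the Accept-Language header. pickLanguage is a hypothetical helper shown only to illustrate the selection; SAP HANA XS performs this internally when it sets the session language:

```javascript
// Sketch: pick the highest-priority language tag from an Accept-Language
// header using its q-values (q defaults to 1.0 when omitted).
// pickLanguage is a hypothetical helper; SAP HANA XS does this selection
// itself when establishing the session language.
function pickLanguage(header) {
    var best = null;
    header.split(",").forEach(function (entry) {
        var parts = entry.trim().split(";");
        var tag = parts[0];
        var q = 1.0;
        parts.slice(1).forEach(function (p) {
            var m = p.trim().match(/^q=([0-9.]+)$/);
            if (m) { q = parseFloat(m[1]); }
        });
        if (best === null || q > best.q) {
            best = { tag: tag, q: q };
        }
    });
    return best && best.tag;
}

pickLanguage("de-AT,de;q=0.9,en-US;q=0.8"); // "de-AT"
```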
As a client-side framework running in the JavaScript sandbox, the SAP UI5 library is not aware of the Accept-Language header in the HTTP request. Since the current language setting for SAPUI5 is not necessarily the same as the language specified in the SAP HANA XS server-side framework, SAPUI5 clients could have problems relating to text displayed in the wrong language or numbers and dates formatted incorrectly.
The application developer can inform the SAP UI5 client about the current server-side language setting, for
example, by adding an entry to the <script> tag in the SAPUI5 HTML page, as illustrated in the following
examples:
<script id="sap-ui-bootstrap"
type="text/javascript"
src="/sap/ui5/1/resources/sap-ui-core.js"
data-sap-ui-theme="sap_goldreflection"
data-sap-ui-libs="sap.ui.commons"
data-sap-ui-language="de">
</script>
<script>
window["sap-ui-config"] = {
"language" : "de"
}
</script>
[…]
<script id="sap-ui-bootstrap"
[…]
The sap-ui-config object must be created and filled before the sap-ui-bootstrap script.
It is important to understand that the session starts when a user logs on, and the specified language is
associated with the session. Although the user can start any number of applications in the session, for
example, in multiple browser tabs, it is not possible to set a different language for individual applications called in the session.
The script tag for the SAPUI5 startup can be generated on the server side, for example, using the
$.session.language property to set the data-sap-ui-language parameter. Applications that have the
SAPUI5 <script> tag in a static HTML page can use this approach, as illustrated in the following example:
<script id="sap-ui-bootstrap"
type="text/javascript"
src="/sap/ui5/1/resources/sap-ui-core.js"
data-sap-ui-theme="sap_goldreflection"
data-sap-ui-libs="sap.ui.commons"
data-sap-ui-language="$UI5_LANGUAGE$">
</script>
The called XSJS page can be instructed to replace the $UI5_LANGUAGE$ parameter, for example, with the
value stored in $.session.language when loading the static HTML page.
You can include an HTTP call in the static HTML page to fetch the correct language from the server using some
server-side JavaScript code, as illustrated in the following example:
<script>
var xmlHttp = new XMLHttpRequest();
xmlHttp.open( "GET", "getAcceptLanguage.xsjs", false );
xmlHttp.send( null );
window["sap-ui-config"] = {
"language" : xmlHttp.getResponseHeader("Content-Language")
}
</script>
<script id="sap-ui-bootstrap"
…
</script>
This approach requires an XSJS artifact (for example, getAcceptLanguage.xsjs) that responds to the AJAX
call with the requested language setting, as illustrated in the following example:
$.response.contentType = "text/plain";
$.response.headers.set("Content-Language", $.session.language);
$.response.setBody("");
Scheduled jobs define recurring tasks that run in the background. The JavaScript API $.jobs allows
developers to add and remove schedules from such jobs.
If you want to define a recurring task, one that runs at a scheduled interval, you can specify details of the job in
a .xsjob file. The time schedule is configured using cron-like syntax. You can use the job defined in an .xsjob file to run an XS JavaScript function or an SQLScript procedure at regular intervals. To create and enable a recurring task using the xsjob feature, you perform the following high-level tasks:
Note
The tasks required to set up a scheduled job in SAP HANA XS are performed by two distinct user roles: the
application developer and the SAP HANA administrator. In addition, to maintain details of an XS job in the
SAP HANA XS Administration Tool, the administrator user requires the privileges granted by the role
template sap.hana.xs.admin.roles::JobAdministrator.
1. Create the function or script you want to run at regular intervals (application developer; text editor).
2. Create the job file (.xsjob) that defines details of the recurring task (application developer; text editor).
3. Maintain the corresponding runtime configuration for the xsjob (SAP HANA administrator; XS Job Dashboard).
4. Enable the job-scheduling feature in SAP HANA XS (SAP HANA administrator; XS Job Dashboard).
5. Check the job logs to ensure the job is running according to schedule (SAP HANA administrator; XS Job Dashboard).
Related Information
The xsjob file enables you to run a service (for example, an XS JavaScript or an SQLScript) at a scheduled
interval.
Prerequisites
● You have the privileges granted by the SAP HANA user role sap.hana.xs.admin.roles::JobAdministrator.
● You have the privileges granted by the SAP HANA user role
sap.hana.xs.admin.roles::HTTPDestAdministrator.
Note
This tutorial combines tasks that are typically performed by two different roles: the application developer
and the database administrator. The developer would not normally require the privileges granted to the
JobAdministrator user role, the sap.hana.xs.admin.roles::HTTPDestAdministrator user role, or the SAP
HANA administrator.
Context
In this tutorial, you learn how to schedule a job that triggers an XS JavaScript application that reads the latest
value of a share price from a public financial service available on the Internet. You also see how to check that
the XS job is working and running on schedule.
To do this, you create a root application package called yahoo, which contains the following artifacts:
/yahoo/
.xsapp // application descriptor
.xsaccess // application access file
yahoo.xsjob // job schedule definition
yahoo.xshttpdest // HTTP destination details
yahoo.xsjs // Script to run on schedule
Procedure
a. From the context menu of the yahoo folder, choose New File, enter the file name yahoo.xsjs (remember to use the .xsjs extension), and choose Create.
b. Add the application code.
The XS JavaScript code shown in the following example connects to a public financial service on the
Internet to check and download the latest prices for stocks and shares:
function readStock(input) {
var stock = input.stock;
host = "download.finance.yahoo.com";
port = 80;
Note
If you use a proxy, set proxyType to http and enter your proxy host and port:
proxyType = http;
proxyHost = "<your_proxy_host>";
proxyPort = <your_proxy_port>;
{
"description": "Read stock value",
"action": "yahoo:yahoo.xsjs::readStock",
"schedules": [
{
"description": "Read current stock value",
"xscron": "* * * * * * 59",
"parameter": {
"stock": "SAP.DE"
}
}
]
}
a. Select the yahoo.xsjob file and choose (Maintain Details) in the toolbar.
The XS Job Dashboard opens.
b. Switch to the Configuration tab to maintain the details of the XS job:
○ User
The user account in which the job runs, for example, SYSTEM.
○ Password
The password of the user whose account is used to run the job.
○ Locale
The language encoding required for the locale in which the job runs, for example, en_US.
○ Start/Stop time
An optional value to set the period of time during which the job runs. Enter the values using the
syntax used for the SAP HANA data type LocalDate and LocalTime, for example, 2014-11-05
00:30:00 (thirty minutes past midnight on the 5th of November 2014).
○ Active
Enable or disable the job schedule
c. Save the job.
The changes to the job schedule are activated.
7. Enable the job-scheduling feature in SAP HANA XS.
This step requires the permissions granted to the SAP HANA administrator.
Note
It is not possible to enable the scheduler for more than one host in a distributed SAP HANA XS
landscape.
In the XS Job Dashboard set the Scheduler Enabled toggle button to YES.
Toggling the setting for the Scheduler Enabled button changes the value of the enabled parameter in the scheduler section of the xsengine.ini configuration file, which is set in the Configuration tab of the SAP HANA studio's Administration perspective. If the scheduler section is not already there, create it, add the new parameter enabled, and assign it the value true.
The .xsjob file defines the details of a task that you want to run (for example, an XS JavaScript or an
SQLScript) at a scheduled interval.
The XS job file uses a cron-like syntax to define the schedule at which the service defined in an XS JavaScript
or SQLScript must run, as you can see in the following example, which runs the specified job (the stock-price checking service yahoo.xsjs) at the 59th second of every minute.
{
"description": "Read stock value",
"action": "yahoo:yahoo.xsjs::readStock",
"schedules": [
{
"description": "Read current stock value",
"xscron": "* * * * * * 59",
"parameter": {
"stock": "SAP.DE"
}
}
]
}
When defining the job schedule in the xsjob file, pay particular attention to the entries for the following
keywords:
● action
Text string used to specify the path to the function to be called as part of the job.
"action": "<package_path>:<XSJS_Service>.xsjs::<FunctionName>",
● description
Text string used to provide context when the XSjob file is displayed in the SAP HANA XS Administration
tool.
● xscron
The schedule for the specified task (defined in the “action” keyword); the schedule is defined using
cron-like syntax.
● parameter
A value to be used during the action operation. In this example, the parameter is the name of the stock
SAP.DE provided as an input for the parameter (stock) defined in the readStock function triggered by
the xsjob action. You can add as many parameters as you like as long as they are mapped to a parameter
in the function itself.
● 2013 * * fri 12 0 0
Every Friday of 2013 at 12:00 hours
● * * 3:-2 * 12:14 0 0
Every hour between 12:00 and 14:00 hours on every day of the month between the third day of the month
and the second-last day.
Tip
In the day field, third from the left, you can use a negative value to count days backwards from the end
of the month. For example, * * -3 * 9 0 0 means: three days from the end of every month at
09:00.
● * * * * * */5 *
Every five minutes (*/5) and at any point (*) within the specified minute.
Note
Using the asterisk (*) as a wild card in the seconds field can lead to unexpected consequences if the scheduled job takes less than 59 seconds to complete; namely, the scheduled job restarts on completion. If the scheduled job is very short (for example, 10 seconds long), it restarts repeatedly until the specified minute ends.
To prevent short-running jobs from restarting on completion, schedule the job to start at a specific second
in the minute. For example, * * * * * */5 20 indicates that the scheduled job should run every five
minutes and, in addition, at the 20th second in the specified minute.
● * * * -1.sun 9 0 0
Every last Sunday of a month at 09:00 hours
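The field syntax used in these examples (wildcards, a:b ranges, */n steps, and negative day-of-month values counted back from the end of the month) can be sketched as a standalone matcher. fieldMatches is a hypothetical illustration of the xscron semantics described above, not SAP HANA's scheduler code:

```javascript
// Sketch of matching a single xscron field against a concrete value.
// Supports "*", "*/n" steps, "a:b" ranges, and negative day-of-month
// values counted from the end of the month (daysInMonth is needed for
// that case). fieldMatches is illustrative only.
function fieldMatches(field, value, daysInMonth) {
    if (field === "*") { return true; }
    var step = field.match(/^\*\/(\d+)$/);
    if (step) { return value % parseInt(step[1], 10) === 0; }
    var range = field.match(/^(-?\d+):(-?\d+)$/);
    if (range) {
        var lo = parseInt(range[1], 10);
        var hi = parseInt(range[2], 10);
        if (lo < 0) { lo = daysInMonth + 1 + lo; } // -2 -> second-last day
        if (hi < 0) { hi = daysInMonth + 1 + hi; }
        return value >= lo && value <= hi;
    }
    var n = parseInt(field, 10);
    if (n < 0) { n = daysInMonth + 1 + n; }       // -3 -> three days from end
    return value === n;
}

fieldMatches("*/5", 10);        // true: every five minutes
fieldMatches("3:-2", 29, 30);   // true: day 3 to second-last day (30-day month)
fieldMatches("-3", 28, 30);     // true: three days from the end of the month
```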
Related Information
The XS Job (.xsjob) File
The XS job file .xsjob uses a number of keywords to define the job that must be run at a scheduled interval.
Example:
{
"description": "Read stock value",
"action": "yahoo:yahoo.xsjs::readStock",
"schedules": [
{
"description": "Read current stock value",
"signature_version": 1,
"xscron": "* * * * * * 59",
"parameter": {
"stock": "SAP.DE"
}
}
]
}
description
{
"description": "Read stock value",
}
The description keyword enables you to define a text string that provides context when the XS job is displayed
for maintenance in the SAP HANA XS Administration Tool. The text string is used to populate the Description
field in the SCHEDULED JOB tab.
action
{
"action": "myapps.finance.yahoo:yahoo.xsjs::readStock",
}
The action keyword enables you to define the function to run as part of the XS job, for example, an XS
JavaScript function or an SQLScript procedure. The following syntax is required: "action":
"<package.path>:<XSJS_Service>.xsjs::<functionName>".
If you want to use the action to call an SQLScript procedure, replace the name of the XSJS service in the
example with the corresponding SQLScript name.
schedules
{
"schedules": [
{
"description": "Read current stock value",
"xscron": "* * * * * * 59",
"parameter": {
"stock": "SAP.DE"
}
}
]
}
The schedules keyword enables you to define the details of the XS job you want to run. Use the following
additional keywords to provide the required information:
● description (optional)
Short text string to provide context
● xscron
Uses cron-like syntax to define the schedule at which the job runs
● parameter (optional)
Defines any values to be used as input parameters by the (XSJS or SQLScript) function called as part of
the job
signature_version
{
"signature_version": 1,
}
The signature_version keyword enables you to manage the version “signature” of an XS job. You change the XS
job version if, for example, the parameter signature of the job action changes; that is, an XS job accepts more
(or fewer) parameters, or the types of parameters differ compared with a previous version of the XS job. On
activation in the SAP HANA Repository, the signature of an XS job is compared to the previous one and, if the
job’s signature has changed, any job schedules created at runtime are deactivated.
Note
Deactivation of any associated runtime job schedules prevents the schedules from silently failing (no
information provided) and enables you to adjust the parameters and reactivate the job schedules as required.
Tip
Minor numbers (for example, 1.2) are not allowed; the job scheduler interprets “1.2” as “12”.
xscron
{
"schedules": [
{
"description": "Read current stock value",
"xscron": "* * * * * * 59",
"parameter": {
"stock": "SAP.DE"
}
}
]
}
The xscron keyword is used in combination with the schedules keyword. The xscron keyword enables you to
define the schedule at which the job runs. As the name suggests, the xscron keyword requires a cron-like
syntax.
The following table explains the order of the fields used in the “xscron” entry of the .xsjob file and lists
the permitted values in each field.
Year For example, 2013
Month 1 to 12
Day -31 to 31
DayOfWeek mon,tue,wed,thu,fri,sat,sun
Hour 0 to 23
Minute 0 to 59
Second 0 to 59
Note
Using the asterisk (*) as a wild card in the seconds field can lead to some unexpected consequences, if the
scheduled job takes less than 59 seconds to complete; namely, the scheduled job restarts on completion. If
the scheduled job is very short (for example,10 seconds long), it restarts repeatedly until the specified
minute ends.
To prevent short-running jobs from restarting on completion, schedule the job to start at a specific second in
the minute. For example, * * * * * */5 20 indicates that the scheduled job should run every five minutes
and, in addition, at the 20th second in the specified minute. The job starts at precisely 20 seconds into the
specified minute and runs only once.
Note that in any field you can specify a comma-separated list of values (a,b,c), which means “a or b or c”.
parameter
{
"schedules": [
{
"description": "Read current stock value",
"xscron": "* * * * * * 59",
"parameter": {
"stock": "SAP.DE",
"share": "BMW.DE"
}
}
]
}
The optional parameter keyword is used in combination with the schedules keyword. The parameter keyword
defines values to be used as input parameters by the XSJS function called as part of the job. You can list as
many parameters as you like, separated by a comma (,) and enclosed in JSON-compliant quotation marks ("").
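For illustration, the called function might consume the schedule's parameter values as follows. This is a hedged sketch: the function body is hypothetical, and it assumes (as in the stock tutorial above) that the scheduler passes the schedule's parameter object as the function's single argument.

```javascript
// Hypothetical sketch of the readStock function in yahoo.xsjs.
// The job scheduler passes the schedule's "parameter" object as the
// function's single argument, so input.stock is "SAP.DE" here.
function readStock(input) {
  var stock = input.stock; // value supplied by the .xsjob schedule
  // ... fetch the current price for "stock" and persist it ...
  return stock;
}
```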
Context
You can use the $.jobs.JobSchedules API to add a schedule to (or delete a schedule from) a job defined in
an .xsjob file at runtime.
Note
Schedules added at runtime are deleted when the .xsjob file is redeployed.
Note
If you have already created this XS job file, for example, in another tutorial, you can skip this step.
{
"description": "Read stock value",
"action": "yahoo:yahoo.xsjs::readStock",
"schedules": [
{
"description": "Read current stock value",
"xscron": "* * * * * * 59",
"parameter": {
"stock": "SAP.DE"
}
}
]
}
Note
Saving a file in a shared project automatically commits the saved version of the file to the repository. To
explicitly commit a file to the repository, right-click the file (or the project containing the file) and
choose Team Commit from the context-sensitive popup menu.
2. Create the XS JavaScript (.xsjs) file you want to use to define the automatic scheduling of a job at
runtime.
Name the file schedule.xsjs.
3. Use the $.jobs JavaScript API to add or delete a schedule to a job at runtime.
The following example schedule.xsjs adds a new schedule at runtime for the XS job defined in
yahoo.xsjob, but uses the parameter keyword to change the name of the stock price to be checked.
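A minimal sketch of such a schedule.xsjs follows. It assumes the $.jobs.Job constructor and its schedules.add() method of the SAP HANA XS classic $.jobs API; the helper function, the xscron value, and the response text are illustrative.

```javascript
// Sketch of schedule.xsjs: adds a runtime schedule for the job defined in
// yahoo.xsjob. "jobsApi" stands in for the XS global $.jobs so the helper
// can also be exercised outside the XS runtime.
function addRuntimeSchedule(jobsApi, stockName) {
  var job = new jobsApi.Job({ uri: "yahoo.xsjob" });
  return job.schedules.add({
    description: "Runtime schedule for " + stockName,
    xscron: "* * * * * */5 30",        // every 5 minutes, at second 30
    parameter: { stock: stockName }    // overrides the stock to be checked
  });
}

// Inside SAP HANA XS, the global $ object provides the real API:
if (typeof $ !== "undefined") {
  var scheduleId = addRuntimeSchedule($.jobs, "BMW.DE");
  $.response.setBody("Created schedule with ID " + scheduleId);
}
```

Schedules added this way can later be removed with the corresponding delete call of the same API; remember that they are deleted when the .xsjob file is redeployed.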
The SAP HANA XS server-side JavaScript API provides tracing functions that enable your application to write
predefined messages in the form of application-specific trace output in the xsengine trace files
(xsengine*.trc), according to the trace level you specify, for example, “info” (information) or “error”.
If you use the server-side JavaScript API to enable your application to write trace output, you can choose from
the following trace levels:
● debug
● info
● warning
● error
● fatal
For example, to enable debug-level tracing for your JavaScript application, include the following code:
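A minimal sketch is shown below; the message texts are illustrative, and "trace" stands in for the XS global $.trace API so the helper can also run outside the XS runtime.

```javascript
// Writes one illustrative message per available trace level.
// "trace" stands in for the XS global $.trace (server-side trace API).
function writeTraceMessages(trace) {
  trace.debug("Entering request handler");   // debug level
  trace.info("Processing started");          // info level
  trace.warning("Input value missing");      // warning level
  trace.error("Could not read stock price"); // error level
  trace.fatal("Unrecoverable failure");      // fatal level
}

// Inside SAP HANA XS, pass the real trace API:
if (typeof $ !== "undefined") {
  writeTraceMessages($.trace);
}
```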
In this tutorial, you use the tracing functions provided by the server-side JavaScript API for SAP HANA XS to
enable tracing in a JavaScript application. The application-specific trace messages are written into a trace file,
according to the trace level you specify, for example, "debug", "info", "warning", "error", or "fatal".
Prerequisites
● You have the TRACE ADMIN system privilege (required to set trace levels in the trace tool).
● You have the privileges granted by the roles sap.hana.ide.roles::EditorDeveloper and
sap.hana.ide.roles::TraceViewer; both roles are included in the parent role sap.hana.ide.roles::Developer.
You can view trace files and assign trace levels to applications using the Trace component of the SAP HANA
Web-based Development Workbench. The Web-based trace tool is available on the SAP HANA XS Web server
at the following URL: http://<WebServerHost>:80<SAPHANAinstance>/sap/hana/ide/trace
Procedure
function getUsername(){
var username = $.session.getUsername();
return username;
}
$.trace.debug("Let's say hello to my demo");
var result = "Hello World from User " + getUsername();
$.response.setBody(result);
a. In the Editor tool, select the indexUI5.html file in the demo.helloxs package and choose (Run)
in the toolbar.
b. In the Web browser, choose Call Backend.
7. View the application trace file.
In the trace tool, open the newest XS engine trace file, which you can find in the XS Engine folder.
SAP HANA XS provides an integrated debugger to enable you to debug the XS JavaScript code that you write.
Prerequisites
To use the debugging tools and features, set up your debugging environment as follows:
Tip
To enable the display of more helpful and verbose information for HTTP 500 exceptions on the SAP
HANA XS Web server, add the parameter developer_mode to the xsengine.ini httpserver
section and set it to true.
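The resulting entry in xsengine.ini might look like the following sketch; in practice the parameter is typically maintained with the standard SAP HANA configuration tools rather than by editing the file directly.

```ini
[httpserver]
developer_mode = true
```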
Note
By default, other users do not have the permissions required to access your XS JavaScript debugging
sessions. However, you can grant a user global access to any of your debug sessions or grant access to
a debug session that is flagged with a specified token. You can also restrict access to a debug session
to a specified period of time.
Related Information
In this tutorial, you use the SAP HANA Web-based Development Workbench Editor to create and debug a
server-side JavaScript application. The application displays a browser window where you can enter two values
in URL parameters and display the results immediately in the browser window.
Prerequisites
● You have the privileges granted by the role sap.hana.ide.roles::EditorDeveloper; this role is included in the
parent role sap.hana.ide.roles::Developer.
● You have been assigned the user role sap.hana.xs.debugger::Debugger.
● Your SAP HANA administrator has enabled debugging in the SAP HANA system.
Procedure
b. Close the dialog box and choose (Resume (F10)) to finish the debugger session.
c. Switch to the application window to confirm that the change you made is reflected in the displayed
result.
Related Information
The SAP HANA Web-based Development Workbench provides an integrated debugger, which makes available
the tools you need to debug server-side JavaScript code.
The debug tools allow you to perform standard debugging tasks, such as adding breakpoints to the code and
starting, stepping through, and resuming code execution. You can also inspect variables and check the validity
of expressions.
Breakpoints
You set breakpoints in the source code by clicking the line number of the statement you want to stop at. You
can set as many breakpoints as you want. A red arrow in the line number column shows each breakpoint that
has been set:
In the Debugger panel, the Breakpoints pane displays a list of the breakpoints set in the source file you are
currently debugging. You can remove breakpoints by deleting them from the list.
Code Execution
To run an application in debug mode, set a breakpoint on the line you want to start debugging from and then
choose (Run). The debugger stops at the defined breakpoint and the Debugger panel opens to the right of
the source code.
A blue arrow inside the red arrow tag shows that a breakpoint has been reached:
Note
In the source code, a blue arrow in the line number column shows the next code line that will be executed.
You can execute the script using the buttons located in the toolbar at the top of the Debugger panel:
(Step in) Steps through the code line-by-line and, whenever a function is called, steps into the body of
that function
(Step over) Steps through the code line-by-line but, if a function is called, does not step into the body of
that function
Callstack
In the Debugger panel, the Callstack pane shows the line number of the current code line where execution has
been suspended. When stepping through the code, the call stack allows you to see the path that is taken when
the XS JavaScript file is executed. If a function is called, an entry is added to the top of the call stack and
removed when the function has finished. The debugger then continues from the position given at the top of the
stack.
Watch Area
The Watch Area pane allows you to inspect the current runtime values of existing variables when execution of
the code is suspended at breakpoints or is being stepped through. You can drill down within the tree to view the
values of sub-elements.
In the expression evaluation dialog box (open the dialog box by choosing (Evaluate expression) on the
Debugger toolbar), you can write your own statements to check the validity of expressions and manipulate
variable values.
The JavaScript debugger included with SAP HANA Extended Application Services (SAP HANA XS) requires
user authentication to start a debug session. SAP HANA XS includes a dedicated debugger role, which defines
the permissions needed by a developer who wants to debug server-side JavaScript code.
Debugging application code is an essential part of the application-development process. SAP HANA Extended
Application Services (SAP HANA XS) provides integrated debugger functionality and a dedicated debugger role
that must be assigned to any developer who wants to debug XS JavaScript. The debugging role is named
sap.hana.xs.debugger::Debugger and can be assigned to a user with the standard role-assignment feature
included in the SAP HANA Web-based Development Workbench security tool.
Since developers primarily need to debug their own HTTP calls, the following limitations apply to a debug
session:
Note
It is also possible to use SSL for debugging. If SSL is configured and the Web-socket connect call is sent by
plain HTTP, the server redirects the call to the corresponding SSL (secure HTTP) URL.
Related Information
You can grant other developers access to the debug sessions you use for debugging server-side JavaScript on
SAP HANA XS.
By default, other users are not allowed to access your XSJS debugging sessions. However, SAP HANA XS
provides a tool that enables you to grant access to your debugging sessions to other users, too.
Note
You can grant a user global access to any of your sessions or grant access to a session that is flagged with a
specified token. You can also restrict access to a debug session to a specified period of time.
The XS Debugging tool is available on the SAP HANA XS Web server at the following URL:
http://<SAPHANAWebServer>:80<SAPHANAinstance>/sap/hana/xs/debugger/.
When you grant access to your debugging session, the following options are available:
● User Name
The name of the database user who requires access to your debug session
● Privilege Expires
The point in time that marks the end of the period for which access to one or more debug sessions is
allowed.
● grant debug permission for all sessions
You can grant a user global access to any of your debug sessions.
Restriction
The user you grant access to must already be registered and authenticated in the SAP HANA database.
Restriction
The following rules apply to access to debug sessions flagged with a token:
○ The session used for granting access to the debug sessions is flagged automatically.
○ The session token is distributed by means of a session cookie; the cookie is inherited by any session
created with the current browser session.
● Session Name
A freely definable name that can be used to distinguish your debug session in the context of multiple
sessions.
Related Information
The test framework SAP HANA XSUnit (XSUnit) is a custom version of the open-source JavaScript test
framework, Jasmine, adapted for use with SAP HANA XS. You can use the XSUnit test framework to automate
the tests that you want to run for SAP HANA XS applications, for example, to test the following elements:
Prerequisites
To use the tools and features provided with the XSUnit test framework, set up your test environment as follows:
Note
To import a delivery unit into an SAP HANA system, you require the REPO.IMPORT privilege, which is
normally granted only to the system administrator.
sap.hana.testtools.common::TestExecute Enables you to view the persisted test results produced by the XSUnit test
framework and to execute the examples included in the demonstration package (sap.hana.testtools.demo).
sap.hana.xs.debugger::Debugger Enables you to debug your server-side JavaScript (test) code
sap.hana.xs.ide.roles::Developer Enables you to view source files in the SAP HANA Web-based Development
Workbench
Note
You must ensure that _SYS_REPO has select permission on the schema where the tables are located
(for example, either your user schema or the test schema).
Related Information
XSUnit is an integrated test environment that enables you to set up automatic tests for SAP HANA XS
applications.
People developing applications in the context of the SAP HANA database need to understand how to
implement a test-automation strategy. Especially for new applications which are designed to work exclusively
with SAP HANA, it is a good idea to consider the adoption of best practices and tools.
If you want to develop content that is designed to run specifically in SAP HANA, it is strongly recommended to
use the XSUnit test framework that is integrated in SAP HANA XS; this is the only way to transport your tests
together with the content they test.
To write self-contained unit tests that are executable in any system, you have to test the various SAP HANA
objects in isolation. For example, an SAP HANA view typically has dependencies to other views or to database
tables; these dependencies pass data to the view that is being tested and must not be controlled or overwritten
by the test. For this reason, you need to be able to simulate dependencies on the tested view. XSUnit includes a
test-isolation tool that provides this functionality; it allows you to copy a table for testing purposes.
Note
Although you cannot copy a view for testing purposes, you can create a table that acts like a view.
All (or specific) dependencies on any tables or views are replaced by references to temporary tables, which can
be created, controlled, and populated with values provided by the automated test.
Test Data
Preparing and organizing test data is an important part of the process of testing SAP HANA content such as
views and procedures; specific data constellations are required that have to be stable in order to produce
reliable regression tests. In addition, test-isolation tools help reduce the scope of a test by enabling you to test
a view without worrying about dependent tables and views. Limiting the scope of a test also helps to reduce
the amount of data that needs to be prepared for the test.
Related Information
Dedicated roles enable developers to access and use the tools provided with the SAP HANA XS test framework
(XSUnit).
To grant access to the SAP HANA XS test framework that enables developers to set up automatic testing for
SAP HANA applications, the SAP HANA system administrator must ensure that the appropriate roles are
assigned. The following table lists the roles that are available; one (or more) of the listed roles must be assigned
to the application developers who want to use the XSUnit testing tools.
sap.hana.testtools.common::TestExecute Enables you to view the persisted test results produced by the XSUnit test
framework and to execute the examples included in the demonstration package (sap.hana.testtools.demo).
sap.hana.xs.ide.roles::Developer Enables you to view source files in the SAP HANA Web-based Development
Workbench (Web IDE)
Use the XSUnit tools to set up automated testing of your applications in SAP HANA XS.
Prerequisites
Context
If you want to develop content that is designed to run specifically with SAP HANA, you can use the XSUnit tools
that are integrated in SAP HANA XS. The XSUnit tools are based on a JavaScript unit test framework that uses
Jasmine as the underlying test library.
Procedure
a. Select the test package and from the context menu choose New File .
b. Enter the file name .xsapp and choose Create.
c. Select the .xsapp file and from the context menu choose Activate.
3. Create an XSUnit test.
a. Select the test package and from the context menu choose New File .
b. Enter a file name, for example, MyFirstTest.xsjslib, and choose Create.
c. Add the following content to the new XS library test file MyFirstTest.xsjslib.
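A minimal MyFirstTest.xsjslib might look like the following sketch; the suite and specification names are illustrative. Inside XSUnit, describe, it, and expect are provided by the Jasmine-based framework; the stand-in definitions at the top are only there so the snippet can also run outside the XS runtime.

```javascript
// Minimal stand-ins so the snippet also runs outside the XSUnit runtime;
// inside SAP HANA XS, describe/it/expect come from the Jasmine-based framework.
if (typeof describe === "undefined") {
  globalThis.describe = function (name, fn) { fn(); };
  globalThis.it = function (name, fn) { fn(); };
  globalThis.expect = function (actual) {
    return {
      toEqual: function (expected) {
        if (actual !== expected) {
          throw new Error(actual + " !== " + expected);
        }
      }
    };
  };
}

describe("MyFirstTest", function () {
  it("adds two numbers correctly", function () {
    expect(1 + 1).toEqual(2);
  });
});
```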
Select the MyFirstTest.xsjslib file and choose Invoke Tests in the context menu or (Run XS Unit
Test Suite) in the toolbar.
The integrated XS Unit Test Runner opens in a panel on the right and shows the progress of the test run.
Note that depending on your tests this may take some time.
5. Inspect the test results.
The test results are displayed in the form of a tree with a node for each of the following:
○ Each "describe" test specification
○ Each "it" test case
○ Each failed test case: an error message together with the stack trace
In the XS Unit Test Runner panel, choose from the following options:
○ Navigate to a code line: For failed test cases, drill down to the @ node below the error message, and
from the context menu choose Open code.
○ Generate a test report: Click Create test report to show an HTML report of the XS Unit test run in a
separate browser tab.
○ Rerun tests on save: While the XS Unit test runner is open, the tests are rerun each time the file is
saved, irrespective of which file is open in the editor.
6. Inspect the code line coverage.
To rerun the tests and collect code line coverage information, select the Track Code Coverage option.
In addition to the test results, this gives you an overview of all files affected by the test run. For each file, the
number of lines covered is shown in relation to the overall number of lines. To see exactly which code lines
were covered in a specific file, choose Show covered lines in the context menu of that file. The lines covered
at least once are shown in yellow in the editor.
The XSUnit test framework is a custom version of the JavaScript test framework Jasmine adapted to suit SAP
HANA XS.
A test specification begins with a call to the global Jasmine function describe. The describe function
defines a suite that enables you to group together related test suites and specifications. Test-suite
specifications are defined by calling the global Jasmine function it. You can group several test suites in one
test file. The following code snippet shows one test suite (introduced by the function “describe”) and two
test specifications, indicated by the function “it”.
/*jslint undef:true */
describe('testSuiteDescription', function() {
beforeOnce(function() {
// beforeOnce function is called once for all specifications
});
beforeEach(function() {
// beforeEach function is called before each specification
});
it('testSpecDescription', function() {
expect(1).toEqual(1);
});
it('anotherTestSpecDescription', function() {
expect(1).not.toEqual(0);
});
});
To enable a test suite to remove any duplicate setup and teardown code, Jasmine provides the global functions
beforeEach and afterEach. As the names imply, the beforeEach function is executed before each
specification in the enclosing suite and all sub-suites; the afterEach function is called after each
specification. Similarly, the special methods beforeOnce and afterOnce are called once per test suite.
● beforeOnce
Executed once before all specifications of the test suite
● afterOnce
Executed once after all specifications of the test suite
The XSUnit framework provides a managed database connection called jasmine.dbConnection, which is
globally available; for example, you can use it in beforeEach or afterEach functions, in other functions
defined in your test libraries, or in imported libraries.
One obvious advantage of this is that you no longer have to pass the database connection as a parameter or
define it as a global variable. The jasmine.dbConnection is opened automatically and rolled back (and
closed) when the test ends. However, if you want to persist your data, you have to call commit() on
jasmine.dbConnection manually.
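As a sketch of such a manual commit: the table name and column layout below are hypothetical, and "conn" stands in for jasmine.dbConnection so the helper can also be exercised outside the XSUnit runtime.

```javascript
// Inserts a row and commits it so the data survives the automatic rollback
// that XSUnit performs on the managed connection after each test.
// "conn" stands in for jasmine.dbConnection; table and columns are illustrative.
function insertAndPersistStock(conn, name, price) {
  var stmt = conn.prepareStatement(
    'INSERT INTO "TEST_SCHEMA"."STOCKS" VALUES (?, ?)'
  );
  stmt.setString(1, name);
  stmt.setDouble(2, price);
  stmt.execute();
  conn.commit(); // without this call, the insert is rolled back after the test
}
```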
Example syntax for the functions, assertions, and parameters required by the SAP HANA XSUnit test tools.
The following code example lists the most commonly used functions and assertions used in the XSUnit
syntax. For more information about the assertions used, for example, toBe, toBeTruthy, or toBeFalsy, see
Assertions.
The following code example lists the most commonly used assertions, shows the required syntax, and the
expected parameters.
expect(actual).toBe(expected);
expect(actual).toBeFalsy();
expect(actual).toBeTruthy();
expect(actual).toEqual(expected);
expect(actualArray).toContain(expectedItem);
expect(actual).toBeNull();
expect(actualNumber).toBeCloseTo(expectedNumber, precision);
expect(actual).toBeDefined();
expect(actual).toBeUndefined();
expect(actualString).toMatch(regExpression);
expect(actualFunction).toThrowError(expectedErrorMessage);
expect(actualFunction).toThrowError(expectedErrorType, expectedErrorMessage);
expect(actualTableDataSet).toMatchData(expected, keyFields);
expect(actual).toBeLessThan(expected);
expect(actual).toBeGreaterThan(expected);
The XSUnit tool suite includes a generic tool that you can use to run tests.
You can start the XSUnit test-running tool (TestRunner.xsjs) by entering the following URL in a Web
Browser:
http://<hostname>:80<HANAinstancenumber>/sap/hana/testtools/unit/jasminexs/
TestRunner.xsjs?<parameters>
The following table lists the parameters that you can use to control the behavior of test-runner tool. If you
execute the test runner without specifying the pattern parameter, only the tests in *Test.xsjslib files are
discovered (and run) within the package hierarchy.
Note
You can specify multiple parameters by separating each parameter=value pair with the ampersand
character (&), for example:coverage=true&exclude=sap.hana.tests
package (required)
Package that acts as the starting point for discovering the tests. If not otherwise specified by the
“pattern” parameter, all .xsjslib files in this package and its sub-packages conforming to the naming
pattern “*Test” are assumed to contain tests and are executed.
package=sap.hana.testtools.demo
pattern (optional)
Naming pattern that identifies the .xsjslib files that contain the tests. If not specified, the pattern
“*Test” is applied. You can use the question mark (?) and asterisk (*) as wildcards to match a single or
multiple arbitrary characters, respectively. To match all “Suite.xsjslib” files, use the following code:
pattern=Suite
format (optional)
Specifies the output format the test runner uses to report test results. By default, the results are reported
as an HTML document. This parameter has no effect if a custom reporter is provided via the “reporter”
parameter. To display results using the JSON format, use the following code:
format=json
reporter (optional)
Complete path to a module that provides an implementation of the Jasmine reporter interface. With this
parameter, a custom reporter can be passed to publish the test results in an application-specific format.
To specify the reporter interface, use the following code:
reporter=sap.hana.testtools.unit.jasminexs.reporter.db.dbReporter
Note
format=db produces the same result
tags (optional)
Comma-separated list of tags that defines the tests to be executed.
tags=integration,long_running
profile (optional)
Name of a "profile" defined in the test, which filters the tests to be executed on the basis of tags.
profile=end2end
coverage (optional)
Activates code-coverage measurement for all server-side (XS) JavaScript code that is executed by the tests
or that is in the scope of a specified package.
coverage=true
coverage=sap.hana.testtools.mockstar
coverage=true&exclude=sap.hana.testtools.mockstar.tests
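Combining several of the parameters above, a complete TestRunner call could look like the following illustrative example:

```
http://<hostname>:80<HANAinstancenumber>/sap/hana/testtools/unit/jasminexs/TestRunner.xsjs?package=sap.hana.testtools.demo&format=json&coverage=true
```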
XSUnit includes a selection of test packages that demonstrate the scope of tests you can perform on an SAP
HANA XS application.
The following table lists the test packages included in the XSUnit test framework. The table also indicates the
name of the test file and provides a quick overview of the scope of the test.
Note
If you want to have a look at the code in the tests, check out the package sap.hana.testtools.demo as
an XS project to your local workspace.
To write self-contained unit tests that are executable in any system, it is essential to be able to test the selected
SAP HANA objects in isolation. For a typical unit test using the XSUnit tools, you need to be able to change any
direct dependencies between the tested objects and other views or tables with references to simple tables. For
integration tests, rather than change the direct dependencies to a view or a table, you might need to change
dependencies between the dependent views (deeper in the dependency hierarchy).
Mockstar is a tool that is specifically designed to enable you to isolate test objects, for example, a view or
procedure. Mockstar allows you to create a copy of the tested view or procedure and substitute its
dependency on another view or table with a table that is stored in a test schema. It is strongly recommended
to use a dedicated schema for the tests; in this test schema, you have write permissions and, as a result, full
control over the data in the tables and views.
● Creates a copy of the SAP HANA object to test (for example, a view or database table); the copied object
retains the same business logic as the original object, but replaces some or all dependencies.
● Replaces the (static) dependencies to tables or views with temporary tables
● Supports deep dependency substitution
Mockstar can determine dependencies deep within a hierarchy of dependencies and copy only the
necessary parts of the hierarchy.
Mockstar tools are included in the delivery unit HANA_TEST_TOOLS, which you must install manually, for
example, using the SAP HANA studio or the SAP HANA Application Lifecycle Management tool. After the
installation completes, the Mockstar tools are available in the package sap.hana.testtools.mockstar.
Note
Importing a delivery unit into an SAP HANA system requires the REPO.IMPORT privilege, which is normally
granted only to the system administrator.
A basic example of the syntax required to set up the Mockstar test environment.
Note
The names of schemas, tables, and views used in the following code example are intended to be for
illustration purposes only.
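The snippet below is an illustrative sketch only: the function name createTestModel is taken from this guide's later reference to it (with a dependency-substitution array as the third parameter), but the module import path, the exact argument shapes, and all schema, package, and view names are assumptions.

```javascript
// Illustrative sketch: copies the tested view, replacing its dependency on a
// productive table with a table in the dedicated test schema. All names and
// the exact createTestModel signature are hypothetical.
function setUpTestModel(mockstar) {
  return mockstar.createTestModel(
    "mypackage/MY_VIEW",   // object under test (copied for the test)
    "TEST_SCHEMA",         // schema holding the substitute tables
    [{                     // dependency substitutions
      original: "PROD_SCHEMA.ORDERS",
      substitute: "TEST_SCHEMA.ORDERS"
    }]
  );
}

// Inside an XSUnit test, the real Mockstar module would be imported via $.import:
if (typeof $ !== "undefined") {
  var mockstar = $.import("sap.hana.testtools.mockstar", "mockstar");
  var testModel = setUpTestModel(mockstar);
}
```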
Use trace files and other tools to fix problems with test operations.
The Mockstar test-isolation tools write helpful information in the SAP HANA trace files. You can adapt the trace
level, for example, to debug to ensure the right amount and type of information is written during the test run.
Note that you need the corresponding administration role to be able to change the trace-level settings in SAP
HANA. The trace files are written in the trace component xsa:sap.hana.testtools (truncated to
“xsa:sap.hana.tes” in the trace files).
Tip
As an alternative to reading the trace files directly, you can also use the SQL console to select data from the
table M_MERGED_TRACES.
This section contains information about the problems that developers frequently encounter during test runs:
The JavaScript library you want to test can only be loaded when there is an application descriptor (.xsapp file)
defined within the package hierarchy. The application descriptor is the core file that you use to describe an
application.
The following error message is displayed when testing access to an OData service in SAP HANA XS:
If you encounter problems concerning duplicate entries when running tests, try the following solutions:
1. When inserting records into a productive table, ensure that no jasmine.dbConnection.commit() call
occurs during test execution.
2. When inserting records into a test table, ensure that the table entries are deleted (dropped) before they are
(re)created.
You encounter an error message that explains that a test table cannot be created during the test because the
table already exists. You must ensure that the specified table is deleted before the test tries to create it during
the test run.
This error can occur if the name of the model (including the package name) is too long. You can shorten the
name by setting the TruncOptions option.
Tip
To generate a detailed and structured error log, open the SAP HANA Systems view in the SAP HANA studio,
locate the test package, and activate it manually.
To check whether a test inserts data as expected into the created test table, call
jasmine.dbConnection.commit() to ensure that the data created during the test is stored.
The default timeout setting for the TestRunner tool is ten (10) minutes. If your tests run for longer than ten
minutes and cause a timeout, try splitting the test into smaller and shorter elements. If this is not possible,
try running the test in three phases.
This error sometimes occurs if you try to create a copy of the original view and replace some dependencies
with test tables. The reason for the error is one of the following:
● You did not provide any dependency substitutions. For example, you passed an empty array as the third
parameter of mockstar.createTestModel().
● The view that you want to test does not depend on any of the original views specified in the dependency
substitutions.
● For active schema mapping, you have written the dependencies with the physical schema whereas the
view refers to the authoring schema. Provide the schema in the same way as it is written in the view (or
stored procedure).
The XSUnit test framework provides a new “managed” database connection called jasmine.dbConnection,
which is automatically opened and rolled back (and closed) after each test completes. You can use it in
beforeEach or afterEach functions, in other functions defined in your test libraries, or even in imported
libraries, in the event that you have moved test code into external libraries.
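The open/rollback/close lifecycle can be sketched in plain JavaScript; the connection object below is a stand-in used only for illustration, not the real jasmine.dbConnection API:

```javascript
// Sketch of the lifecycle XSUnit applies to jasmine.dbConnection: a
// connection is opened before each test and rolled back and closed after it,
// so no test data survives the test run.
function makeManagedConnection(log) {
  return {
    executeUpdate: (sql) => log.push("execute: " + sql),
    rollback: () => log.push("rollback"),
    close: () => log.push("close"),
  };
}

function runTest(testFn, log) {
  const conn = makeManagedConnection(log); // opened before the test
  try {
    testFn(conn); // the test body uses the managed connection
  } finally {
    conn.rollback(); // changes are always undone...
    conn.close();    // ...and the connection is closed
  }
}

const log = [];
runTest((conn) => conn.executeUpdate("INSERT INTO T VALUES (1)"), log);
console.log(log); // execute, then rollback, then close
```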
Related Information
As the XSUnit test tools are based on a custom version of the JavaScript test framework Jasmine, you can use
XSUnit to test JavaScript. XSUnit provides tools that enable you to create and install a test “double” for one or
more object methods. In the Jasmine framework, a test double is known as a “spy”. A spy can be used not only
to stub any function but also to track calls to it and all arguments, too.
Note
XSUnit includes special matchers that enable interaction with Jasmine spies.
The XSUnit test tools delivery unit (DU) includes a small XS JavaScript demo “Ratings” application which
comprises an SAPUI5 client front end on top of OData and XS JavaScript services; the Ratings application
enables you to experiment with different test techniques. You can try out the application at the following URL:
http://<SAPHANA_host>:80<instancenumber>/sap/hana/testtools/demo/apps/rating/
WebContent/
The following code example provides a quick overview of commonly used commands that enable the use of
Jasmine Spies. You can see how to perform the following actions:
The following code example shows how to install a method double (simple example).
spyOn(object, "method");
expect(object.method).toHaveBeenCalled();
The following code example shows how to install a method double (variant).
The following code example shows how to install a method double (custom action for double).
The following code example shows how to check whether the function has been called as expected and, if so,
whether the right values were used.
expect(spyObject.method).toHaveBeenCalled();
expect(spyObject.method).toHaveBeenCalledWith(expArgValue1, expArgValue2);
expect(spyObject.method.calls.allArgs()).toContain([ expArgValue1,
expArgValue2 ]);
expect(spyObject.method.calls.mostRecent().args).toEqual([ expArgVal1,
expArgVal2 ]);
expect(spyObject.method.calls.count()).toBe(2);
spyObject.method.calls.reset(); // reset all calls
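To make the spy mechanics transparent, the following hand-rolled sketch shows what a spy does internally; this is plain JavaScript for illustration only, not the Jasmine implementation:

```javascript
// Minimal illustration of what a Jasmine spy does: it replaces a method with
// a recorder that tracks every call and its arguments.
function installSpy(object, methodName) {
  const calls = [];
  object[methodName] = function (...args) {
    calls.push(args); // record the arguments of each call
  };
  object[methodName].calls = {
    count: () => calls.length,
    allArgs: () => calls,
    mostRecent: () => ({ args: calls[calls.length - 1] }),
    reset: () => { calls.length = 0; },
  };
  return object[methodName];
}

// Usage: install the double, exercise the code, then inspect the calls.
const service = { save: function (id, value) { /* real work replaced */ } };
installSpy(service, "save");
service.save(1, "a");
service.save(2, "b");
console.log(service.save.calls.count());           // 2
console.log(service.save.calls.mostRecent().args); // [2, "b"]
```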
XS JavaScript files can be accessed by performing an HTTP call against the service defined in the XS
JavaScript file.
You can use the TestRunner tool to call an XS JavaScript service. The TestRunner service is part of the test-
tools package sap.hana.testtools.unit.jasminexs and has one mandatory parameter, namely
package. Since TestRunner is an HTTP GET service, you can execute the service in the browser using the
following URL:
http://<hostname>:80<instancenumber>/sap/hana/testtools/unit/jasminexs/
TestRunner.xsjs?package=<mypackage>
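As a sketch, the TestRunner URL can be assembled programmatically; the host, instance, and package values below are placeholders, not real system values:

```javascript
// Build the TestRunner URL from its parts; encodeURIComponent guards the
// package name against characters that are not URL-safe.
function testRunnerUrl(host, instance, pkg) {
  return "http://" + host + ":80" + instance +
    "/sap/hana/testtools/unit/jasminexs/TestRunner.xsjs?package=" +
    encodeURIComponent(pkg);
}

console.log(testRunnerUrl("myhost", "00", "com.acme.tests"));
// http://myhost:8000/sap/hana/testtools/unit/jasminexs/TestRunner.xsjs?package=com.acme.tests
```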
Since it is not possible to import XS JavaScript (.xsjs) files into a JavaScript library (.xsjslib), the
functions you implement inside the XS JavaScript file cannot be tested within an XSUnit test. As a
consequence, it is recommended to include only minimal logic within the XSJS files and delegate tasks to the
functions implemented in corresponding JavaScript libraries; these libraries can be tested in isolation using
XSUnit tools (for example, Mockstar).
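The recommended split can be sketched in plain JavaScript; handleRequest and multiply below are hypothetical names used only to illustrate the thin-service/testable-library pattern, not part of any SAP API:

```javascript
// "Library" function: pure logic, easy to unit test with XSUnit in isolation.
function multiply(num1, num2) {
  return num1 * num2;
}

// "Service" layer (the .xsjs file in a real application): only parses
// parameters and delegates to the library function.
function handleRequest(params) {
  const result = multiply(Number(params.num1), Number(params.num2));
  return JSON.stringify({ result: result });
}

console.log(handleRequest({ num1: "6", num2: "7" })); // {"result":42}
```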
Note
XSUnit enables you to call your XSJS services via HTTP. However, this is an end-to-end system test with no
possibility of using test doubles during the test. These tests are not suitable for testing an individual
JavaScript function.
Since you cannot insert test data into the test table during the test, the tests have no control over the data. This
restriction reduces the scope of the tests you can perform for HTTP calls, for example, you can test the
following scenarios:
To ensure access to SAP HANA, you need to adapt the default HTTP destination file
(localhost.xshttpdest) provided with the XSUnit test tools to fit your SAP HANA instance. The default
HTTP destination configuration file is located in
sap.hana.testtools.unit.jasminexs.lib:localhost.xshttpdest. To access an HTTP destination
configuration, you need the permissions granted in the user role
sap.hana.xs.admin.roles::HTTPDestAdministrator.
Caution
To change the HTTP destination, create an HTTP extension of your own; do not make any changes to the
file localhost.xshttpdest. Changes to localhost.xshttpdest are overwritten by updates to the
XSUnit test tools on your system.
Related Information
Use XSUnit tools to test JavaScript code that depends on other parts of your code, for example, on functions,
libraries, or database tables.
In JavaScript it is possible to overwrite anything that is visible in a context, for example: public data, public
functions, or even the whole class. With XSUnit, you can make use of a simulation framework that is included
with Jasmine. The simulation framework provides a mechanism that enables you to create and install a test
double (so-called Jasmine “Spy”), which can help you to reduce some of the basic code and keep the code
more concise. Jasmine Spies should be created in the test setup, before you define any expectations. The
Spies can then be checked, using the standard Jasmine expectation syntax. You can check if a Spy is called (or
not) and find out what (if any) parameters were used in the call. Spies are removed at the end of every test
specification.
Note
Each dependency increases the complexity of testing involved for a function or a component.
The following code snippet defines a controller that you want to test; the controller depends on a Date object.
The accompanying code snippet shows how you can test this code.
The following code snippet shows an example of the test code you could run; the code uses a Jasmine Spy to
ensure that the dependencies on the Date object are replaced and tested as expected.
beforeEach(function() {
model = new DataModel();
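The underlying idea can also be sketched with plain dependency injection instead of a spy: the controller below is hypothetical, and the test passes a fixed clock in place of the real Date, which achieves the same isolation a Jasmine spy provides:

```javascript
// Sketch of replacing a Date dependency with a test double: the clock is
// injected, so a test can substitute a fixed date for the real one.
function makeController(clock) {
  return {
    greeting: function () {
      const hour = clock().getHours();
      return hour < 12 ? "Good morning" : "Good afternoon";
    },
  };
}

// Production code would pass () => new Date(); the test passes a fixed date.
const testClock = () => new Date(2020, 0, 1, 9, 0, 0); // always 09:00
const controller = makeController(testClock);
console.log(controller.greeting()); // "Good morning"
```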
It is important to avoid mixing business logic that is implemented in JavaScript with the database
interaction. We recommend moving the database persistency logic into a dedicated persistency class, so that
just the business logic remains for testing. The goal of the test is to be able to test both normal and special
cases without interacting with the database at all.
To unit test the persistency class, you can parameterize the schema and use a schema for testing, for example,
the user schema, where you have all the authorizations required to create, modify, and drop objects and
cannot break anything with the test. Finally, you can provide a small set of integration tests that simply
ensure that the productive classes, the AnyService class and the Persistency class, integrate well.
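The separation can be sketched in plain JavaScript; makeService and the persistency object below are illustrative stand-ins, not the actual demo-application classes:

```javascript
// Business logic with an injected persistency object: a unit test can pass a
// fake that never touches the database, so normal and special cases are
// testable without any database interaction.
function makeService(persistency) {
  return {
    totalRating: function (itemId) {
      const ratings = persistency.readRatings(itemId); // DB access isolated here
      return ratings.reduce((sum, r) => sum + r, 0);   // pure business logic
    },
  };
}

// In a unit test, a fake persistency object stands in for the database:
const fakePersistency = { readRatings: () => [3, 4, 5] };
const service = makeService(fakePersistency);
console.log(service.totalRating("item-1")); // 12
```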
Note
For the sake of conciseness, resource closing and error handling are omitted from the following code example.
The following code snippet shows an example of the test code you could run to test the dependencies.
The following code snippet shows how to use XSUnit to test a self-contained JavaScript function (mathlib); a
self-contained function has no dependencies on other JavaScript functions, database tables, or session
parameters.
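A self-contained function of this kind might look like the following sketch (factorial is an invented example, not the actual mathlib code); because it has no external dependencies, a test only needs input/output assertions:

```javascript
// Self-contained function: no database, session, or library dependencies.
function factorial(n) {
  if (n < 0) throw new Error("n must be >= 0");
  let result = 1;
  for (let i = 2; i <= n; i++) result *= i;
  return result;
}

console.log(factorial(5)); // 120
```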
You can use the SAPUI5 user interface technology to build and adapt client applications based on SAP HANA.
Related Information
Building User Interfaces with SAPUI5 for SAP HANA [page 459]
Consuming Data and Services with SAPUI5 for SAP HANA [page 460]
UI development toolkit for HTML5 (SAPUI5) is a user-interface technology that is used to build and adapt client
applications based on SAP HANA. You can install SAPUI5 and use it to build user interfaces delivered by SAP
HANA's Web server.
The SAPUI5 run time is a client-side HTML5 rendering library with a rich set of standard and extension
controls. The SAPUI5 run time provides a lightweight programming model for desktop as well as mobile
applications. Based on JavaScript, it supports Rich Internet Application (RIA)-like client-side features.
SAPUI5 complies with OpenAjax and can be used with standard JavaScript libraries.
● Documentation
Information about the programming languages used, open source technology, development tools, and API
usage. You can also find tutorials to help you get started and details about the new features delivered with
each version of SAPUI5.
● API reference
The complete JavaScript documentation for the Framework and Control API, including featured
namespaces such as sap.m (main controls), sap.ui.layout (layout controls), sap.ui.table (table
controls), sap.f (SAP Fiori), and sap.ui.core (UI5 core run time).
● Samples
A detailed view of almost every control available in the kit, including detailed information about featured
controls such as: user interaction elements, lists, tables, pop-up dialogs, tiles, messages, maps and charts,
smart controls, step-based interaction, and so on. You can also find detailed information about object
pages, dynamic pages, and how to use flexible columns.
Related Information
SAP HANA Extended Application Services (SAP HANA XS) can be used to expose the database data model,
with its tables, views and database procedures, to UI clients.
You can expose an SAP HANA model using OData services or by writing native server-side JavaScript code that
runs in the SAP HANA context. You can also use SAP HANA XS to build dynamic HTML5 client applications, for
example, using SAPUI5 for SAP HANA.
The server-centric approach to native application development envisaged for SAP HANA assumes the following
high-level scenario:
● View
UI rendering occurs completely in the client (SAPUI5, browser, mobile applications)
● Controller
Procedural (control-flow) logic is defined in (XS) JavaScript, SQLScript or an OData service
● Model
All application artifacts are stored in SAP HANA
Each of the levels illustrated in the graphic (view, control, model) is manifested in a particular technology and
dedicated languages. After you have defined the data model with design-time artifacts and the equivalent run-
time objects, you develop the control-flow logic to expose the data, for example, using server-side JavaScript or
an OData service. With the data model and control-flow logic in place, you can build the presentation logic to
view the exposed data in a UI client application using SAPUI5 for SAP HANA. For example, you can use an
SAPUI5 client to request and display data exposed by an OData service; the UI could include buttons that
trigger operations performed by SAP HANA XS JavaScript service; and the data displayed is retrieved from
data end points defined in your data model, for example, SQLScript or CDS.
Related Information
Tutorials are designed to extend task-based information to show you how to use real code and examples to
build native SAP HANA applications. The tutorials provided here include examples of how to build simple
SAPUI5 applications.
The tutorials provided here show you how to create your own simple SAPUI5-based applications. Some of the
tutorials make use of sample data, design-time development objects, and functions provided by the SAP HANA
Interactive Education (SHINE) demo application, for example: database tables, data views, server-side
JavaScript (XSJS) and OData services, and user-interface elements.
Note
If the SHINE DU (HCODEMOCONTENT) is not already installed on your SAP HANA system, you can download
the DU from the SAP Software Download Center in the SAP Support Portal at http://
service.sap.com/swdc. On the SAP HANA PLATFORM EDIT. 1.0 Web page, locate the download package
SAP HANA DEMO MODEL 1.0 (OS independent, SAP HANA database).
● SAPUI5 clients
○ Hello world
Build a simple “Hello World” application using SAPUI5 tools; the exercise shows how the development
process works and which components are required.
● Consuming Server-side JavaScript (XSJS) services with SAPUI5
Build an SAPUI5 application that calls an XSJS service in response to user interaction with the user
interface, for example, clicking a button to perform an action. In this case, the XSJS service called by the UI
request performs an action and returns a response, which is displayed in the SAPUI5 client.
● Consuming OData services with SAPUI5
Build an SAPUI5 application that calls an OData service in response to user interaction with the user
interface, for example, clicking a graph or report chart. In this case, the OData service called by the UI
request performs an action (collects data) and returns a response, which is displayed in the SAPUI5 client.
○ Bind a UI element in an SAPUI5 application to the data specified in an OData service. For example, you
can populate the contents of a table column displayed in an SAPUI5 application by using the data
stored in a database table defined in an OData service.
○ Build an SAPUI5 view that provides input fields, which you can use to create a new record or update an
existing record in a database table, for example, using the OData create, update, and delete (CRUD)
features.
● Localizing UI Strings in SAPUI5
Create a simple text-bundle file for translation purposes and re-import the translated text into SAP HANA
for use with a specific language locale. Text bundles contain text strings that define elements of the user
interface (for example, buttons and menu options).
Related Information
SAPUI5 provides a client-side HTML5 rendering library with a comprehensive set of standard controls and
extensions that you can use to build a UI quickly and easily.
Prerequisites
You have installed the SAPUI5 tools included in the delivery unit (DU) SAPUI5_1.
In this tutorial, you create a simple “Hello World” application in SAPUI5 using the model-view-controller
concept. You create a bootstrap HTML (index.html) page in a HelloWorld package and a HelloWorld
controller and HelloWorld view in a sub-package called helloworldx:
\
HelloWorld
\
helloworldx
\
HelloWorld.controller.js
HelloWorld.view.js
.xsaccess
.xsapp
index.html
Procedure
<!DOCTYPE HTML>
<html>
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
sap.ui.jsview("helloworldx.HelloWorld", {
/** Specifies the Controller belonging to this View.
* In the case that it is not implemented, or that "null" is returned,
this View does not have a Controller.
* @memberOf helloworldx.HelloWorld
*/
getControllerName : function() {
return "helloworldx.HelloWorld";
},
/** Is initially called once after the Controller has been instantiated.
It is the place where the UI is constructed.
* Since the Controller is given to this method, its event handlers can be
attached right away.
* @memberOf helloworldx.HelloWorld
*/
createContent : function(oController) {
sap.ui.controller("helloworldx.HelloWorld", {
});
An XS server-side JavaScript (XSJS) application can be used to perform an action linked to an element such as
a button or a text box in an SAPUI5 application.
Prerequisites
● You have installed the SAPUI5 tools included in the delivery unit (DU) SAPUI5_1.
● You have installed the SHINE (democontent) delivery unit; this DU contains the XSJS service you want to
consume with the SAPUI5 application you build in this tutorial.
Note
You might have to adjust the paths in the code examples provided to suit the folder/package hierarchy in
your SAP HANA repository, for example, to point to the underlying content (demonstration tables and
services) referenced in the tutorial.
Context
You can configure an SAPUI5 application to call an XSJS service in response to user interaction with the UI; the
XSJS service performs an action and returns a response. This tutorial demonstrates how to trigger an XSJS
service which performs a mathematical multiplication when numbers are typed in text boxes displayed in an
SAPUI5 application.
<!DOCTYPE HTML>
<html>
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<script src="/sap/ui5/1/resources/sap-ui-core.js" id="sap-ui-bootstrap"
data-sap-ui-libs="sap.ui.commons,sap.ui.table" data-sap-ui-theme="sap_bluecrystal">
</script>
<script>
sap.ui.localResources("xsjsmultiply");
var view = sap.ui.view({
id: "idxsjsMultiply",
viewName: "xsjsmultiply.xsjsMultiply",
type: sap.ui.core.mvc.ViewType.JS
});
view.placeAt("content");
</script>
</head>
<body class="sapUiBody" role="application">
<div id="content"></div>
</body>
</html>
sap.ui.jsview("xsjsmultiply.xsjsMultiply", {
getControllerName : function() {
return "xsjsmultiply.xsjsMultiply";
},
createContent : function(oController) {
var multiplyPanel = new sap.ui.commons.Panel().setText("XS Service Test - Multiplication");
return multiplyPanel;
}
});
sap.ui.controller("xsjsmultiply.xsjsMultiply", {
onLiveChange: function(oEvent,oVal){
var aUrl = '/sap/hana/democontent/epm/services/multiply.xsjs?cmd=multiply'
    + '&num1=' + escape(oEvent.getParameters().liveValue)
    + '&num2=' + escape(oVal.getValue());
jQuery.ajax({
url: aUrl,
method: 'GET',
dataType: 'json',
success: this.onCompleteMultiply,
error: this.onErrorCall });
},
If the AJAX call is successful, call a controller event named onCompleteMultiply; if the AJAX call is
not successful, call a controller event named onErrorCall.
c. Add the code that creates an event handler named onCompleteMultiply.
The onCompleteMultiply function accepts the response object as an input parameter called myTxt, which
contains the result of the multiplication in clear text. Use sap.ui.core.format.NumberFormat to format
the output as an integer and set the value back into the oResult textView.
onCompleteMultiply: function(myTxt){
var oResult = sap.ui.getCore().byId("result");
if(myTxt==undefined){ oResult.setText(0); }
else{
jQuery.sap.require("sap.ui.core.format.NumberFormat");
var oNumberFormat =
sap.ui.core.format.NumberFormat.getIntegerInstance({
maxFractionDigits: 12,
minFractionDigits: 0,
groupingEnabled: true });
oResult.setText(oNumberFormat.format(myTxt)); }
},
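Outside SAPUI5, the same integer formatting with grouping can be illustrated with the standard Intl.NumberFormat API; this is an analogous standard API, not the SAPUI5 sap.ui.core.format.NumberFormat itself:

```javascript
// Format a number as an integer with grouping separators, analogous to the
// getIntegerInstance configuration used in the SAPUI5 controller above.
const formatter = new Intl.NumberFormat("en-US", {
  maximumFractionDigits: 0, // format as an integer
  useGrouping: true,        // insert grouping separators
});

console.log(formatter.format(1234567)); // "1,234,567"
```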
d. Add the code that produces an error dialog if the event produces an error.
The onErrorCall function displays a message dialog (sap.ui.commons.MessageBox.show) in the
event of an error during the multiplication action provided by the XSJS service. The information
displayed in the error message is contained in jqXHR.responseText.
The complete xsjsMultiply.controller.js file should look like the following example:
sap.ui.controller("xsjsmultiply.xsjsMultiply", {
onLiveChange: function(oEvent,oVal){
var aUrl = '/sap/hana/democontent/epm/services/multiply.xsjs?cmd=multiply'+'&num1='
onCompleteMultiply: function(myTxt){
var oResult = sap.ui.getCore().byId("result");
if(myTxt==undefined){ oResult.setText(0); }
else{
jQuery.sap.require("sap.ui.core.format.NumberFormat");
var oNumberFormat =
sap.ui.core.format.NumberFormat.getIntegerInstance({
maxFractionDigits: 12,
minFractionDigits: 0,
groupingEnabled: true });
oResult.setText(oNumberFormat.format(myTxt)); }
},
An OData service can be used to provide the data required for display in an SAPUI5 application.
Prerequisites
● You have installed the SAPUI5 tools included in the delivery unit (DU) SAPUI5_1.
● You have installed the SHINE delivery unit (DU); this DU contains the views
(sap.hana.democontent.epm.models:: AN_SALES_OVERVIEW_WO_CURR_CONV and
Note
You might have to adjust the paths in the code examples provided to suit the folder/package hierarchy in
your SAP HANA repository, for example, to point to the underlying content (demonstration tables and
services) referenced in the tutorial.
Context
You can bind a UI element in an SAPUI5 application to the data specified in an OData service. For example, you
can populate the contents of a table column displayed in an SAPUI5 application with the data stored in a
database table defined in an OData service.
Procedure
<!DOCTYPE HTML>
<html>
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<script src="/sap/ui5/1/resources/sap-ui-core.js" id="sap-ui-bootstrap"
data-sap-ui-libs="sap.ui.commons,sap.ui.table" data-sap-ui-theme="sap_bluecrystal">
</script>
<script>
sap.ui.localResources("odatabasic");
var view = sap.ui.view({
id: "idodataBasic",
viewName: "odatabasic.odataBasic",
type: sap.ui.core.mvc.ViewType.JS
});
view.placeAt("content");
</script>
</head>
<body class="sapUiBody" role="application">
<div id="content"></div>
</body>
</html>
Note
You need to declare any libraries you want the SAPUI5 application to use to render the data it
consumes. For this tutorial, you add sap.ui.table to the list of SAPUI5 libraries.
○ Application script: SAPUI5 is based on the model-view-controller paradigm. To create the view and
controller, the SAPUI5 runtime needs to know from where to load the related resources
(sap.ui.localResources); in this case from the relative sub-folder /odatabasic. In this example,
you place the newly created instance of the odataBasic view from the odatabasic sub-folder in an
HTML element with the ID content. SAPUI5 supports different view types; here the JS (JavaScript)
view type is used.
○ HTML body: The HTML element with the ID content, in which you placed the view, needs to be
included in the HTML page. To do this, you add a <div> block with id="content" to the HTML body.
The <body> attribute class="sapUiBody" defines the SAPUI5 CSS class to be used, which ensures
that the page background and some other styles are properly set. The attribute
role="application" sets the WAI-ARIA landmark role.
7. Connect the SAPUI5 table element to the OData service salesOrders.xsodata.
Add the following code to the SAPUI5 view controller file odataBasic.view.js.
sap.ui.jsview("odatabasic.odataBasic", {
/** Specifies the Controller belonging to this View.
* In the case that it is not implemented, or that "null" is returned,
this View does not have a Controller.
* @memberOf odatabasic.odataBasic
*/
getControllerName : function() {
return "odatabasic.odataBasic";
},
createContent : function(oController) {
var oControl;
this.oSHTable = new sap.ui.table.Table("soTable",{
visibleRowCount: 10,
});
this.oSHTable.setTitle("SALES_ORDER_HEADERS");
oControl = new
sap.ui.commons.TextView().bindProperty("text","PARTNERID.PARTNERID");
this.oSHTable.addColumn(new sap.ui.table.Column({label:new
sap.ui.commons.Label({text: "PARTNER_ID"}),
template: oControl, sortProperty: "PARTNERID", filterProperty:
"PARTNERID" }));
oControl = new
sap.ui.commons.TextView().bindText("GROSSAMOUNT",oController.numericFormatter)
;
oControl.setTextAlign("End");
this.oSHTable.addColumn(new sap.ui.table.Column({label:new
sap.ui.commons.Label({text: "GROSS_AMOUNT"}),
template: oControl, sortProperty: "GROSSAMOUNT",
filterProperty: "GROSSAMOUNT", hAlign: sap.ui.commons.layout.HAlign.End}));
oControl = new
sap.ui.commons.TextView().bindProperty("text","CURRENCY");
this.oSHTable.addColumn(new sap.ui.table.Column({label:new
sap.ui.commons.Label({text: "CURRENCY"}),
template: oControl, sortProperty: "CURRENCY", filterProperty:
"CURRENCY" }));
this.oSHTable.setModel(oModel);
var sort1 = new sap.ui.model.Sorter("SALESORDERID", true);
this.oSHTable.bindRows({
path: "/SalesOrderHeader",
parameters: {expand: "Buyer",
select:
"SALESORDERID,CURRENCY,GROSSAMOUNT,PARTNERID.PARTNERID,Buyer/COMPANYNAME"},
sorter: sort1
});
this.oSHTable.setTitle("Sales Orders");
oLayout.createRow(this.oSHTable);
return oLayout;
}
});
○ Sets the model named oModel to the UI table control named oSHTable:
this.oSHTable.setModel(oModel);
○ Creates a sorting mechanism (of type sap.ui.model.Sorter) which uses the column
SALESORDERID:
○ Binds the table to the entity SalesOrderHeader in the OData service definition and adds the sorter
object to the binding:
this.oSHTable.bindRows({
path: "/SalesOrderHeader",
parameters: {expand: "Buyer",
select:
"SALESORDERID,CURRENCY,GROSSAMOUNT,PARTNERID.PARTNERID,Buyer/COMPANYNAME"},
sorter: sort1
});
sap.ui.controller("odatabasic.odataBasic", {
});
sap.ui.jsview("odatabasic.odataBasic", {
/** Specifies the Controller belonging to this View.
* In the case that it is not implemented, or that "null" is returned,
this View does not have a Controller.
* @memberOf odatabasic.odataBasic
*/
getControllerName: function() {
return "odatabasic.odataBasic";
},
/** Is initially called once after the Controller has been instantiated.
It is the place where the UI is constructed.
* Since the Controller is given to this method, its event handlers can
be attached right away.
* @memberOf odatabasic.odataBasic
*/
createContent: function(oController) {
var oLayout = new sap.ui.commons.layout.MatrixLayout({
width: "100%"
});
var oModel = new sap.ui.model.odata.ODataModel("/sap/hana/democontent/epm/services/salesOrders.xsodata/", true);
var oControl;
this.oSHTable = new sap.ui.table.Table("soTable", {
visibleRowCount: 10,
});
this.oSHTable.setTitle("SALES_ORDER_HEADERS");
//Table Column Definitions
var oMeta = oModel.getServiceMetadata();
var oControl;
for (var i = 0; i <
oMeta.dataServices.schema[0].entityType[0].property.length; i++) {
var property =
oMeta.dataServices.schema[0].entityType[0].property[i];
oControl = new sap.ui.commons.TextField().bindProperty("value",
property.name);
this.oSHTable.addColumn(new sap.ui.table.Column({
label: new sap.ui.commons.Label({
text: property.name
}),
template: oControl,
sortProperty: property.name,
An OData service can be used to provide the data required for display in an SAPUI5 application.
Prerequisites
● You have installed the SAPUI5 tools included in the delivery unit (DU) SAPUI5_1.
● You have installed the SHINE delivery unit (DU); this DU contains the tables and OData services that you
want to consume with the SAPUI5 application you build in this tutorial.
● You have generated data to populate the tables and views provided by the SHINE delivery unit and used in
this tutorial. You can generate the data with tools included in the SHINE delivery unit.
You might have to adjust the paths in the code examples provided to suit the folder/package hierarchy in
your SAP HANA repository, for example, to point to the underlying content (demonstration tables and
services) referenced in the tutorial.
Context
You can bind a UI element in an SAPUI5 application to the data specified in an OData service. For example, you
can populate the contents of table columns displayed in an SAPUI5 application with the data stored in a
database table defined in an OData service. In this tutorial, you learn how to build an SAPUI5 view that provides
input fields, which you can use to create a new record or update an existing record in a database table, for
example, using the OData create, update, and delete (CRUD) features.
Procedure
<!DOCTYPE HTML>
<html>
<head>
Note
You need to declare any libraries you want the SAPUI5 application to use to render the data it
consumes. For this tutorial, you add sap.ui.table to the list of SAPUI5 libraries.
○ Application script: SAPUI5 is based on the model-view-controller paradigm. To create the view and
controller, the SAPUI5 runtime needs to know from where to load the related resources
(sap.ui.localResources); in this case from the relative sub-folder /usercrud. In this example,
you place the newly created instance of the userCRUD view from the usercrud sub-folder in an
HTML element with the ID content. SAPUI5 supports different view types; here the JS (JavaScript)
view type is used.
○ HTML body: The HTML element with the ID content, in which you placed the view, needs to be
included in the HTML page. To do this, you add a <div> block with id="content" to the HTML body.
The <body> attribute class="sapUiBody" defines the SAPUI5 CSS class to be used, which ensures
that the page background and some other styles are properly set. The attribute
role="application" sets the WAI-ARIA landmark role.
7. Set up the SAPUI5 user interface and bind it to an OData service.
The code you need to add to the userCRUD.view.js performs the following actions:
○ Adds three text-entry boxes (sap.ui.commons.TextField) to the SAPUI5 application interface
(First Name, Last Name, and Email)
○ Adds a Create Record button (sap.ui.commons.Button) to the SAPUI5 application interface
○ Binds the SAPUI5 view to the OData service user.xsodata
sap.ui.jsview("usercrud.userCRUD", {
getControllerName : function() {
return "usercrud.userCRUD";
},
createContent : function(oController) {
oTable.setModel(this.oModel);
oTable.bindRows("/Users");
oTable.setTitle("Users" );
oTable.setEditable(true);
oLayout.createRow(oTable);
return oLayout;
}
});
The userCRUD.view.js file should display the UI view shown in the following example:
sap.ui.controller("usercrud.userCRUD", {
oModel : null,
b. Set up the callUserService function to handle create events (create new records in a table).
The code required for this implementation of the callUserService function is shown in the following
example:
callUserService : function() {
var oModel = sap.ui.getCore().byId("userTbl").getModel();
var oEntry = {};
oEntry.PERS_NO = "0000000000";
oEntry.FIRSTNAME = sap.ui.getCore().byId("fName").getValue();
oEntry.LASTNAME = sap.ui.getCore().byId("lName").getValue();
oEntry.E_MAIL = sap.ui.getCore().byId("email").getValue();
oModel.setHeaders({"content-type" : "application/json;charset=utf-8"});
oModel.create('/Users', oEntry, null, function() {
alert("Create successful");
}, function() {
alert("Create failed");
});
},
updateService: function(Event) {
var oModel = sap.ui.getCore().byId("userTbl").getModel();
var index = Event.getSource().oParent.getIndex();
var oEntry = {};
oEntry.PERS_NO = sap.ui.getCore().byId("__field0-col0-row"+index).getValue();
switch (Event.mParameters.id){
case "__field1-col1-row"+index:
oEntry.FIRSTNAME = Event.mParameters.newValue; break;
case "__field2-col2-row"+index:
oEntry.LASTNAME = Event.mParameters.newValue; break;
case "__field3-col3-row"+index:
oEntry.E_MAIL = Event.mParameters.newValue;
break;
}
}
});
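The mapping from input fields to the OData entry object can be factored into a plain function and tested on its own; buildUserEntry below is an illustrative helper, not part of the tutorial code, though the property names follow the Users entity used above:

```javascript
// Build the entry object that would be passed to oModel.create('/Users', ...);
// separating this mapping makes it testable without any OData round trip.
function buildUserEntry(firstName, lastName, email) {
  return {
    PERS_NO: "0000000000", // placeholder key; the service assigns the real one
    FIRSTNAME: firstName,
    LASTNAME: lastName,
    E_MAIL: email,
  };
}

console.log(buildUserEntry("Ada", "Lovelace", "ada@example.com"));
```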
Every user who wants to work directly with the SAP HANA database must have a database user with the
necessary privileges. Although privileges can be granted to users directly, roles are the standard way to
authorize users. A role is a collection of privileges.
After users have successfully logged on to SAP HANA, they can do only those things they are authorized to do.
This is determined by the privileges that they have been granted. Several privilege types exist in the SAP HANA
database, for example system privileges, object privileges, and application privileges.
Privileges can be granted to users directly or indirectly through roles. Roles are the standard mechanism of
granting privileges as they allow you to implement complex, reusable authorization concepts that can be
modeled on business roles. It is possible to create roles as pure runtime objects that follow classic SQL
principles (catalog roles) or as design-time objects in the repository of the SAP HANA database (repository
roles). In general, repository roles are recommended as they offer more flexibility. For example, they can be
transported between systems. For more information, see Catalog Roles and Repository Roles Compared in the
SAP HANA Security Guide.
Note
Part of the logon process is user authentication. SAP HANA supports several authentication mechanisms,
including user name/password authentication and external authentication services such as SAML and
Kerberos. For more information, see SAP HANA Authentication and Single Sign-On in the SAP HANA
Security Guide.
Application Authorization
Application developers define application descriptors to specify how users accessing their applications are
authenticated and what authorization is required. For more information, see Creating the Application
Descriptors.
User Management
User administrators are responsible for creating database users and granting them the required roles and
privileges. For more information about creating and authorizing users, as well as other user provisioning tasks,
see User Provisioning in the SAP HANA Administration Guide.
You create a role in the SAP HANA repository using the form-based role editor of the SAP HANA Web-based
Development Workbench.
Prerequisites
Caution
Theoretically, a user with authorization to create and activate repository objects can change a role that
they have been granted. Once the role is activated, the user has the new privileges that they just
added. Therefore, it is important that roles in production systems are imported from a test or
development system and that changes to imported objects are not allowed. This risk is not, however,
specific to roles; it also applies to other repository objects, for example, modeled views.
● You can select all the roles and privileges that you plan to grant to the new role. For this, you need either the
system privilege CATALOG READ or the actual role or privilege to be granted.
● You have granted to the technical user _SYS_REPO those privileges on catalog-only objects that you plan to
grant the new role. For more information, see Roles as Repository Objects.
● If you're using the SAP HANA Web-based Development Workbench for either the system database or a
tenant database in a multiple-container system, you have configured the internal SAP Web Dispatcher so
that it can dispatch HTTP requests coming into the system to the correct database on the basis of alias
DNS names. Every tenant database needs an alias. For more information, see Configure HTTP Access to
Multitenant Database Containers in the SAP HANA Administration Guide.
The design-time definition of a role is specified in a text file with the extension .hdbrole. You can create and
define a role in a simple text editor using the role domain-specific language (DSL). However, the Editor tool of
the SAP HANA Web-based Development Workbench provides you with a form-based editor.
Procedure
1. Open the Editor tool of the SAP HANA Web-based Development Workbench.
The URL depends on whether you are connecting to a single-container system or to a database in a
multiple-container system.
○ Roles
You can grant both catalog roles and other design-time roles.
○ System privileges
○ Object privileges
You can grant privileges on both catalog objects and design-time objects. First, select the object that
you want to grant privileges on, then the required privileges.
Caution
Do not grant object privileges on a catalog object if it was created in design time. If you do, the next
time the design-time object is activated (which results in the creation of a new version of the
catalog object), the privilege on the original catalog object will be removed from the role. Always
grant privileges on design-time objects.
○ Analytic privileges
○ Package privileges
First, select the package that you want to grant privileges on, then the required privileges.
○ Application privileges
Note
By default, roles and other objects that you can create and edit in the Editor tool are activated on
saving. If you do not want objects to be activated on saving, select the option Enable Inactive Save
in the Editor settings. Once this option is enabled, the Save without Activating (Ctrl+I) button
becomes available.
Results
A user administrator can now grant the role to users. This is possible using the Security tool of the SAP HANA
Web-based Development Workbench or using the SAP HANA studio.
Related Information
You can change a design-time role in the SAP HANA Web-based Development Workbench using either the
form-based role editor or the text editor.
Prerequisites
Caution
Theoretically, a user with authorization to create and activate repository objects can change a role that
they have been granted. Once the role is activated, the user has the new privileges that they just
added. Therefore, it is important that roles in production systems are imported from a test or
development system and that changes to imported objects are not allowed. This risk is not, however,
specific to roles; it also applies to other repository objects, for example, modeled views.
● You can select all the roles and privileges that you plan to grant to the new role. For this, you need either the
system privilege CATALOG READ or the actual role or privilege to be granted.
● You have granted to the technical user _SYS_REPO those additional privileges on catalog-only objects that
you plan to grant to the role. For more information, see Roles as Repository Objects.
● If you're using the SAP HANA Web-based Development Workbench for either the system database or a
tenant database in a multiple-container system, you have configured the internal SAP Web Dispatcher so
that it can dispatch HTTP requests coming into the system to the correct database on the basis of alias
DNS names. Every tenant database needs an alias. For more information, see Configure HTTP Access to
Multitenant Database Containers in the SAP HANA Administration Guide.
Context
You can change the roles and privileges granted to an existing repository role. It is important that you do this in
the design-time version of the role, not the runtime version. Otherwise, any changes that you make will be
reverted the next time the design-time version is activated.
Note
The Editor tool of the SAP HANA Web-based Development Workbench also provides features for copying
and renaming repository objects. It is recommended that you use these features with care since users who
have already been granted the role may be impacted. For example, if you rename a role, it will still be
granted to users by its old name. This means that you will have to revoke the role by its old name and then
grant it again by its new name.
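For example, the revoke-and-regrant sequence described above might be sketched as follows. The package, role, and user names are placeholders for illustration only:

```sql
-- Revoke the runtime role by its old name, then grant it by its new name.
-- Both procedures are owned by _SYS_REPO; the executing user needs the
-- EXECUTE object privilege on them.
CALL "_SYS_REPO"."REVOKE_ACTIVATED_ROLE"('acme.roles::OldRoleName', 'DEV_USER');
CALL "_SYS_REPO"."GRANT_ACTIVATED_ROLE"('acme.roles::NewRoleName', 'DEV_USER');
```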
Procedure
1. Open the Editor tool of the SAP HANA Web-based Development Workbench.
The URL depends on whether you are connecting to a single-container system or to a database in a
multiple-container system.
Note
If the role definition contains syntax errors, it is not possible to open the form-based role editor. You
must edit the role in the text editor instead.
Note
By default, roles and other objects that you can create and edit in the Editor tool are activated on
saving. If you do not want objects to be activated on saving, select the option Enable Inactive Save
in the Editor settings. Once this option is enabled, the Save without Activating (Ctrl+I) button
becomes available.
Results
The authorization of users who have been granted the role changes accordingly.
Related Information
A database role is a collection of privileges that can be granted to either a database user or another role at
runtime.
A role typically contains the privileges required for a particular function or task, for example:
● Business end users reading reports using client tools such as Microsoft Excel
● Modelers creating models and reports
Privileges can be granted directly to users of the SAP HANA database. However, roles are the standard
mechanism of granting privileges as they allow you to implement complex, reusable authorization concepts
that can be modeled on business roles.
Creation of Roles
Roles in the SAP HANA database can exist as runtime objects only (catalog roles), or as design-time objects
that become catalog objects on deployment (database artifact with file suffix .hdbrole).
In an SAP HANA XS classic environment, database roles are created in the built-in repository of the SAP HANA
database using either the SAP HANA Web Workbench or the SAP HANA studio. These are also referred to as
repository roles. In an SAP HANA XS advanced environment, design-time roles are created using the SAP Web
IDE and deployed using SAP HANA deployment infrastructure (SAP HANA DI, or HDI).
Note
Due to the container-based model of HDI where each container corresponds to a database schema, HDI
roles, once deployed, are schema specific.
SAP HANA XS advanced has the additional concept of application roles and role collections. These are
independent of database roles in SAP HANA itself. In the XS advanced context, SAP HANA database roles are
used only to control access to database objects (for example, tables, views, and procedures) for XS advanced
applications.
For more information about the authorization concept of XS advanced, see the SAP HANA Security Guide.
Role Structure
Note
There are no HDI or XS advanced equivalents in the SAP HANA authorization concept for package
privileges on repository packages and applications privileges on SAP HANA XS classic applications. For
more information about the authorization concept of XS advanced, see the SAP HANA Security Guide.
For best performance of role operations, in particular, granting and revoking, keep the following basic rules in
mind:
● Create roles with the smallest possible set of privileges for the smallest possible group of users who can
share a role (principle of least privilege).
● Avoid granting object privileges at the schema level to a role if only a few objects in the schema are relevant
for intended users.
● Avoid creating and maintaining all roles as a single user. Use several role administrator users instead.
Related Information
In an SAP HANA XS classic environment, role developers create database roles as design-time objects in the
built-in repository of the SAP HANA database using either the SAP HANA Web Workbench or the SAP HANA
studio.
Note
SAP HANA XS classic and the SAP HANA repository are deprecated as of SAP HANA 2.0 SPS 02. For more
information, see SAP Note 2465027.
Roles created in the repository differ from roles created directly as runtime objects using SQL in several ways.
According to the authorization concept of the SAP HANA database, a user can only grant a privilege to another
user, directly or indirectly in a role, if the following prerequisites are met:
● The user has the privilege themselves.
● The user is authorized to grant the privilege to others (that is, the privilege was granted WITH ADMIN
OPTION or WITH GRANT OPTION).
A user is also authorized to grant object privileges on objects that they own.
The technical user _SYS_REPO is the owner of all objects in the repository, as well as the runtime objects that
are created on activation. This means that when you create a role as a repository object, you can grant the
following privileges:
● Privileges that have been granted to the technical user _SYS_REPO and that _SYS_REPO can grant further
Note
It is recommended that you use a technical user to do this to ensure that privileges are not dropped when
the granting user is dropped (for example, because the person leaves the company).
SQL object privilege on runtime object (for example, replicated table): grant the privilege to user _SYS_REPO
with WITH GRANT OPTION.
Note
Technically speaking, only the user _SYS_REPO needs the privileges being granted in a role, not the
database user who creates the role. However, users creating roles in the SAP HANA Web-based
Development Workbench must at least be able to select the privileges they want to grant to the role. For
this, they need either the system privilege CATALOG READ or the actual privilege to be granted.
What about the WITH ADMIN OPTION and WITH GRANT OPTION
parameters?
When you create a role using SQL (that is, as a runtime object), you can grant privileges with the additional
parameters WITH ADMIN OPTION or WITH GRANT OPTION. This allows a user who is granted the role to grant
the privileges contained within the role to other users and roles. However, if you are implementing your
authorization concept with privileges encapsulated within roles created in design time, then you do not want
users to grant privileges using SQL statements. For this reason, it is not possible to pass the parameters WITH
ADMIN OPTION or WITH GRANT OPTION with privileges when you model roles as repository objects.
Similarly, when you grant an activated role to a user, it is not possible to allow the user to grant the role further
(WITH ADMIN OPTION is not available).
It is not possible to grant and revoke activated design-time roles using the GRANT and REVOKE SQL
statements. Instead, roles are granted and revoked through the execution of the procedures
GRANT_ACTIVATED_ROLE and REVOKE_ACTIVATED_ROLE. Therefore, to be able to grant or revoke a role, a
user must have the object privilege EXECUTE on these procedures.
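As a sketch, granting a role administrator the required EXECUTE privilege and then granting an activated role could look like this. The user and role names are placeholders:

```sql
-- Allow ROLE_ADMIN_USER to grant and revoke activated repository roles
-- (executed by a user authorized to grant EXECUTE on these procedures).
GRANT EXECUTE ON "_SYS_REPO"."GRANT_ACTIVATED_ROLE" TO ROLE_ADMIN_USER;
GRANT EXECUTE ON "_SYS_REPO"."REVOKE_ACTIVATED_ROLE" TO ROLE_ADMIN_USER;

-- ROLE_ADMIN_USER can now grant the activated role to an end user.
CALL "_SYS_REPO"."GRANT_ACTIVATED_ROLE"('acme.roles::Reporting', 'END_USER');
```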
It is not possible to drop the runtime version of a role created in the repository using the SQL statement DROP
ROLE. To drop a repository role, you must delete it in the repository and activate the change. The activation
process deletes the runtime version of the role.
The auditing feature of the SAP HANA database allows you to monitor and record selected actions performed
in your database system. One action that is typically audited is changes to user authorization. If you are using
roles created in the repository to grant privileges to users, then you audit the creation of runtime roles through
activation with the audit action ACTIVATE REPOSITORY CONTENT.
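A minimal audit policy for this action might be sketched as follows. The policy name is illustrative, and the exact clauses available depend on your SAP HANA revision:

```sql
-- Record every successful activation of repository content.
CREATE AUDIT POLICY "AUDIT_ROLE_ACTIVATION"
  AUDITING SUCCESSFUL ACTIVATE REPOSITORY CONTENT
  LEVEL INFO;

-- Audit policies are created in a disabled state and must be enabled.
ALTER AUDIT POLICY "AUDIT_ROLE_ACTIVATION" ENABLE;
```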
Related Information
The design-time definition of a role is specified in a text file with the extension .hdbrole. Roles are defined
using a domain-specific language (DSL).
Example
role Roles::example_role
role <package_name>::<role_name>
The keywords listed below are used to specify which roles and privileges are granted to the role.
Note
The following general conventions apply when modeling a role definition using the role DSL:
● Comments start with a double-slash (//) or double-dash (--) and run to the end of the line.
● When specifying a reference to a design-time object, you must always specify the package name as
follows:
○ <package>::<object> if you are referencing a design-time role
○ <package>:<object>.<extension> if you are referencing any other design-time object
● When specifying multiple privileges on the same object or the same privilege on multiple objects, you
can do so individually line by line, or you can group them on a single line. Separate multiple objects
and/or multiple privileges using a comma.
extends role
The keyword extends role allows you to include another design-time role in the role. If role A extends role B,
role B is granted to role A. This means that effectively A has all privileges that B has.
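A brief sketch in the role DSL, using illustrative package and role names:

```
role acme.roles::senior_developer
    extends role acme.roles::developer  // all privileges of developer are included
{
    // additional privileges granted only to senior developers
    system privilege: CATALOG READ;
}
```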
system privilege
{
system privilege: BACKUP ADMIN, USER ADMIN;
system privilege: LICENSE ADMIN;
}
The system privilege keyword allows you to grant a system privilege to the role.
For more information about all available system privileges, see System Privileges (Reference).
sql object
{
sql object sap.example:MY_VIEW.attributeview: DROP;
sql object sap.example:MY_PROCEDURE.hdbprocedure: DROP;
sql object sap.example:MY_VIEW.attributeview,
sap.example:MY_OTHER_VIEW.analyticview, sap.example:MY_THIRD_VIEW.analyticview:
SELECT;
}
The sql object keyword allows you to grant an object privilege on a design-time object (table, view,
procedure, sequence) to the role.
Tip
Many object types can be created in the repository. To verify the correct extension, refer to the object file in
the relevant package in the Project Explorer view (SAP HANA studio) or the file explorer (SAP HANA
Web-based Development Workbench).
For more information about all available object privileges and to which object types they apply, see Object
Privileges (Reference).
catalog sql object
{
catalog sql object "MY_SCHEMA"."MY_TABLE": SELECT;
}
The catalog sql object keyword allows you to grant an object privilege on a catalog object (table, view,
procedure, sequence) to the role.
Catalog objects must always be qualified with the schema name. Catalog objects must also be referenced
within double quotes, unlike design-time objects.
Caution
Do not grant object privileges on a catalog object if it was created in design time. If you do, the next time the
design-time object is activated (which results in the creation of a new version of the catalog object), the
privilege on the original catalog object will be removed from the role. Therefore, grant privileges on design-
time objects.
For more information about all available object privileges and to which object types they apply, see Object
Privileges (Reference).
catalog schema
{
catalog schema "MY_SCHEMA": SELECT;
}
The catalog schema keyword allows you to grant a catalog schema to the role.
For more information about the object privileges that apply to schemas, see Object Privileges (Reference).
schema
{
schema sap.example:MY_OTHER_SCHEMA.hdbschema: SELECT;
}
The schema keyword allows you to grant a design-time schema to the role.
Note
You must still use the deprecated extension .schema if you are referring to a repository schema that uses
this extension.
For more information about the object privileges that apply to schemas, see Object Privileges (Reference).
package
{
package sap.example: REPO.READ;
}
The package keyword allows you to grant a package privilege on a design-time package to the role.
For more information about all available package privileges, see Package Privileges.
analytic privilege
{
analytic privilege: sap.example:sp1.analyticprivilege,
sap.example:AP2.analyticprivilege;
}
The analytic privilege keyword allows you to grant a design-time analytic privilege to the role.
catalog analytic privilege
{
catalog analytic privilege: "sp3";
}
The catalog analytic privilege keyword allows you to grant an activated analytic privilege to the role.
application privilege
{
application privilege: sap.example::Execute;
}
The application privilege keyword allows you to grant an application privilege to the role.
Related Information
Create a custom role for developers so that you can grant developers all required privileges quickly and
efficiently.
A role enables you to assign various types of privileges to a user, for example: SQL privileges, analytic
privileges, system privileges, as well as application and package privileges. You can also restrict the type of
privilege, for example, to SELECT, INSERT or UPDATE statements (or any combination of desired statements).
You can use an existing role as the basis for a new, extended, custom role. The privileges granted by an
extended role include all the privileges specified in all the roles that are used as the basis of the extended role
plus any additional privileges defined in the new extended role itself.
Note
It is not possible to restrict the privileges granted by the existing role that you are extending. For example, if
role A extends role B, role A will always include all the privileges specified in role B.
The following example shows how to create a DEVELOPMENT role as a design-time object. Note that a role-
definition file must have the suffix .hdbrole, for example, MyRoleDefinition.hdbrole.
Tip
File extensions are important. If you are using SAP HANA studio to create artifacts in the SAP HANA
repository, the file-creation wizard adds the required file extension automatically and, if appropriate,
enables direct editing of the new file in the corresponding editor.
After activating the design-time role definition, you can grant the resulting runtime role object to application
developers, for example, by executing the _SYS_REPO procedure GRANT_ACTIVATED_ROLE. The call requires
the parameters: ROLENAME (the name of the runtime role object you want to assign) and USERNAME (the name
of the user to whom you want to assign the new runtime role).
call "_SYS_REPO"."GRANT_ACTIVATED_ROLE"
('acme.com.data::MyUserRole','GranteeUserName');
The example role illustrated in this topic defines the following privileges for the SAP HANA application
developer:
● Schema privileges:
○ _SYS_BIC: SELECT and EXECUTE for all tables
● Object privileges:
○ Schema _SYS_BI:
○ SELECT privilege for all BIMC_* tables
○ UPDATE, INSERT, and DELETE privilege for M_* tables
Note
It is also possible to grant application privileges in SAP HANA studio, for example, using the list of
privileges displayed in the Application Privileges tab in the Security [Users | Roles] runtime area.
To grant (or revoke) application privileges, the granting (or revoking) user must also have the object
privilege Execute for the GRANT_APPLICATION_PRIVILEGE or REVOKE_APPLICATION_PRIVILEGE
procedure respectively.
● Additional privileges
User _SYS_REPO requires the SELECT privilege on <schema_where_tables_reside> to enable the
activation and data preview of information views.
Example:
Application-Development Role-Definition Example
role <package_name>::DEVELOPMENT
// extends role com.acme::role1
// extends catalog role "CATROLE1", "CATROLE2"
{
// system privileges
// system privilege: BACKUP ADMIN, USER ADMIN;
// schema privileges
catalog schema "_SYS_BIC": SELECT, EXECUTE;
// sql object privileges
// privileges on the same object may be split up in several lines
catalog sql object "SYS"."REPOSITORY_REST": EXECUTE;
catalog sql object "_SYS_BI"."BIMC_ALL_CUBES": SELECT;
catalog sql object "_SYS_BI"."BIMC_CONFIGURATION": SELECT;
catalog sql object "_SYS_BI"."BIMC_DIMENSIONS": SELECT;
}
Related Information
Several privilege types are used in SAP HANA (system, object, analytic, package, and application).
System privilege
● Applies to: System, database
● Used by: Administrators, developers
System privileges control general system activities. They are mainly used for administrative purposes, such as
creating schemas, creating and changing users and roles, monitoring and tracing.
Object privilege
● Applies to: Database objects (schemas, tables, views, procedures, and so on)
● Used by: End users, technical users
Object privileges are used to allow access to and modification of database objects, such as tables and views.
Depending on the object type, different actions can be authorized (for example, SELECT, CREATE ANY, ALTER,
DROP).
Analytic privilege
● Applies to: Analytic views
● Used by: End users
Analytic privileges are used to allow read access to data in SAP HANA information models (that is, analytic
views, attribute views, and calculation views) depending on certain values or combinations of values. Analytic
privileges are evaluated during query processing.
Package privilege
● Applies to: Packages in the classic repository of the SAP HANA database
● Used by: Application and content developers working in the classic SAP HANA repository
Package privileges are used to allow access to and the ability to work in packages in the classic repository of
the SAP HANA database. Packages contain design-time versions of various objects, such as analytic views,
attribute views, calculation views, and analytic privileges.
Note
With SAP HANA XS advanced, source code and web content are not versioned and stored in the SAP HANA
database, so package privileges are not used in this context. For more information, see Authorization in
SAP HANA XS Advanced.
Package privileges are not relevant in the SAP Cloud Platform, SAP HANA service context as the SAP HANA
repository is not supported.
Application privilege
● Applies to: SAP HANA XS classic applications
● Used by: Application end users, technical users (for SQL connection configurations)
Developers of SAP HANA XS classic applications can create application privileges to authorize user and client
access to their application. They apply in addition to other privileges, for example, object privileges on tables.
Note
With SAP HANA XS advanced, application privileges are not used. Application-level authorization is
implemented using OAuth and authorization scopes and attributes. For more information, see Authorization
in SAP HANA XS Advanced.
Application privileges are not relevant in the SAP Cloud Platform, SAP HANA service context as SAP HANA XS
classic is not supported.
Note
There are no HDI or XS advanced equivalents in the SAP HANA authorization concept for package
privileges on repository packages and applications privileges on SAP HANA XS classic applications. For
more information about the authorization concept of XS advanced, see the SAP HANA Security Guide.
Privileges on Users
An additional privilege type, privileges on users, can be granted to users. Privileges on users are SQL privileges
that users can grant on their user. ATTACH DEBUGGER is the only privilege that can be granted on a user.
For example, User A can grant User B the privilege ATTACH DEBUGGER to allow User B to debug SQLScript
code in User A's session. User A is the only user who can grant this privilege. Note that User B also needs the
object privilege DEBUG on the relevant SQLScript procedure.
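A sketch of this scenario, with illustrative user, schema, and procedure names:

```sql
-- Executed by USER_A: allow USER_B to attach a debugger to USER_A's sessions.
GRANT ATTACH DEBUGGER TO USER_B;

-- USER_B additionally needs the DEBUG object privilege on the procedure,
-- granted by a user authorized to do so.
GRANT DEBUG ON "MY_SCHEMA"."MY_PROCEDURE" TO USER_B;
```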
For more information, see the section on debugging procedures in the SAP HANA Developer Guide.
Related Information
GRANT
Cross-Database Authorization in Tenant Databases
Authorization in SAP HANA XS Advanced
Debug an External Session
System privileges are mainly used to authorize users to perform administrative actions, including:
System privileges are also used to authorize basic repository operations, for example:
System privileges granted to users in a particular database authorize operations in that database only. The only
exceptions are the system privileges DATABASE ADMIN, DATABASE START, DATABASE STOP, and DATABASE
AUDIT ADMIN. These system privileges can only be granted to users of the system database. They authorize
the execution of operations on individual tenant databases. For example, a user with DATABASE ADMIN can
create and drop tenant databases, change the database-specific properties in configuration (*.ini) files, and
perform database-specific or full-system data backups.
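For example, a system database user with DATABASE ADMIN might manage tenants roughly as follows. The database name and password are placeholders:

```sql
-- Executed in the system database (SYSTEMDB) of a multiple-container system.
CREATE DATABASE MYTENANT SYSTEM USER PASSWORD Init1234;

ALTER SYSTEM START DATABASE MYTENANT;
ALTER SYSTEM STOP DATABASE MYTENANT;

DROP DATABASE MYTENANT;
```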
Related Information
System privileges authorize administrative tasks. The following table describes the supported system privileges
in an SAP HANA database.
ALTER CLIENTSIDE ENCRYPTION KEYPAIR: Authorizes a user to add a new version of a client-side encryption
key pair (CKP), or to drop all older versions of the CKP.
ATTACH DEBUGGER: Authorizes debugging across different user sessions. For example, userA can grant
ATTACH DEBUGGER to userB to allow userB to debug a procedure in userA's session (userB still needs the
DEBUG privilege on the procedure, however).
CATALOG READ: Authorizes unfiltered access to the data in the system views that a user has already been
granted the SELECT privilege on. Normally, the content of these views is filtered based on the privileges of the
user. CATALOG READ does not allow a user to view system views on which they have not been granted the
SELECT privilege.
CLIENT PARAMETER ADMIN: Authorizes a user to override the value of the CLIENT parameter for a database
connection or to overwrite the value of the $$client$$ parameter in an SQL query.
CREATE CLIENTSIDE ENCRYPTION KEYPAIR: Authorizes a user to create client-side encryption key pairs.
CREATE REMOTE SOURCE: Authorizes the creation of remote data sources using the CREATE REMOTE
SOURCE statement.
CREATE SCHEMA: Authorizes the creation of database schemas using the CREATE SCHEMA statement.
DATA ADMIN: Authorizes reading all data in the system views. It also enables execution of Data Definition
Language (DDL) statements in the SAP HANA database.
DATABASE AUDIT ADMIN: Authorizes a user to create, alter, drop, and select audit policies from SYSTEMDB
for a specific tenant.
DATABASE START: Authorizes a user to start any database in the system and to select from the M_DATABASES
view.
DATABASE STOP: Authorizes a user to stop any database in the system and to select from the M_DATABASES
view.
DROP CLIENTSIDE ENCRYPTION KEYPAIR: Authorizes a user to drop other users' client-side encryption key
pairs.
ENCRYPTION ROOT KEY ADMIN: Authorizes all statements related to the management of root keys.
EXTENDED STORAGE ADMIN: Authorizes the management of SAP HANA dynamic tiering and the creation of
extended storage.
IMPORT: Authorizes the import activity in the database using the IMPORT statements. The user must also have
the INSERT privilege on the target tables to be imported.
LDAP ADMIN: Authorizes the use of the CREATE | ALTER | DROP | VALIDATE LDAP PROVIDER statements.
LICENSE ADMIN: Authorizes the use of the SET SYSTEM LICENSE statement to install a new license.
LOG ADMIN: Authorizes the use of the ALTER SYSTEM LOGGING [ON | OFF] statements to enable or disable
the log flush mechanism.
MONITOR ADMIN: Authorizes the use of the ALTER SYSTEM statements for events.
OPTIMIZER ADMIN: Authorizes the use of the ALTER SYSTEM statements concerning SQL PLAN CACHE and
ALTER SYSTEM UPDATE STATISTICS statements, which influence the behavior of the query optimizer.
ROLE ADMIN: Authorizes the creation and deletion of roles using the CREATE ROLE and DROP ROLE
statements. It also authorizes the granting and revoking of roles using the GRANT and REVOKE statements.
TABLE ADMIN: Authorizes LOAD, UNLOAD, and MERGE of tables and table placement.
TRUST ADMIN: Authorizes the use of statements to update the trust store.
VERSION ADMIN: Authorizes the use of the ALTER SYSTEM RECLAIM VERSION SPACE statement of the
multi-version concurrency control (MVCC) feature.
WORKLOAD ANALYZE ADMIN: Used by the Analyze Workload, Capture Workload, and Replay Workload
applications when performing workload analysis.
<identifier>.<identifier>: Components of the SAP HANA database can create new system privileges. These
privileges use the component name as the first identifier of the system privilege and the component privilege
name as the second identifier.
Note
The following privileges authorize actions on individual packages in the SAP HANA repository, used in the
SAP HANA Extended Services (SAP HANA XS) classic development model. With SAP HANA XS advanced,
source code and web content are no longer versioned and stored in the repository of the SAP HANA
database.
REPO.MAINTAIN_DELIVERY_UNITS: Authorizes the maintenance of delivery units (DU; the DU vendor and
system vendor must be the same).
REPO.CONFIGURE, REPO.MODIFY_CHANGE, REPO.MODIFY_OWN_CONTRIBUTION,
REPO.MODIFY_FOREIGN_CONTRIBUTION: Authorize work with SAP HANA Change Recording, which is part of
SAP HANA Application Lifecycle Management.
Related Information
GRANT
Developer Authorization in the Repository
Object privileges are SQL privileges that are used to allow access to and modification of database objects.
For each SQL statement type (for example, SELECT, UPDATE, or CALL), a corresponding object privilege exists.
If a user wants to execute a particular statement on a simple database object (for example, a table), he or she
must have the corresponding object privilege for either the actual object itself, or the schema in which the
object is located. This is because the schema is an object type that contains other objects. A user who has
object privileges for a schema automatically has the same privileges for all objects currently in the schema and
any objects created there in the future.
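As a sketch, using placeholder schema, table, and user names, granting at object versus schema level looks like this:

```sql
-- Privilege on a single object: REPORT_USER can read only this table.
GRANT SELECT ON "SALES"."ORDERS" TO REPORT_USER;

-- Privilege at schema level: REPORT_USER can read every table and view
-- currently in the schema and any created there in the future.
GRANT SELECT ON SCHEMA "SALES" TO REPORT_USER;
```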
Object privileges are not only grantable for database catalog objects such as tables, views and procedures.
Object privileges can also be granted for non-catalog objects such as development objects in the repository of
the SAP HANA database.
Initially, the owner of an object and the owner of the schema in which the object is located are the only users
who can access the object and grant object privileges on it to other users.
Caution
The database owner concept stipulates that when a database user is deleted, all objects created by that
user and privileges granted to others by that user are also deleted. If the owner of a schema is deleted, all
objects in the schema are also deleted even if they are owned by a different user. All privileges on these
objects are also deleted.
The owner of a table can change its ownership with the ALTER TABLE SQL statement. In this case, the new
owner becomes the grantor of all privileges on the table granted by the original owner. The original owner is
also automatically granted all privileges for the table with the new owner as grantor. This ensures that the
original owner can continue to work with the table as before.
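The ownership change described above might be sketched as follows. The names are placeholders; check the ALTER TABLE documentation for your SAP HANA version for the exact clause:

```sql
-- Executed by the current owner: transfer ownership of the table.
ALTER TABLE "SALES"."ORDERS" OWNER TO NEW_OWNER;
```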
The authorization check for objects defined on other objects (that is, stored procedures and views) is more
complex. In order to be able to access an object with dependencies, both of the following conditions must be
met:
● The user trying to access the object must have the relevant object privilege on the object as described
above.
● The user who created the object must have the required privilege on all underlying objects and be
authorized to grant this privilege to others.
If this second condition is not met, only the owner of the object can access it, and they cannot grant privileges on it to any other user. This cannot be circumvented by granting privileges on the parent schema instead. Even a user with privileges on the schema will not be able to access the object.
Note
This applies to procedures created in DEFINER mode only. This means that the authorization check is run
against the privileges of the user who created the object, not the user accessing the object. For procedures
created in INVOKER mode, the authorization check is run against the privileges of the accessing user. In
this case, the user must have privileges not only on the object itself but on all objects that it uses.
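The difference can be made explicit when creating a procedure, since SQLScript lets you choose the security mode. A sketch with illustrative names:

```sql
-- DEFINER: the authorization check runs against the creating user's privileges
CREATE PROCEDURE READ_SALES_DEFINER
  LANGUAGE SQLSCRIPT SQL SECURITY DEFINER READS SQL DATA AS
BEGIN
  SELECT * FROM "SALES_DATA"."ORDERS";
END;

-- INVOKER: the calling user needs privileges on the procedure AND on ORDERS
CREATE PROCEDURE READ_SALES_INVOKER
  LANGUAGE SQLSCRIPT SQL SECURITY INVOKER READS SQL DATA AS
BEGIN
  SELECT * FROM "SALES_DATA"."ORDERS";
END;
```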
Tip
The SAP HANA studio provides a graphical feature, the authorization dependency viewer, to help
troubleshoot authorization errors for object types that typically have complex dependency structures:
stored procedures and calculation views.
Related Information
Object privileges are used to allow access to and modification of database objects, such as tables and views.
The following table describes the supported object privileges in an SAP HANA database.
● REMOTE TABLE ADMIN (DDL): applies to remote sources; authorizes the creation of tables on a remote source object.
Related Information
GRANT
Analytic privileges grant different users access to different portions of data in the same view based on their business role. Within the definition of an analytic privilege, the conditions that control which data users see are either contained in an XML document or defined using SQL.
Standard object privileges (SELECT, ALTER, DROP, and so on) implement coarse-grained authorization at
object level only. Users either have access to an object, such as a table, view or procedure, or they don't. While
this is often sufficient, there are cases when access to data in an object depends on certain values or
combinations of values. Analytic privileges are used in the SAP HANA database to provide such fine-grained
control at row level of which data individual users can see within the same view.
Example
Sales data for all regions are contained within one analytic view. However, regional sales managers should
only see the data for their region. In this case, an analytic privilege could be modeled so that they can all
query the view, but only the data that each user is authorized to see is returned.
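Expressed as an SQL-based analytic privilege, such a regional restriction might look like the following sketch (the view, privilege, and region names are illustrative):

```sql
-- Row-level filter: holders of this privilege see only EMEA rows of the view
CREATE STRUCTURED PRIVILEGE AP_SALES_EMEA
  FOR SELECT ON "SALES_DATA"."SALES_VIEW"
  WHERE "REGION" = 'EMEA';
```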
Although analytic privileges can be created directly as catalog objects in runtime, we recommend creating
them as design-time objects that become catalog objects on deployment (database artifact with file
suffix .hdbanalyticprivilege).
In an SAP HANA XS classic environment, analytic privileges are created in the built-in repository of the SAP
HANA database using either the SAP HANA Web Workbench or the SAP HANA studio. In an SAP HANA XS
advanced environment, they are created using the SAP Web IDE and deployed using the SAP HANA
deployment infrastructure (SAP HANA DI).
Note
HDI supports only SQL-based analytic privileges (see below). Furthermore, due to the container-based
model of HDI, where each container corresponds to a database schema, analytic privileges created in HDI
are schema specific.
Before you implement row-level authorization using analytic privileges, you need to decide which type of
analytic privilege is suitable for your scenario. In general, SQL-based analytic privileges allow you to more easily
formulate complex filter conditions using sub-queries that might be cumbersome to model using XML-based
analytic privileges.
Recommendation
SAP recommends the use of SQL-based analytic privileges. Using the SAP HANA Modeler perspective of
the SAP HANA studio, you can migrate XML-based analytic privileges to SQL-based analytic privileges. For
more information, see the SAP HANA Modeling Guide (For SAP HANA Studio).
As objects created in the repository, XML-based analytic privileges are deprecated as of SAP HANA SPS
02. For more information, see SAP Note 2465027.
The following are the main differences between XML-based and SQL-based analytic privileges:
● Both types can be used to secure the following views:
○ Attribute views
○ Analytic views
○ Calculation views
● Design-time modeling in the Editor tool of the SAP HANA Web Workbench is supported for both types.
Note
This corresponds to development in an SAP HANA XS classic environment using the SAP HANA repository.
● Design-time modeling using the SAP Web IDE for SAP HANA is supported for SQL-based analytic privileges only.
Note
This corresponds to development in an SAP HANA XS advanced environment using HDI.
All column views modeled and activated in the SAP HANA modeler and the SAP HANA Web-based
Development Workbench automatically enforce an authorization check based on analytic privileges. XML-
based analytic privileges are selected by default, but you can switch to SQL-based analytic privileges.
Column views created using SQL must be explicitly registered for such a check by passing the relevant parameter.
Note
It is not possible to enforce an authorization check on the same view using both XML-based and SQL-based
analytic privileges. However, it is possible to build views with different authorization checks on each other.
Related Information
Create Static SQL Analytic Privileges (SAP Web IDE for SAP HANA) [page 522]
Create Dynamic SQL Analytic Privileges (SAP Web IDE for SAP HANA)
Create Analytic Privileges Using SQL Expressions (SAP Web IDE for SAP HANA)
Create Classical XML-Based Analytic Privileges (SAP HANA Web Workbench) [page 524]
Create Static SQL Analytic Privileges (SAP HANA Web Workbench) [page 522]
Create Classical XML-based Analytic Privileges (SAP HANA Studio)
Create SQL Analytic Privileges (SAP HANA Studio)
Convert Classical XML-based Analytic Privileges to SQL-based Analytic Privileges (SAP HANA Studio)
SAP Note 2465027
Package privileges authorize actions on individual packages in the SAP HANA repository.
Privileges granted on a repository package are implicitly assigned to the design-time objects in the package, as
well as to all sub-packages. Users are only allowed to maintain objects in a repository package if they have the
necessary privileges for the package in which they want to perform an operation, for example to read or write
to an object in that package. To be able to perform operations in all packages, a user must have privileges on the root package .REPO_PACKAGE_ROOT.
If the user authorization check establishes that a user does not have the necessary privileges to perform the
requested operation in a specific package, the authorization check is repeated on the parent package and
recursively up the package hierarchy to the root level of the repository. If the user does not have the necessary
privileges for any of the packages in the hierarchy chain, the authorization check fails and the user is not
permitted to perform the requested operation.
In the context of repository package authorizations, there is a distinction between native packages and
imported packages.
● Native package
A package that is created in the current system and expected to be edited in the current system. Changes
to packages or to objects the packages contain must be performed in the original development system
where they were created and transported into subsequent systems. The content of native packages is
regularly edited by developers.
● Imported package
A package that is created in a remote system and imported into the current system. Imported packages should not usually be modified, except when they are replaced by new imports during an update.
Note
The SAP HANA administrator can grant the following package privileges to an SAP HANA user: edit,
activate, and maintain.
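Granted via SQL, such package privileges look like the following sketch (the package and user names are illustrative):

```sql
-- Read, edit, and activate design-time objects in a native package
GRANT REPO.READ ON "acme.com.hana.xs.app1" TO APP_DEVELOPER;
GRANT REPO.EDIT_NATIVE_OBJECTS ON "acme.com.hana.xs.app1" TO APP_DEVELOPER;
GRANT REPO.ACTIVATE_NATIVE_OBJECTS ON "acme.com.hana.xs.app1" TO APP_DEVELOPER;
```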
Related Information
In SAP HANA Extended Application Services (SAP HANA XS), application privileges define the authorization
level required for access to an SAP HANA XS application, for example, to start the application or view particular
functions and screens.
Application privileges can be assigned to an individual user or to a group of users, for example, in a user role.
The user role can also be used to assign system, object, package, and analytic privileges, as illustrated in the
following graphic. You can use application privileges to provide different levels of access to the same
application, for example, to provide advanced maintenance functions for administrators and view-only
capabilities to normal users.
If you want to define application-specific privileges, you need to understand and maintain the relevant sections
in the following design-time artifacts:
Application privileges can be assigned to users individually or by means of a user role, for example, with the
“application privilege” keyword in a role-definition file (<RoleName>.hdbrole), as illustrated in the following example:
role acme.com.hana.xs.app1.roles::Display
{
application privilege: acme.com.hana.xs.appl::Display;
application privilege: acme.com.hana.xs.appl::View;
catalog schema "ACME_XS_APP1": SELECT;
package acme.com.hana.xs.app1: REPO.READ;
package ".REPO_PACKAGE_ROOT" : REPO.READ;
catalog sql object "_SYS_REPO"."PRODUCTS": SELECT;
catalog sql object "_SYS_REPO"."PRODUCT_INSTANCES": SELECT;
catalog sql object "_SYS_REPO"."DELIVERY_UNITS": SELECT;
catalog sql object "_SYS_REPO"."PACKAGE_CATALOG": SELECT;
catalog sql object "ACME_XS_APPL"."acme.com.hana.xs.appl.db::SYSTEM_STATE":
SELECT, INSERT, UPDATE, DELETE;
}
The application privileges referenced in the role definition (for example, Display and View) are actually
defined in an application-specific .xsprivileges file, as illustrated in the following example, which also
contains entries for additional privileges that are not explained here.
Note
The .xsprivileges file must reside in the package of the application to which the privileges apply.
The package where the .xsprivileges file resides defines the scope of the application privileges; the privileges specified in the .xsprivileges file can only be used in the package where the file resides (or any of its sub-packages). This is checked during activation of the .xsaccess file and at runtime by the XS JavaScript API $.session.(has|assert)AppPrivilege().
{
"privileges" : [
{ "name" : "View", "description" : "View Product Details" },
{ "name" : "Configure", "description" : "Configure Product Details" },
{ "name" : "Display", "description" : "View Transport Details" },
{ "name" : "Administrator", "description" : "Configure/Run Everything" },
{ "name" : "ExecuteTransport", "description" : "Run Transports"},
{ "name" : "Transport", "description" : "Transports"}
]
}
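In XS JavaScript, the runtime check named in the note above might be used as in the following sketch (the privilege name comes from the example role; the response handling is illustrative):

```javascript
// Reject the request if the caller lacks the required application privilege.
// $.session.hasAppPrivilege() is the XS classic API mentioned above.
if (!$.session.hasAppPrivilege("acme.com.hana.xs.appl::Display")) {
    $.response.status = $.net.http.FORBIDDEN;
    $.response.setBody("Not authorized");
} else {
    $.response.status = $.net.http.OK;
    $.response.setBody("Authorized");
}
```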
The privileges are authorized for use with an application by inserting the authorization keyword into the
corresponding .xsaccess file, as illustrated in the following example. Like the .xsprivileges file,
the .xsaccess file must reside either in the root package of the application to which the privilege
authorizations apply or the specific subpackage which requires the specified authorizations.
Note
If a privilege is inserted into the .xsaccess file as an authorization requirement, a user must have this
privilege to access the application package where the .xsaccess file resides. If there is more than one
privilege, the user must have at least one of these privileges to access the content of the package.
{
"prevent_xsrf": true,
"exposed": true,
"authentication": {
"method": "Form"
},
"authorization": [
"acme.com.hana.xs.appl::Display",
"acme.com.hana.xs.appl::Transport"
]
}
Related Information
This report allows you to check which application files are accessible to users at runtime based on a specific
role.
Prerequisites
● You are assigned the role sap.hana.ide.roles::SecurityTester (needed to use the Check File Access report).
● You have the system privilege USER ADMIN (needed to create and delete users).
● You have the EXECUTE privilege on the procedure GRANT_ACTIVATED_ROLE (needed to assign roles to users).
Context
In an SAP HANA XS application, files are organized in a package hierarchy. In order to access these files,
application privileges are assigned to users and roles, and the application privileges needed to access each
package are specified in the corresponding .xsaccess files.
This report allows you to check that the correct files are accessible through the roles defined in an application.
In order to do this, a temporary user is created for each selected role and an HTTP request is then generated
and sent to access each file contained in the application.
Files for which an HTTP status code 4xx is returned are not accessible. The results pane lists all application files
according to their HTTP status code.
Note that this report is available in the SAP HANA Web-based Development Workbench only.
1. In the application package hierarchy, select the .hdbrole file and from the context menu choose Check File
Access.
You are prompted to confirm whether or not you want to execute the XSJS files contained in the
application when running the report.
Caution
Bear in mind that when XSJS files are executed they might change underlying data. This could
potentially be dangerous if there is a function, for example, that deletes data or even entire tables.
You can create analytic privileges based on either an XML document (the "classic" variation) or an SQL
definition.
Related Information
To create static SQL analytic privileges, you use attribute columns from views to define fixed restrictions on data access. These restrictions are defined in the analytic privilege editor at design time.
Prerequisites
1. If you want to use a SQL analytic privilege to apply data access restrictions on calculation views, set the
Apply Privileges property for the calculation view to SQL Analytic Privileges.
1. Open the calculation view in the view editor.
2. Select the Semantics node.
Procedure
Currently, there is only one type of project template available, namely: Multi-Target Application Project.
Select Multi-Target Application Project and choose Next.
c. Type a name for the new MTA project (for example, myApp) and choose Next to confirm.
d. Specify details of the new MTA project and choose Next to confirm.
e. Create the new MTA project; choose Finish.
3. Select the SAP HANA Database Module in which you want to create the analytic privilege.
4. Browse to the src folder.
If you want to create an analytic privilege and apply the data access restrictions to a selected list of models, add the required models in the Secured Models section.
Note
You can only add calculation views and CDS views to the secured models list.
c. Choose Finish.
9. Define the validity.
In the Attribute pane, expand the PRIVILEGE VALIDITY section to specify the time period for which the
analytic privilege is valid. You can specify multiple time periods for which the analytic privilege is valid.
a. In the PRIVILEGE VALIDITY, choose + (Add).
b. In the Operator dropdown list, select the required operator.
c. Based on the selected operator, specify the time period (From and To) for which the analytic privilege
is valid.
10. Define the attribute restrictions.
The tool uses the restrictions defined on the attributes to restrict data access. Each attribute restriction is
associated with only one attribute, but can contain multiple value filters. You can define more than one
attribute restriction.
a. In the Associated Attribute Restrictions section, choose + (Add).
b. In the Attributes dropdown list, select the required attribute.
For example, if you have enabled SQL access to shared hierarchies and if SalesRepHierarchyNode is the
node column that the tool generates for a parent-child hierarchy, then "SalesRepHierarchyNode" =
"MAJESTIX" is a possible filter expression in analytic privileges.
Note
You can create hierarchical analytic privileges only for the following conditions:
○ All models in the Secured Models are star join calculation views with shared dimensions.
○ You have enabled SQL access to the shared hierarchies in star join calculation views.
Create analytic privileges for information views and assign them to different users to provide selective access based on certain combinations of data.
Prerequisites
If you want to use a classical XML-based analytic privilege to apply data access restrictions on information
views, set the Apply Privileges property for the information view to Classical Analytic Privileges.
Context
Analytic privileges help restrict data access to information views based on attributes or procedures. You can
create and apply analytic privileges for a selected group of models or apply them to all models across
packages.
After you create analytic privileges, assign them to users. This restricts users to accessing data only for certain combinations of dimension attributes.
Procedure
Use attributes from the secured models to define data access restrictions.
a. In the Associated Attributes Restrictions section, choose Add.
b. In the Attributes dialog, select the attributes.
Note
Select a model if you want to use all attributes from the model to define restrictions.
c. Choose OK.
Modeler uses the restrictions defined on the attributes to restrict data access. Each attribute restriction is associated with only one attribute, but can contain multiple value filters. You can create more than one attribute restriction.
e. In the Restriction Type dropdown list, select a restriction type.
f. Select the required operator and provide a value using the value help.
g. For a catalog procedure or repository procedure, you can also provide values using the syntax <schema name>.<procedure name> or <package name>::<procedure name> respectively.
10. Activate analytic privileges.
a. If you want to activate the analytic privilege, in the menu bar, choose Save.
b. If you want to activate the analytic privilege along with all objects, in the menu bar, choose Save All.
Note
Activate the analytic privilege only if you have defined at least one restriction on attributes in the
Associated Attributes Restrictions section.
If you want to assign privileges to an authorization role, execute the following steps:
This opens a new tab in the browser where you can assign the analytic privileges to users.
c. Expand Users.
d. Select a user.
e. In the Analytic Privileges tab page, choose the add icon to add the privilege.
f. In the editor toolbar, choose Activate.
Related Information
Analytic privileges grant different users access to different portions of data in the same view based on their business role. Within the definition of an analytic privilege, the conditions that control which data users see are either contained in an XML document or defined using SQL.
Standard object privileges (SELECT, ALTER, DROP, and so on) implement coarse-grained authorization at
object level only. Users either have access to an object, such as a table, view or procedure, or they don't. While
this is often sufficient, there are cases when access to data in an object depends on certain values or
combinations of values. Analytic privileges are used in the SAP HANA database to provide such fine-grained
control at row level of which data individual users can see within the same view.
Example
Sales data for all regions are contained within one analytic view. However, regional sales managers should
only see the data for their region. In this case, an analytic privilege could be modeled so that they can all
query the view, but only the data that each user is authorized to see is returned.
Before you implement row-level authorization using analytic privileges, you need to decide which type of
analytic privilege is suitable for your scenario. In general, SQL-based analytic privileges allow you to more easily
formulate complex filter conditions that might be cumbersome to model using XML-based analytic privileges.
The following are the main differences between XML-based and SQL-based analytic privileges:
● Both types can be used to secure attribute views, analytic views, and calculation views.
● Design-time modeling in the Editor tool of the SAP HANA Web Workbench is supported for both types.
All column views modeled and activated in the SAP HANA modeler and the SAP HANA Web-based
Development Workbench automatically enforce an authorization check based on analytic privileges. XML-
based analytic privileges are selected by default, but you can switch to SQL-based analytic privileges.
Column views created using SQL must be explicitly registered for such a check by passing the relevant parameter. SQL views must always be explicitly registered for an authorization check based on analytic privileges by passing the STRUCTURED PRIVILEGE CHECK parameter.
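For an SQL view, the registration might look like the following sketch (the schema, view, and table names are illustrative):

```sql
-- Register the view for a row-level check against SQL-based analytic privileges
CREATE VIEW "SALES_DATA"."SALES_V" AS
  SELECT * FROM "SALES_DATA"."ORDERS"
  WITH STRUCTURED PRIVILEGE CHECK;
```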
Note
It is not possible to enforce an authorization check on the same view using both XML-based and SQL-based
analytic privileges. However, it is possible to build views with different authorization checks on each other.
An analytic privilege consists of a set of restrictions against which user access to a particular attribute view,
analytic view, or calculation view is verified. In an XML-based analytic privilege, these restrictions are specified
in an XML document that conforms to a defined XML schema definition (XSD).
Note
As objects created in the repository, XML-based analytic privileges are deprecated as of SAP HANA SPS
02. For more information, see SAP Note 2465027.
Each restriction in an XML-based analytic privilege controls the authorization check on the restricted view
using a set of value filters. A value filter defines a check condition that verifies whether or not the values of the
view (or view columns) qualify for user access.
An XML-based analytic privilege can contain the following types of restrictions:
● View
● Activity
● Validity
● Attribute
The following operators can be used to define value filters in the restrictions.
Note
The activity and validity restrictions support only a subset of these operators.
All of the above operators, except IS_NULL and NOT_NULL, accept the empty string ("") as a filter operand. IS_NULL and NOT_NULL do not allow any input value.
The following are examples of how empty strings can be used with the filter operators:
● For the IN operator: IN ("", "A", "B") to filter on these exact values
● As a lower limit in comparison operators, such as:
○ BT ("", "XYZ"), which is equivalent to NOT_NULL AND LE "XYZ"
○ GT "", which is equivalent to NOT_NULL
○ LE "", which is equivalent to EQ ""
○ LT "", which will always return false
○ CP "", which is equivalent to EQ ""
The filter condition CP "*" will also return rows with an empty string as the value in the corresponding attribute.
View Restriction
This restriction specifies to which column views the analytic privilege applies. It can be a single view, a list of
views, or all views. An analytic privilege must have exactly one cube restriction.
Example
IN ("Cube1", "Cube2")
Note
When an analytic view is created in the SAP HANA modeler, automatically generated views are also included in the cube restriction.
Note
The SAP HANA modeler uses a special syntax to specify the cube names in the view restriction:
_SYS_BIC:<package_hierarchy>/<view_name>
For example:
<cubes>
<cube name="_SYS_BIC:test.sales/AN_SALES" />
<cube name="_SYS_BIC:test.sales/AN_SALES/olap" />
</cubes>
Activity Restriction
This restriction specifies the activities that the user is allowed to perform on the restricted views, for example,
read data. An analytic privilege must have exactly one activity restriction.
Example
EQ "read", or EQ "edit"
Note
Currently, all analytic privileges created in the SAP HANA modeler are automatically configured to restrict
access to READ activity only. This corresponds to SQL SELECT queries. This is due to the fact that the
attribute, analytic, and calculation views are read-only views. This restriction is therefore not configurable.
Validity Restriction
This restriction specifies the validity period of the analytic privilege. An analytic privilege must have exactly one
validity restriction.
Example
GT 2010/10/01 01:01:00.000
Attribute Restriction
This restriction specifies the value range that the user is permitted to access. Attribute restrictions are applied
to the actual attributes of a view. Each attribute restriction is relevant for one attribute, which can contain
multiple value filters. Each value filter represents a logical filter condition.
Note
The SAP HANA modeler uses different ways to specify attribute names in the attribute restriction
depending on the type of view providing the attribute. In particular, attributes from attribute views are
specified using the syntax "<package_hierarchy>/<view_name>$<attribute_name>", while local
attributes of analytic views and calculation views are specified using their attribute name only. For example:
<dimensionAttribute name="test.sales/AT_PRODUCT$PRODUCT_NAME">
<restrictions>
<valueFilter operator="IN">
<value value="Car" />
<value value="Bike" />
</valueFilter>
</restrictions>
</dimensionAttribute>
● A static value filter consists of an operator and either a list of values or a single value as the filter operand. All data types are supported except the LOB data types (CLOB, BLOB, and NCLOB).
For example, a value filter (EQ 2006) can be defined for an attribute YEAR in a dimension restriction to
filter accessible data using the condition YEAR=2006 for potential users.
Note
Only attributes, not aggregatable facts (for example, measures or key figures) can be used in
dimension restrictions for analytic views.
● A dynamic value filter consists of an operator and a stored procedure call that determines the operand
value at runtime.
For example, a value filter (IN (GET_MATERIAL_NUMBER_FOR_CURRENT_USER())) is defined for the
attribute MATERIAL_NUMBER. This filter indicates that a user with this analytic privilege is only allowed to
access material data with the numbers returned by the procedure
GET_MATERIAL_NUMBER_FOR_CURRENT_USER.
It is possible to combine static and dynamic value filters in the same attribute restriction.
An analytic privilege can have multiple attribute restrictions, but it must have at least one attribute restriction.
An attribute restriction must have at least one value filter. Therefore, if you want to permit access to the whole
content of a restricted view, then the attribute restriction must specify all attributes.
Similarly, if you want to permit access to the whole content of the view with the corresponding attribute, then
the value filter must specify all values.
The SAP HANA modeler automatically implements these two cases if you do not select either an attribute
restriction or a value filter.
Example
<dimensionAttributes>
<allDimensionAttributes/>
</dimensionAttributes>
Example
<dimensionAttributes>
<dimensionAttribute name="PRODUCT">
<all />
</dimensionAttribute>
</dimensionAttributes>
The result of user queries on restricted views is filtered according to the conditions specified by the analytic
privileges granted to the user as follows:
● Multiple analytic privileges are combined with the logical operator OR.
● Within one analytic privilege, all attribute restrictions are combined with the logical operator AND.
● Within one attribute restriction, all value filters on the attribute are combined with the logical operator OR.
You create two analytic privileges AP1 and AP2. AP1 has the following attribute restrictions:
● Restriction R11 restricting the attribute Year with the value filters (EQ 2006) and (BT 2008, 2010)
● Restriction R12 restricting the attribute Country with the value filter (IN ("USA", "Germany"))
Given that multiple value filters are combined with the logical operator OR and multiple attribute restrictions
are combined with the logical operator AND, AP1 generates the condition:
((Year = 2006) OR (Year BT 2008 and 2010)) AND (Country IN ("USA", "Germany"))
AP2 has the following attribute restriction:
● Restriction R21 restricting the attribute Country with the value filter (EQ "France")
AP2 therefore generates the condition:
(Country = "France")
Any query of a user who has been granted both AP1 and AP2 will therefore be appended with the following
WHERE clause:
(((Year = 2006) OR (Year BT 2008 and 2010)) AND (Country IN ("USA", "Germany"))) OR
(Country = "France")
Related Information
The attribute restriction of an XML-based analytic privilege specifies the value range that the user is permitted
to access using value filters. In addition to static scalar values, stored procedures can be used to define filters.
By using stored procedures to define filters, you can have user-specific filter conditions determined dynamically at runtime, for example, by querying specified tables or views. As a result, the same analytic
privilege can be applied to many users, while the filter values for authorization can be updated and changed
independently in the relevant database tables. In addition, application developers have full control not only to
design and manage such filter conditions, but also to design the logic for obtaining the relevant filter values for
the individual user at runtime.
Procedures used to define filter conditions must meet certain requirements; in particular, they must be read-only and must return the filter values in a table-type output parameter.
In static value filters, it is not possible to specify NULL as the operand of the operator. The operators IS_NULL
or NOT_NULL must be used instead. In dynamic value filters where a procedure is used to determine a filter
condition, NULL or valid values may be returned. The following behavior applies in the evaluation of such cases
during the authorization check of a user query:
Filter conditions of operators with NULL as the operand are disregarded.
If no valid filter conditions remain (that is, they have all been disregarded because they contain the NULL
operand), the user query is rejected with a “Not authorized” error.
Example
Dynamic analytic privilege 1 generates the filter condition (Year >= NULL) and dynamic analytic privilege 2
generates the condition (Country EQ NULL). The query of a user assigned these analytic privileges
(combined with the logical operator OR) will return a “Not authorized” error.
Example
Dynamic analytic privilege 1 generates the filter condition (Year >= NULL) and dynamic analytic privilege 2 generates the condition (Country EQ NULL AND Currency = "USD"). The query of a user assigned these analytic privileges (combined with the logical operator OR) will be filtered with the condition Currency = "USD".
In addition, a user query is not authorized in the following cases even if further applicable analytic privileges
have been granted to the user.
● The BT operator has as input operands a valid scalar value and NULL, for example, BT 2002 and NULL or
BT NULL and 2002
● The IN operator has as input operand NULL among the value list, for example, IN (12, 13, NULL)
If you want to allow the user to see all the values of a particular attribute, instead of filtering for certain values, the procedure must return "*" or "" (empty string) as the operand for the CP or GT operator respectively. These are the only operators that support the specification of all values.
Implementation Considerations
When the procedure is executed as part of the authorization check in runtime, note the following:
● The user who must be authorized is the database user who executes the query accessing a secured view.
This is the session user. The database table or view used in the procedure must therefore contain a column
to store the user name of the session user. The procedure can then filter by this column using the SQL
function SESSION_USER. This table or view should only be accessible to the procedure owner.
Caution
Do not map the executing user to the application user. The application user is unreliable because it is
controlled by the client application. For example, it may set the application user to a technical user or it
may not set it at all. In addition, the trustworthiness of the client application cannot be guaranteed.
● The user executing the procedure is the _SYS_REPO user. In the case of procedures activated in the SAP
HANA modeler, _SYS_REPO is the owner of the procedures. For procedures created in SQL, the EXECUTE
privilege on the procedure must be granted to the _SYS_REPO user.
● If the procedure fails to execute, the user’s query stops processing and a “Not authorized” error is
returned. The root cause can be investigated in the error trace file of the indexserver,
indexserver_alert_<host>.trc.
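The pattern described above can be sketched in SQLScript as follows (the type, table, and procedure names are illustrative):

```sql
-- Table type for the procedure's single output parameter
CREATE TYPE TT_MATERIALS AS TABLE (MATERIAL_NUMBER NVARCHAR(10));

-- Read-only filter procedure for a dynamic analytic privilege
CREATE PROCEDURE GET_MATERIAL_NUMBER_FOR_CURRENT_USER (OUT VAL TT_MATERIALS)
  LANGUAGE SQLSCRIPT
  SQL SECURITY DEFINER
  READS SQL DATA AS
BEGIN
  -- SESSION_USER is the database user executing the query on the secured view
  VAL = SELECT MATERIAL_NUMBER
          FROM MATERIAL_AUTHORIZATION
         WHERE USER_NAME = SESSION_USER;
END;
```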
When designing and implementing procedures as filters for dynamic analytic privileges, bear the following in
mind:
● To avoid a recursive analytic privilege check, the procedures should only select from database tables or
views that are not subject to an authorization check based on analytic privileges. In particular, views
activated in the SAP HANA modeler are to be avoided completely as they are automatically registered for
the analytic privilege check.
● The execution of procedures in analytic privileges slows down query processing compared to analytic
privileges containing only static filters. Therefore, procedures used in analytic privileges must be designed
carefully.
Use the CREATE STRUCTURED PRIVILEGE statement to create an XML-based analytic privilege that contains a
dynamic procedure-based value filter and a fixed value filter in the attribute restriction.
Context
Note
The analytic privilege in this example is created using the CREATE STRUCTURED PRIVILEGE statement.
Under normal circumstances, you create XML-based analytic privileges using the SAP HANA modeler or
the SAP HANA Web-based Development Workbench. Analytic privileges created using CREATE
STRUCTURED PRIVILEGE are not owned by the user _SYS_REPO. They can be granted and revoked only by
the actual database user who creates them.
Assume you want to restrict access to product data in secured views as follows:
To be able to implement the second filter condition, you need to create a procedure that will determine which
products a user is authorized to see by querying the table PRODUCT_AUTHORIZATION_TABLE.
Procedure
1. Create the table type for the output parameter of the procedure:
2. Create the table that the procedure will use to check authorization:
3. Create the procedure that will determine which products the database user executing the query is
authorized to see based on information contained in the product authorization table:
The session user is the database user who is executing the query to access a secured view. This is
therefore the user whose privileges must be checked. For this reason, the table or view used in the
procedure should contain a column to store the user name so that the procedure can filter on this
column using the SQL function SESSION_USER.
Caution
Do not map the executing user to the application user. The application user is unreliable because it is
controlled by the client application. For example, it may set the application user to a technical user or it
may not set it at all. In addition, the trustworthiness of the client application cannot be guaranteed.
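As an illustration, the three steps above might be implemented as follows. This is a sketch only: the table type, procedure name, and column names are hypothetical, and the actual definitions depend on your scenario; only the table name PRODUCT_AUTHORIZATION_TABLE is taken from this example.

```sql
-- 1. Table type for the output parameter of the procedure (hypothetical name).
CREATE TYPE "PRODUCT_FILTER_TYPE" AS TABLE ("PRODUCT" VARCHAR(100));

-- 2. Table the procedure uses to check authorization. It contains a
--    USER_NAME column so the procedure can filter on SESSION_USER;
--    only the procedure owner should be able to read it.
CREATE TABLE "PRODUCT_AUTHORIZATION_TABLE" (
    "USER_NAME" VARCHAR(256),
    "PRODUCT"   VARCHAR(100)
);

-- 3. Procedure returning the products the session user is authorized
--    to see. SQL SECURITY DEFINER and READS SQL DATA are typical for
--    such filter procedures.
CREATE PROCEDURE "GET_AUTHORIZED_PRODUCTS" (OUT "VAL" "PRODUCT_FILTER_TYPE")
    LANGUAGE SQLSCRIPT
    SQL SECURITY DEFINER
    READS SQL DATA
AS
BEGIN
    VAL = SELECT "PRODUCT"
          FROM "PRODUCT_AUTHORIZATION_TABLE"
          WHERE "USER_NAME" = SESSION_USER;
END;
```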
Results
Now when a database user requests access to a secured view containing product information, the data
returned will be filtered according to the following condition:
SQL-based analytic privileges are created using the CREATE STRUCTURED PRIVILEGE statement:
The FOR clause is used to restrict the type of access (only the SELECT action is supported). The ON clause is
used to restrict access to one or more views with the same filter attributes.
The <filter condition> parameter is used to restrict the data visible to individual users. The following
methods of specifying filter conditions are possible:
A fixed filter clause consists of an SQL WHERE clause that is specified in the definition of the analytic privilege
itself.
You can express fixed filter conditions freely using SQL, including subqueries.
By incorporating built-in SQL functions into the subqueries, in particular SESSION_USER, you can define an
even more flexible filter condition.
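For instance, a fixed filter with a SESSION_USER subquery might be sketched as follows. All object, view, and column names here are hypothetical:

```sql
-- Fixed filter clause with a subquery: each user sees only the rows
-- whose COUNTRY appears in a mapping table for that session user.
CREATE STRUCTURED PRIVILEGE "AP_SALES_BY_COUNTRY"
    FOR SELECT ON "SALES"."V_SALES"
    WHERE "COUNTRY" IN (SELECT "COUNTRY"
                        FROM "SALES"."USER_COUNTRIES"
                        WHERE "USER_NAME" = SESSION_USER);
```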
Example
Note
A calculation view cannot be secured using an SQL-based analytic privilege that contains a complex filter
condition if the view is defined on top of analytic and/or attribute views that themselves are secured with
an SQL-based analytic privilege with a complex filter condition.
Remember
If you use a subquery, you (the creating user) must have the required privileges on the database objects
(tables and views) involved in the subquery.
Comparative conditions can be nested and combined using AND and OR (with corresponding brackets).
Tip
To create an analytic privilege that allows either access to all data or no data in a view, set a fixed filter
condition such as 1=1 or 1!=1.
With a dynamically generated filter clause, the WHERE clause that specifies the filter condition is generated
every time the analytic privilege is evaluated. This is useful in an environment in which the filter clause changes
very dynamically. The filter condition is determined by a procedure specified in the CONDITION PROVIDER
clause, for example:
Sample Code
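A sketch of what such a definition might look like. The privilege name is illustrative; the procedure name GET_FILTER_FOR_USER and the view TABLEOWNER.VIEW_SALES are taken from the dynamic-filter example later in this guide:

```sql
-- Analytic privilege whose filter clause is produced at evaluation time
-- by the procedure named in the CONDITION PROVIDER clause.
CREATE STRUCTURED PRIVILEGE "AP_DYNAMIC"
    FOR SELECT ON "TABLEOWNER"."VIEW_SALES"
    CONDITION PROVIDER "AUTHORIZATION"."GET_FILTER_FOR_USER";
```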
Procedures in the CONDITION PROVIDER clause must have the following properties:
Tip
A procedure that returns the filter condition 1=1 or 1>1 can be used to create an analytic privilege that
allows access to all data or no data in a view.
● The procedure must be executable by _SYS_REPO; that is, either _SYS_REPO must be the owner of the
procedure, or the owner of the procedure has all privileges on the underlying tables/views WITH GRANT
OPTION and has granted the EXECUTE privilege on the procedure to the _SYS_REPO user.
If errors occur in procedure execution, the user receives a Not authorized error, even if he has the analytic
privileges that would grant access.
Related Information
Use the CREATE STRUCTURED PRIVILEGE statement to create SQL-based analytic privileges for different
scenarios.
Context
The examples provided here take you through the following scenarios:
● Example 1: Securing a column view using an SQL-based analytic privilege with a fixed filter clause [page
539]
● Example 2: Securing an SQL view using an SQL-based analytic privilege with a complex filter clause
(subquery) [page 541]
● Example 3: Securing a column view using an SQL-based analytic privilege with a dynamically generated
filter clause [page 543]
Note
The analytic privileges in these examples are created using the CREATE STRUCTURED PRIVILEGE
statement. Under normal circumstances, you create SQL-based analytic privileges using the SAP HANA
Web-based Development Workbench. They can be granted and revoked only by the actual database user who creates them.
Prerequisites
The database user TABLEOWNER has set up a calculation scenario based on the table SALES_TABLE, which
contains the data to be protected.
Context
All sales data is contained in a single view. You want to restrict user access so that sales managers can see only
information about the product "car" in the sales region UK and Germany. You want to do this by creating an
analytic privilege with a fixed filter clause.
A fixed filter clause consists of an SQL WHERE clause that is specified in the definition of the analytic privilege
itself.
In the following procedure, you might find it easier to use the graphical editors to create the calculation view
and analytic privilege.
Procedure
Note
You can see above that the authorization check using XML-based analytic privileges is disabled with
'REGISTERVIEWFORAPCHECK'='0', while the authorization check using SQL-based analytic
privileges is enabled with STRUCTURED PRIVILEGE CHECK. Both checks cannot be enabled at the
same time.
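Assuming the secured view is TABLEOWNER.VIEW_SALES and the filter attributes are named PRODUCT and REGION (both names are assumptions), the privilege and the grants might look like this:

```sql
-- SQL-based analytic privilege with a fixed filter clause.
CREATE STRUCTURED PRIVILEGE "AP_SALES_CAR_EMEA"
    FOR SELECT ON "TABLEOWNER"."VIEW_SALES"
    WHERE "PRODUCT" = 'car' AND "REGION" IN ('UK', 'Germany');

-- A sales manager needs both the SELECT privilege on the view and the
-- analytic privilege; neither alone grants access.
GRANT SELECT ON "TABLEOWNER"."VIEW_SALES" TO SALES_MANAGER;
GRANT STRUCTURED PRIVILEGE "AP_SALES_CAR_EMEA" TO SALES_MANAGER;
```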
Remember
Only the view owner or a user who has the SELECT privilege WITH GRANT OPTION on the view can
perform the grant.
Prerequisites
The database user TABLEOWNER has created a table TABLEOWNER.SALES, which contains the data to be
protected.
Context
All sales data is contained in a single view. You want to restrict access of user MILLER so that he can see only
product information from the year 2008. You want to do this by creating an analytic privilege with a complex
filter clause.
With a complex filter clause, the SQL WHERE clause that specifies the filter condition includes an SQL
statement, or a subquery. This allows you to create complex filter conditions to control which data individual
users see.
Tip
In the following procedure, you might find it easier to use the graphical editors to create the calculation view
and analytic privilege.
1. Create the view containing the sales data which needs to be secured:
Remember
The user creating the view must have the SELECT privilege WITH GRANT OPTION on the table
TABLEOWNER.SALES.
Remember
○ Subqueries allow you to create complex filter conditions, but remember: A calculation view cannot
be secured using an SQL-based analytic privilege that contains a complex filter condition if the
view is defined on top of analytic and/or attribute views that themselves are secured with an SQL-
based analytic privilege with a complex filter condition.
○ The user creating the analytic privilege must have the SELECT privilege on the objects involved in
the subquery, in this case table VIEWOWNER.AUTHORIZATION_VALUES.
○ The session user is the database user who is executing the query to access a secured view. This is
therefore the user whose privileges must be checked. For this reason, the table containing the
authorization information needs a column to store the user name so that the subquery can filter on
this column using the SQL function SESSION_USER.
Caution
Do not map the executing user to the application user. The application user is unreliable because it is
controlled by the client application. For example, it may set the application user to a technical user or it
may not set it at all. In addition, the trustworthiness of the client application cannot be guaranteed.
Only the view owner or a user who has the SELECT privilege WITH GRANT OPTION on the view can
perform the grant.
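Putting the pieces together, the privilege for user MILLER might be sketched as follows. The view name VIEW_SALES and the column names of the AUTHORIZATION_VALUES table are assumptions:

```sql
-- Complex filter clause: the subquery determines at query time which
-- years the session user may see (here, MILLER is authorized for 2008).
CREATE STRUCTURED PRIVILEGE "AP_SALES_YEAR"
    FOR SELECT ON "TABLEOWNER"."VIEW_SALES"
    WHERE "YEAR" IN (SELECT "VALUE"
                     FROM "VIEWOWNER"."AUTHORIZATION_VALUES"
                     WHERE "USER_NAME" = SESSION_USER);

GRANT SELECT ON "TABLEOWNER"."VIEW_SALES" TO MILLER;
GRANT STRUCTURED PRIVILEGE "AP_SALES_YEAR" TO MILLER;
```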
Prerequisites
The database user TABLEOWNER has set up a calculation scenario based on the table SALES_TABLE, which
contains the data to be protected.
Context
All sales data is contained in a single view. You want to restrict access of user ADAMS so that he can see only
information about cars bought by customer Company A or bikes sold in 2006. You want to do this by creating
an analytic privilege with a dynamically generated filter clause.
With a dynamically generated filter clause, the SQL WHERE clause that specifies the filter condition is
generated every time the analytic privilege is evaluated. This is useful in an environment in which the filter
clause changes very dynamically.
Tip
In the following procedure, you might find it easier to use the graphical editors to create the calculation view
and analytic privilege.
Procedure
INSERT INTO "AUTHORIZATION"."AUTHORIZATION_FILTERS"
VALUES ('(CUSTOMER=''Company A'' AND PRODUCT=''Car'') OR (YEAR=''2006'' AND PRODUCT=''Bike'')', 'ADAMS');
4. Create the database procedure that provides the filter clause for the analytic privilege and grant it to the
object owner of the project:
Remember
When using procedures as the condition provider in an SQL-based analytic privilege, remember the
following:
On evaluation of the analytic privilege for user ADAMS, the WHERE clause (CUSTOMER='Company A' AND
PRODUCT='Car') OR (YEAR='2006' AND PRODUCT='Bike'), as provided by the procedure
GET_FILTER_FOR_USER, will be used.
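The condition-provider procedure from step 4 might be sketched as follows. The output parameter name and the column names of the AUTHORIZATION_FILTERS table are assumptions:

```sql
-- Returns the filter string stored for the session user; the analytic
-- privilege evaluates this string as its WHERE clause.
CREATE PROCEDURE "AUTHORIZATION"."GET_FILTER_FOR_USER"
    (OUT "OUT_FILTER" VARCHAR(5000))
    LANGUAGE SQLSCRIPT
    SQL SECURITY DEFINER
    READS SQL DATA
AS
BEGIN
    SELECT "FILTER_STRING" INTO "OUT_FILTER"
    FROM "AUTHORIZATION"."AUTHORIZATION_FILTERS"
    WHERE "USER_NAME" = SESSION_USER;
END;

-- The privilege references the procedure as its condition provider.
CREATE STRUCTURED PRIVILEGE "AP_SALES_DYNAMIC"
    FOR SELECT ON "TABLEOWNER"."VIEW_SALES"
    CONDITION PROVIDER "AUTHORIZATION"."GET_FILTER_FOR_USER";
```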
6. Grant the SELECT privilege on the view TABLEOWNER.VIEW_SALES to user ADAMS:
Remember
Only the view owner or a user who has the SELECT privilege WITH GRANT OPTION on the view can
perform the grant.
When a user requests access to data stored in attribute, analytic, calculation, or SQL views, an authorization
check based on analytic privileges is performed and the data returned to the user is filtered accordingly. The
EFFECTIVE_STRUCTURED_PRIVILEGES system view can help you to troubleshoot authorization problems.
Access to a view and the way in which results are filtered depend on whether the view is independent or
associated with other views (dependent views).
The authorization check for a view that is not defined on another column view is as follows:
1. It is checked whether the user has both of the following:
○ The SELECT privilege on the view, or on the schema containing it
Note
The user does not require SELECT on the underlying base tables or views of the view.
○ An analytic privilege that is applicable to the view
Applicable analytic privileges are those that meet all of the following criteria (stated here for XML-based and SQL-based privileges respectively):
● XML-based: a view restriction that includes the accessed view. SQL-based: an ON clause that includes the
accessed view.
● XML-based: a validity restriction that applies now. SQL-based: if the filter condition specifies a validity
period (for example, WHERE (CURRENT_TIME BETWEEN ... AND ...) AND <actual filter>), it must apply now.
● XML-based: an action in the activity restriction that covers the action requested by the query. SQL-based:
an action in the FOR clause that covers the action requested by the query.
Note
All analytic privileges created and activated in the SAP HANA modeler and the SAP HANA Web-based
Development Workbench fulfill this condition. The only action supported is read access (SELECT).
● XML-based: an attribute restriction that includes some of the view's attributes. SQL-based: a filter
condition that applies to the view.
Note
When the analytic privilege is created, the filter is checked immediately to ensure that it applies to the
view. If it doesn't, creation will fail. However, if the view definition subsequently changes, or if a
dynamically generated filter condition returns a filter string that is not executable with the view, the
authorization check will fail and access is rejected.
If the user has the SELECT privilege on the view but no applicable analytic privileges, the user’s request is
rejected with a Not authorized error. The same is true if the user has an applicable analytic privilege but
doesn't have the SELECT privilege on the view.
2. The value filters specified in the dimension restrictions (XML-based) or filter condition (SQL-based) are
evaluated and the appropriate data is returned to the user. Multiple analytic privileges are combined with
the logical operator OR.
For more information about how multiple attribute restrictions and/or multiple value filters in XML-based
analytic privileges are combined, see XML-Based Analytic Privileges.
The authorization check for a view that is defined on other column views is more complex. Note the following
behavior.
Calculation views and SQL views can be defined by selecting data from other column views, specifically
attribute views, analytic views, and other calculation views. This can lead to a complex view hierarchy that
requires careful design of row-level authorization.
A user can access a calculation or SQL view based on other views if both of the following prerequisites are met:
● The user has been granted the SELECT privilege on the view or the schema that contains the view.
● The user has been granted analytic privileges that apply to the view itself and all the other column views in
the hierarchy that are registered for a structured privilege check.
If a user requests access to a calculation view that is dependent on another view, the authorization check and
result filtering are performed as follows:
Analytic views
An analytic view can also be defined on attribute views, but this does not represent a view dependency or
hierarchy with respect to the authorization check and result filtering. If you reference an attribute view in an
analytic view, analytic privileges defined on the attribute view are not applied.
If an analytic view designed in the SAP HANA modeler contains one of the elements listed below, it will
automatically be activated with a calculation view on top. The name of this calculation view is the name of the
analytic view with the suffix /olap. This represents a view hierarchy for which the prerequisites described
above for calculation views also apply.
Using the EFFECTIVE_STRUCTURED_PRIVILEGES system view, you can determine, for example:
● Which analytic privileges apply to a particular view, including the dynamic filter conditions that apply (if
relevant)
● Which filter is being applied to which view in the view hierarchy (for views with dependencies)
● Whether or not a particular user is authorized to access the view
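A troubleshooting query against this system view might be sketched as follows. The column names used here (STRUCTURED_PRIVILEGE_NAME, FILTER, ROOT_SCHEMA_NAME, ROOT_OBJECT_NAME, USER_NAME) are assumptions; consult the system views reference for your SAP HANA release:

```sql
-- Hypothetical check: which analytic privileges and filters take effect
-- when user ADAMS accesses the view TABLEOWNER.VIEW_SALES.
SELECT "STRUCTURED_PRIVILEGE_NAME", "FILTER"
FROM "EFFECTIVE_STRUCTURED_PRIVILEGES"
WHERE "ROOT_SCHEMA_NAME" = 'TABLEOWNER'
  AND "ROOT_OBJECT_NAME" = 'VIEW_SALES'
  AND "USER_NAME" = 'ADAMS';
```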
Related Information
For information about the capabilities available for your license and installation scenario, refer to the Feature
Scope Description for SAP HANA.