Manual - SmartAnalytics Scene
SmartAnalytics Scene
Manual
Release 24.1
3646301302
Version 24.1
© Rohde & Schwarz SwissQual AG
Niedermattstrasse 8b, 4528 Zuchwil, Switzerland
Phone: +41 32 686 65 65
Fax: +41 32 686 65 66
E-mail: sq-info@[Link]
Internet: [Link]
Subject to change – Data without tolerance limits is not binding.
R&S® is a registered trademark of Rohde & Schwarz GmbH & Co. KG.
Trade names are trademarks of the owners.
Throughout this manual, products from Rohde & Schwarz are indicated without the ® symbol, e.g., R&S®___ is indicated as R&S___.
Contents
1 Introduction.......................................................................................... 11
1.1 SmartAnalytics Scene User Interface at a Glance................................................... 11
4 Machine Learning.................................................................................29
4.1 Machine Learning Integration in SmartAnalytics Scene......................................... 29
4.2 Call Stability Score......................................................................................................29
4.3 Anomaly Detection......................................................................................................30
4.3.1 VoLTE Call Establishment............................................................................................. 31
4.3.2 HTTP Capacity Download.............................................................................................31
5 CM360°.................................................................................................. 35
6 User/Role Management....................................................................... 36
6.1 SmartAnalytics Scene Roles......................................................................................36
6.2 Related Documents.....................................................................................................36
7 Database Management........................................................................ 37
7.1 Create Database.......................................................................................................... 37
7.2 Add an Existing Database.......................................................................................... 39
7.3 Manage Database........................................................................................................39
7.3.1 Database Commands and Data Processing Options....................................................41
7.4 Delete Database.......................................................................................................... 43
7.5 Add Server................................................................................................................... 43
10 Filtering Capabilities..........................................................................135
13 Operators............................................................................................ 232
Annex.................................................................................................. 234
I Customer Support..............................................................................293
Index....................................................................................................294
1 Introduction
SmartAnalytics Scene is a data analytics software developed by R&S SwissQual AG.
This web-based application analyzes the quality of service measurements in mobile networks. It performs various tasks after data collection, also called post-processing.
The main advantages of SmartAnalytics Scene are:
● The ability to have fast and flexible statistical analysis and event-driven detailed
analysis available at the same time.
● The possibility to discuss and validate data without the need of multiple data
imports.
● The capability to share settings and to switch to other R&S SwissQual applications
hosted on the same platform.
This document guides you through your first steps in SmartAnalytics Scene and deals
with:
● Mastering the data
● Managing users/roles through the Web Portal
● Creating/populating databases
● Working with the dashboard
● Working with scenarios and handling workspaces
● Filtering the workspaces
● Understanding the SmartAnalytics Scene settings, i.e., the configuration of colors
and thresholds used in the application.
For additional information concerning SmartAnalytics Scene hardware, licenses, etc.,
please refer to Installation Manual - SmartAnalytics [Link] and
Manual - [Link].
For SmartAnalytics Scene subscription packages, please refer to
Manual - SmartAnalytics Scene Subscription Service [Link]
.
1 = Dashboard. It is a predefined workspace answering general questions about the data in the currently
selected database, such as: “Where and when did we collect data?”, “How much data did we collect?”,
“What type of data did we collect?” or “How good is the overall performance of the networks?"
2 = Scenarios. They act as folders into which you group the workspaces.
3 = Workspaces. Open a workspace to visualize the content of a database. Workspaces adjust dynamically
to the selected database: the layout of a workspace does not change when you select another
database; only the content is updated.
4 = Tabs. A workspace contains at least one tab. Tabs contain many different visualizations for your data.
5 = Visualizations. Most of them are generic and operate with different types of data that you can define.
6 = Filter bar. Next to the name of the workspace, SmartAnalytics Scene displays the filter chips represent-
ing the current and globally applied filters. Whenever you add or remove the filters, SmartAnalytics
Scene updates the chips accordingly.
7 = Active database status indicator. This element is always present on the title bar, independently of
the application page you currently display. It allows you to see which database you are
connected to, as well as the status of that database. Clicking this element takes you directly to the
database management page.
8 = Workspace actions. Actions that apply to the whole workspace's content, such as: "Secondary
Window" – opens the workspace in a second, connected browser window; "Share Link" – creates a link to
share the current data analysis; "Session Overview" – toggles the session overview; "Filter" – adds
custom data filters; "Export to PDF" – exports the current workspace tab into a PDF file.
9 = Tab actions. Actions that apply to the tab, e.g., for adding visualizations. The available options mainly
depend on your rights to modify the corresponding workspace.
10 = Database. Under this menu, you manage the content and configuration of databases, e.g., by importing
files, checking the status of the data processing activities, adjusting the settings for the Network Perfor-
mance Score NPS, enabling Machine Learning or defining the thresholds below/above which SmartA-
nalytics Scene reports problems for the data.
11 = Settings. Under this menu, you find the options for managing the database, e.g., checking their status
or editing the ownership. You can also define color profiles.
12 = Session Overview. This visualization is well known from NQDI and displays the sessions for a given
point in time. It offers the technology context of a test at a glance. By clicking on the device item, you
can toggle the selected UE for visualizations that can only show data from a single UE at a time (read
more on Device Selection later). When looking at statistical dashboards, the session overview is best
toggled off to save space.
For your convenience, the application overview screen is reproduced below at a larger, more legible scale.
Important
Statistics are a complex field and SmartAnalytics Scene enables you to create count-
less displays of the collected data. This flexibility of the data structure may lead to
questionable graphs unless you really understand what you are looking at.
Figure 2-1: Technology statistics (left) and technology-related readings for a point in time (right)
Figure 2-2: The measure Count of Tests in relation to the dimensions Service and Test Status
Important
Given the many different values in the value tree, there are countless potential combi-
nations of statistics. Even for databases storing hundreds of thousands of tests, the
chart above updates instantly as these combinations of values are precalculated during
data processing, in so-called OLAP cubes. For more details about OLAP cubes, refer
to Chapter 2.4, "OLAP Cubes", on page 20.
However, although we do have all combinations available, it does not mean that each
single combination of measures and dimensions is relevant. The application prevents
you from configuring technically impossible combinations but cannot prevent you from
creating misleading charts.
As a rule of thumb:
● Values stored in the same subsection of the value tree make a good match.
● Dimensions stored under "General" can be flexibly used in combination with almost
any measure.
You want to know how long a technology, band, or cell is used in the measurement
data.
● Chart translation: "How much time did the UEs spend in which technology?"
Counting tests per technology is more complex than it may appear: during a test, tech-
nology handovers can happen so that you have multiple technologies for a single test.
If a test uses LTE, UMTS and GSM alike, then it contributes to all three bars of the
chart. As a consequence, the sum of all bars is greater than the total number of execu-
ted tests. Bear in mind that between the dimension and the measure you use, there is
not always a one-to-one relation.
● Chart translation: "In how many tests did a certain technology appear?"
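The many-to-one relation described above can be illustrated with a short sketch. The data below is invented for illustration only; it does not come from the SmartAnalytics Scene API:

```python
from collections import Counter

# Hypothetical test records: each test lists every technology it used.
tests = [
    {"id": 1, "technologies": {"LTE"}},
    {"id": 2, "technologies": {"LTE", "UMTS"}},
    {"id": 3, "technologies": {"LTE", "UMTS", "GSM"}},
]

# One bar per technology: a test contributes to every technology it touched.
bars = Counter(tech for t in tests for tech in t["technologies"])
print(bars["LTE"], bars["UMTS"], bars["GSM"])  # 3 2 1
print(sum(bars.values()))                      # 6 -- greater than the 3 executed tests
```

The sum over all bars (6) exceeds the number of executed tests (3), which is exactly the effect to keep in mind when reading such charts.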
The following example mixes dimensions from different areas in the value tree. "LQ
(MOS)" and "Call Mode" come from "Voice Services", while "Home Operator" comes
from the "General" section. "Home Operator" is valid for all data except scanner data,
since scanners do not have a home operator.
● Chart translation: "What is the average LQ MOS for each call mode and home
operator?"
Figure 2-5: LQ MOS average per call modes and split by home operator
Example:
The following cube has three dimensions: region, time, and status. All dimensions have
members, e.g., "March" is a member in the month level of the time dimension. The cube is
designed to aggregate, analyze and find trends in the measures, e.g., calculating the
percentage of dropped calls that occurred in February in Geneva:
To materialize different views of the data, the OLAP tools embed operators to:
● Drill up the data performing aggregation and summarization on a data cube.
Figure 2-8: Drilling up from the canton level to the country level
Figure 2-9: Drilling down from the monthly level to the daily level
Figure 2-10: Slices of OLAP cube, e.g., focus on failed calls only, or on February data only
● Pivot the data by changing the dimensional orientation of the cube, viewing the
data from different perspectives.
● Drill across to execute queries involving multiple cubes with a common dimen-
sion.
● Drill through to retrieve the data from the underlying input data sources.
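The drill-up and slice operators listed above can be sketched in a few lines of Python. The fact records and numbers are hypothetical; real OLAP engines precalculate these aggregates, as described earlier:

```python
from collections import defaultdict

# Hypothetical fact records of an OLAP cube with region, time and status dimensions.
facts = [
    {"canton": "Geneva", "country": "CH", "month": "Feb", "status": "Dropped",   "calls": 4},
    {"canton": "Geneva", "country": "CH", "month": "Feb", "status": "Completed", "calls": 96},
    {"canton": "Bern",   "country": "CH", "month": "Mar", "status": "Dropped",   "calls": 2},
]

def drill_up(records, level):
    """Aggregate call counts at a coarser level of the region dimension."""
    totals = defaultdict(int)
    for r in records:
        totals[r[level]] += r["calls"]
    return dict(totals)

def slice_cube(records, dimension, member):
    """Keep only the facts belonging to one member of a dimension."""
    return [r for r in records if r[dimension] == member]

# Drill up from the canton level to the country level:
print(drill_up(facts, "country"))  # {'CH': 102}

# Slice on February, then compute the dropped-call percentage in Geneva:
geneva_feb = slice_cube(slice_cube(facts, "month", "Feb"), "canton", "Geneva")
dropped = sum(r["calls"] for r in geneva_feb if r["status"] == "Dropped")
total = sum(r["calls"] for r in geneva_feb)
print(round(100 * dropped / total, 1))  # 4.0
```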
The Dashboard workspace already features many relevant views to analyze the
strengths and weaknesses of particular mobile networks.
The NPS view is a dedicated visualization that provides you with an immediate statisti-
cal breakdown of the various KPIs that constitute the overall score.
Figure 3-3: Network Performance Score KPI breakdown – based on statistical OLAP cubes
4 Machine Learning
Machine Learning enables the market to access deep insights that would otherwise
remain hidden, and to gain significant efficiencies by automating manual tasks: a
smarter system guides the user through the usual work process instead of forcing
them to repeat each and every step.
R&S SwissQual has focused on providing Machine Learning use cases that narrow
down relevant insights in drive testing data and provide tangible benefits to the user.
Call Stability Score (CSS) is currently supported for UMTS and LTE calls.
Figure 4-1: Example view of the Call Stability Score in a country and problematic pockets
Figure 4-2: Example benchmarking aggregations of the Call Stability Score: From left to right and top
to bottom, the best performing operator for each bin, the average CSS by operator and
by radio access technology.
Optimization analysis has traditionally used one-dimensional filters to point out tests
with poor-performing metrics. In contrast, Anomaly Detection performs a multi-dimen-
sional analysis that reveals uncommonly under- or over-performing tests, which other-
wise would have been hidden to the user. This enables the detection of scenarios in
which, for example, network investments are not providing the expected improvement
in performance. The "Anomaly Detail View" displays the metric values that were taken
into account by the model as features along with their nearest known "Reference"
value and the resulting "Feature Deviation". This shows how much each metric contrib-
utes to the anomaly and guides the user to understand why the sample stands out.
See examples in the anomaly categories below.
The VoLTE call establishment anomaly detection automatically detects calls with espe-
cially uncommon establishments. These calls are presented to the user along with the
relevant information used by the model.
The "Anomaly Detail View" displays the metric values that were taken into account by
the model as features in the "Value" fields, along with their nearest "Reference" learned
by the model and the resulting normalized "Feature Deviation" between the latter two.
This shows how much each metric contributes to the anomaly and guides the user to
understand why the sample stands out.
Figure 4-3: Example of an anomalous VoLTE call establishment where an uncommonly high setup
time has been measured despite having a strong signal
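The manual does not give the exact formula behind the "Feature Deviation"; the sketch below only assumes it is the gap between measured value and learned reference, normalized by a per-feature scale. Function name and all numbers are hypothetical:

```python
# Illustrative sketch only: the real model's normalization is not documented here.
def feature_deviation(value, reference, scale):
    """Normalized deviation of a measured metric from its learned reference."""
    return (value - reference) / scale

# Hypothetical VoLTE example: a 6.2 s call setup time, where the model's
# nearest reference is 2.0 s and the typical spread of the feature is 0.7 s.
print(round(feature_deviation(6.2, 2.0, 0.7), 2))  # 6.0
```

A large positive deviation like this one would flag the setup time as the metric that contributes most to the anomaly.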
Understanding poor performance in data is a very complex task, as there are multiple
dimensions and they can all be relevant. As a result, it takes a lot of time to check
every single dimension manually. The Anomaly Detection module solves this issue by
performing a multidimensional analysis and pointing you directly to the causes of the
data degradation.
Figure 4-4: Example of an anomalous HTTP Capacity Download test. The Anomaly Detail View shows
how the signal-to-interference-plus-noise ratio is much lower than expected during the
first few seconds of the test.
Figure 4-5: Dashboard for HTTP Capacity Download test anomalies. On the right, the new visualization
for time-based models is visible and helps interpret the nature of the anomaly.
Parts of the sequences can be highlighted in red if they differ heavily from what the
trained model is used to. The anomaly detail view displays a small red indicator to
point out KPIs that have such unexpected curves.
Figure 4-6: A closer look at the anomaly detail view for time-based models
5 CM360°
With CM360°, a network emulator can virtually re-create the network environment captured
in the field during drive tests. The application generates network configurations and
CMsequencer test scripts that simulate the network's cells and their behavior, including
layer 3 signaling messages of LTE and 5G NR. The network configurations and the scripts
are ready to be executed on the R&S CMX500 using CMsequencer and CMsquares.
Testing a wireless device (e.g., smartphone, TCU) with the generated network configurations
and scripts allows you to reproduce a problem scenario or to verify different
KPIs and device functionalities under lab conditions.
The CM360° processing is based on the field logs collected during drive tests. These
logs can have various formats and consist of different levels of information, depending
on the method used to collect the data. Usually, during drive tests, diagnostic tools
(e.g. from R&S QualiPoc or chipset-specific) are used to capture the data from the
wireless device. The solution provides multiple KPI views that let you easily decide which type of data
(signaling, mobility, cell properties, etc.) shall be selected to simulate the network.
Once you select the type and data of interest, this data is converted to a network con-
figuration and test script for the R&S CMX500.
For more information on CM360° in SmartAnalytics Scene, refer to Chapter 9.4,
"CM360°", on page 107.
6 User/Role Management
Administrator: In addition to full access to SmartAnalytics Scene and its feature set, including
managing databases and workspaces, a user having this role:
● Manages all databases and all workspaces, independently of any ownership or
visibility restrictions defined by (any) other users.
To visualize permissions, check the icon status next to the database or workspace
name.
Icon Description
The filled person icon indicates that the database/workspace is private and that you
have full access and control.
The filled group icon indicates that the database/workspace is public and that you
have editing rights.
The outlined group icon indicates that the database/workspace is public but read-
only.
7 Database Management
Manage your database under "Database > Databases".
Set the map base bin size and the number of bin levels.
The bin settings can only be defined at creation time.
The NPS version setting can also be adjusted later, even after data was imported.
Check the progress of creating the database in the "Notifications" pane. When the cre-
ation process terminates, SmartAnalytics Scene displays, e.g.:
The "Notifications" pane also shows the initials of the application that sent the notification.
5. Click "OK".
SmartAnalytics Scene adds the database to the list of databases and indicates if
an upgrade is required, e.g., if the SmartAnalytics Scene data processing is miss-
ing.
SmartAnalytics Scene uses the active database as the only data source for all workspaces
created under "Scenarios", and for the "Dashboard".
The following table lists the possible status of a database and the available actions
depending on this status.
Table 7-1: Database status and corresponding actions

"Ready" – The database is operational.
● "Set active": declare the database as the active one. This option is not visible if the database is already the active one.
● "Edit": edit the database settings:
– Check/uncheck "Private database"
– Update the ownership
● "Process": run one of the reprocessing tasks. You can see the description and time estimation for each task. Which task is active depends on the data available in the database.
● "Remove from list": remove the connection to the database.
● "Delete database": delete the database.
● "Import custom regions": import the custom regions files. Refer to Chapter C.3, "Importing the Files", on page 257.

"Not ready (Newer)" – The version of the database is newer than the version of SmartAnalytics Scene. Update SmartAnalytics Scene to make the database usable.
● "Edit": edit the database settings:
– Check/uncheck "Private database"
– Update the ownership
● "Remove from list": remove the connection to the database.

"Not ready (Upgrade required)" – The version of the database is older than the version of SmartAnalytics Scene. Upgrade the database to make it ready for use.
● "Edit": edit the database settings:
– Check/uncheck "Private database"
– Update the ownership
● "Upgrade": performs a database upgrade.
● "Remove from list": remove the connection to the database.
● "Delete database": delete the database.

"Not ready (Processing failed)" – The database is in an inconsistent state. You cannot use it unless you process it again. If the problem persists, contact an administrator.
● "Remove": remove the connection to the database.
● "Edit": edit the database settings:
– Check/uncheck "Private database"
– Update the ownership
● "Process": run one of the reprocessing tasks. You can see the description and time estimation for each task. Which task is active depends on the data available in the database.

"Not ready (Permission denied)" – The current user is no longer allowed to access the database.
● "Edit": edit the database settings:
– Check/uncheck "Private database"
– Update the ownership
● "Remove from list": remove the connection to the database.

"Not ready (Unable to connect)" – SmartAnalytics Scene cannot determine the state of the database, e.g., when the server or the database are offline.
● "Remove from list": remove the connection to the database.
Options "Process" and "Update" are not available when the user is a viewer.
Depending on the state of the database, the context menu provides some or all of the
following operations:
● "Set Active": changes the user’s currently active database.
● "Edit": opens a dialog to modify access rights to this database.
● "Process database": opens a dialog to run one of the reprocessing tasks.
– "Auto processing": launches a process that will automatically determine what
tasks are open or required for the database. As an example, in case a process-
ing fails, this option will attempt to resume the previous processing from the
stage where it failed.
– "Full processing": enforces a long running task to reprocess all statistics, which
is normally only required when a previous processing failed.
– "Custom KPIs / Events re-processing": runs a reprocessing of custom events
and KPIs, for example when a custom KPI has just been added to the system
and you want to see the newly defined KPI in the application.
● "Remove from list": simply removes the database from the user’s list, no data is
deleted and other users can still see and use that database.
● "Delete database": truly deletes the database and all data, there is no going back.
To check if a database in "Ready" status contains measurement files, open the "About"
page under "Settings":
4. Click "OK".
4. Click "Import".
At the end of the import process, SmartAnalytics Scene sets the status of the data-
base to "Ready".
During the import, follow the process under "Database > Import-progress" pages:
You can import voice measurements if the following requirements are fulfilled:
● IMSI and phone number information is present in the files.
● The A side filename must contain “MO” and the B side filename must contain “MT”.
● A and B side filenames must contain the same timestamp.
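The pairing rules above can be sketched in a few lines. The file names and the timestamp pattern below are hypothetical; this is not how SmartAnalytics Scene implements the matching internally:

```python
import re

# Hypothetical file names following the naming rules above:
# the A side contains "MO", the B side contains "MT", both share a timestamp.
files = [
    "call_MO_20240312_101500.mf",
    "call_MT_20240312_101500.mf",
    "call_MO_20240312_113000.mf",
]

def pair_sides(names):
    """Match A-side (MO) and B-side (MT) files by identical timestamp."""
    stamp = lambda n: re.search(r"\d{8}_\d{6}", n).group()
    mt = {stamp(n): n for n in names if "MT" in n}
    return [(n, mt.get(stamp(n))) for n in names if "MO" in n]

print(pair_sides(files))
# [('call_MO_20240312_101500.mf', 'call_MT_20240312_101500.mf'),
#  ('call_MO_20240312_113000.mf', None)]
```

An A-side file without a B-side partner sharing the same timestamp (the last entry) cannot be imported as a paired voice measurement.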
If a voice measurement file does not contain IMSI and phone number information, you
need to place the "[Link]" file in the FileCache folder:
3. Add the IMSI/phone number pair to the [ue_devices] section, as in the example
below:
[ue_devices]
414096019999998=9670944898
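The file uses a standard INI layout, so the mapping can be read with Python's configparser. This sketch only illustrates the format; the file name and location are as described above:

```python
import configparser

# The [ue_devices] section maps IMSI -> phone number, as in the example above.
content = """
[ue_devices]
414096019999998=9670944898
"""

parser = configparser.ConfigParser()
parser.read_string(content)
mapping = dict(parser["ue_devices"])
print(mapping)  # {'414096019999998': '9670944898'}
```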
IP trace and speech quality details are not reported for voice calls.
At the end of the deletion process, SmartAnalytics Scene sets the status of the
database to "Ready". While the deletion is going on, the database is temporarily
blocked.
7.8 SuperCubes
SuperCubes allow you to see the historical trends and statistics across different data-
bases in a fast and efficient manner. They contain only statistical data (no time-based
data).
Currently, SuperCubes are only used to observe NPS trends on the "L0 Insights" work-
space. For more information, refer to Chapter 9.5, "L0 Insights", on page 112.
To create a SuperCube:
1. Follow the same steps as for the database creation, defined in Chapter 7.1, "Cre-
ate Database", on page 37.
After creation, the SuperCube database is available in the database list and it is
identified by the icon .
To configure a SuperCube:
1. Click the overflow menu of the SuperCube, then click "Manage source DBs".
4. Repeat it for all the source databases you want to add to the SuperCube.
6. Process the SuperCube by selecting "Process" from its overflow menu. This
takes only a couple of seconds.
7. Set the SuperCube as the active database by selecting "Set active" from its
overflow menu.
You cannot simply remove data from the SuperCube. To achieve that you need to
delete the SuperCube and recreate it with the new set of data.
3. Process the SuperCube by selecting "Process" from its overflow menu.
Import only measurement files with the same NPS template and campaign type as this
is the only way to guarantee the correct calculation of the scores.
You can perform a drilldown analysis from level L0 to other levels (L1, L2 or Custom
workspaces). To do so:
1. Click anywhere on an L1, L2 or Custom workspace.
2. Select the source database from the pop-up window, and click "OK". Now you can
analyse the data from the selected source database.
The drilldown is also possible by clicking any KPI available on the "L0 Insights -> NPS
trends". For more information, refer to Chapter 9.5.2, "Trends", on page 113.
To return to the SuperCube analysis, simply click the "L0 Insights" workspace again.
Figure 7-10: Shows Predefined (1), Custom (2), and Legacy profiles (3)
Click the icon on the top right of the page in "Database > Profiles" to create a new
database profile, Figure 7-10 (4).
The "Add database profile" option contains all the configuration settings to be applied
to a database when creating one, or after a reprocessing.
The "Add database profile" option contains the following tabs:
● Import Options
● Processing Options
● The first tab of the database profile contains the list of data to be imported in the
database.
● Users can check/uncheck each option to enable or disable its import into the data-
base.
● Import options cannot be applied retroactively: data skipped during the import process cannot be recovered without re-importing it.
● By default, the creation of a new profile enables all the import options.
● The "Processing options" tab defines which data must be processed during an
import or a full processing.
● It is divided into four categories:
– Processing restriction
– Horizontal validation
– Voice calls
– Processing customization
● Users can enable/disable the options with a switch button.
"Analysis options" allows users to enable or disable an analysis module and to configure
the thresholds for coverage analysis and classification, base station evaluation, and
RRC trigger processing.
"Machine learning options" allows users to enable and configure the machine learning
module analysis.
Options available are:
● Call Stability Score (CSS) analysis.
● Anomalies analysis.
● Network Utilization Rating (NUR) analysis.
● Processing (timeout of SQL command execution).
In "Database > Profiles", click the overflow menu to access the profile menu options.
In the overflow menu, users can:
● Set the profile as default
● Edit the profile
● Duplicate the profile
● Delete the profile
Predefined profiles cannot be edited or deleted. However, they can be duplicated and
then edited.
3. Click "Edit".
4. Select the profile from the "Profile" drop-down menu in "Edit database > General"
and click "OK".
3. Click "OK".
The name of the assigned profile is displayed on the Database management page
inside the information drop-down area.
To view:
1. Navigate to "Database > Databases > Database Management".
If the profile is modified or a different profile is assigned, the database shows an
update pending icon and the information area shows the message "Processing
settings outdated".
If a profile is deleted, the related database also shows the “Processing settings
outdated” status, and a full processing will prompt you to apply a new profile.
SmartAnalytics Scene uses the active database as the only data source for all workspaces
created under "Scenarios", and for the "Dashboard".
8.1 Overview
The "Overview" tab shows how the data are distributed regionally and over time. In
addition, it provides an overview of the available campaigns, technologies and opera-
tors.
NPS reliability
The NPS requires a high number of results to be reliable, e.g., 1000 calls are required
for a statistically trustworthy call drop rate. By filtering or splitting the data as done in
the dashboard, you are likely to fall below these thresholds.
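A quick back-of-the-envelope calculation shows why the sample size matters. The sketch below uses a standard normal-approximation confidence interval for a proportion; the 2% drop rate and the sample sizes are hypothetical, not SmartAnalytics Scene thresholds:

```python
import math

def drop_rate_ci(drops, calls, z=1.96):
    """95% normal-approximation confidence interval for a call drop rate."""
    p = drops / calls
    half = z * math.sqrt(p * (1 - p) / calls)
    return p - half, p + half

# A 2% drop rate measured on 1000 calls vs. only 50 calls:
for n in (1000, 50):
    lo, hi = drop_rate_ci(round(0.02 * n), n)
    print(f"{n} calls: {lo:.3f} .. {hi:.3f}")
```

With 1000 calls the interval is roughly ±0.9 percentage points around 2%; with 50 calls it widens to roughly ±3.9 percentage points, so the filtered result is no longer trustworthy.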
8.3 Services
The "Services" tab provides an overview of the covered service types, e.g., voice or
data services.
8.8 Messaging
The "Messaging" tab provides a detailed view of messaging services usage.
8.9 Technology
The "Technology" tab provides an enhanced view of technology usage.
b) In the drop-down list below the name, select the scenario to which the work-
space belongs.
Alternatively, create a scenario.
d) Check "Private workspace" to make the workspace only visible to its assigned
owners.
4. Under the "Ownership" tab, define the "Assigned owners". Click the "+" icon to add
owners and the remove icon to remove them.
Note: You cannot remove the last assigned owner.
Changing the ownership depends on the user role/access rights.
5. Click "OK".
SmartAnalytics Scene displays the new workspace with an empty "Default" tab.
a) Choose template.
b) Enter a title.
c) Add a description.
d) Enter date and author.
e) Select a window size.
f) Check box to center the images in the slides.
4. Once complete, click the download button to save the report locally.
The report contains a slide for each tab of the selected workspace.
Tip: The position of images, titles, descriptions and other information depends on
the template used.
SmartAnalytics Scene has a default template that can be exported and copied.
Custom templates can also be added.
2. Ensure the template contains at least one slide with a generic image.
Placeholder Description
{TITLE} Title of the report (set by the user in the report generation dialog)
{DESCRIPTION} Description of the report (set by the user in the report generation dialog)
{DATE} Date of the report (set by the user in the report generation dialog)
{AUTHOR} Author of the report (set by the user in the report generation dialog)
4. Set footer with date and pages in the second slide, with generic image:
a) Go to "Insert > Header & Footer".
b) Check "Footer".
c) Insert "{DATE} ... Page {PAGE} of {PAGES}" in the box provided.
d) Click "Apply".
e) (Optional) Adjust manually to get desired spacing in the final result.
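The placeholder substitution performed during report generation can be mimicked with simple string replacement. This is only an illustration of how the placeholders from the table above behave; the actual substitution happens inside SmartAnalytics Scene:

```python
def fill_placeholders(text, values):
    """Replace {KEY} tokens in a template string with the given values."""
    for key, val in values.items():
        text = text.replace("{" + key + "}", str(val))
    return text

# The footer string from the steps above, with hypothetical values:
footer = "{DATE} ... Page {PAGE} of {PAGES}"
print(fill_placeholders(footer, {"DATE": "2024-03-12", "PAGE": 2, "PAGES": 10}))
# 2024-03-12 ... Page 2 of 10
```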
2. Click the icon.
A drop-down menu opens.
3. Select "PowerPoint".
"Create new report template (PowerPoint)" pop-up window opens.
4. Click "Open".
SmartAnalytics Scene allows you to move a workspace from one scenario to another.
1. Hover over the workspace you want to move.
The overflow icon is displayed.
4. Click "OK"
You cannot remove a workspace if you are not part of the assigned owners.
You cannot edit a workspace if you are not part of the assigned owners.
You can change the "General" workspace type into "Selection/Analysis" type, but not
vice versa.
9.1.9 Tabs
4. Scroll down and move the tab left or right using the "Move tab" options.
Note: Tabs can also be moved within tab groups (see "Moving tabs within tab
groups" on page 76).
You can group several workspace tabs under the common group in order to reduce the
total length of the tab bar.
2. Select the existing tab group from the "Tab group" drop-down list. You can also cre-
ate a new group by simply entering the new tab group name under the "Tab group"
field.
To make the tab group name visible you need to have at least two tabs assigned to
that group.
2. Alternatively, click "Move right" or "Move left" from the tab context menu.
3. (Optional) Enable "Password-protect PDF" to set a password for the PDF report.
4. Click "Export".
You will receive a notification of the report generation in the notification panel.
9.2 L2 Analysis
Analyze your measurements in the time domain, by navigating through the sessions
and tests.
Click the icon to display the session overview:
Device icons:
● A-side, B-side and single-sided device icons have "A", "B" and "S" letters respec-
tively.
● Scanner device has its own icon.
● By hovering over the device icon, you display a tooltip with the following phone
information: "Name", "Operator", and "IMEI".
You can jump directly to the specific session. To do so, enter the session ID into the
input field and click the arrow button.
● Both RAW session ID and session ID are supported.
● In the general workspace, the session search is executed without filters. If there is
a session with the given ID in the database, the session is found.
● In the selection/analysis workspace, the search is performed only inside the current
session list in the sidebar.
The session/test status is displayed via the icon and the tooltip:
● Failed
● Dropped
● System release
● Invalid (greyed-out)
You can download the PCAP file, if available for the session:
● From the overflow menu:
You can use the same procedure to invalidate a whole file, or just a single test in a ses-
sion.
To invalidate a session
1. Select a session and open the session overview, as in Chapter 9.2.1, "Session
Overview", on page 80.
2. Click the arrow on the right side of the session overview, to open the "Selected
Device" info:
3. Click "Data Invalidation" to open a "Mark data as valid / invalid" dialog box:
Bulk invalidation allows users to mark a set of files, sessions or tests as valid or invalid.
● Alternatively, navigate to "L2 Analysis > Default > UE Drill Down" and click the icon.
A new side panel opens.
Note: Bulk invalidation is not available for Viewer and Workspace-Manager users.
7. Hover the cursor over the icon to view the listed number of active files, sessions or
tests, with the current filters applied.
The database shows the pending update and asks for a statistics reprocessing.
After reprocessing, the invalidation is also available in the aggregated data
statistics.
9.2.2 Selection/Analysis
"Session" and "Analysis" tabs can be used only in the "Selection/Analysis" type of
workspaces.
Sessions sidebar
To display the sidebar with sessions (and their related tests), click the sessions icon in
the top right corner of the workspace. 100 sessions are shown by default.
4. On the "Analysis" tab, you see the selected session and its details in the tab
header.
The session overview toolbar has fewer options than the default one. Refer to Fig-
ure 9-8.
Now you can analyze session data in the panels. The tests belonging to the
selected session are shown in the "Tests" part of the sessions sidebar.
The panels on the analysis tab are synchronized with the session overview.
Although you cannot create session selection on the "Analysis" tab, you can still create
local panel filters to customize your display. Refer to "Local filtering" on page 151.
● On the other hand, you cannot change "Selection/Analysis" to the "General" work-
space type.
Hint: If you still want to do this, create a copy of the "General" workspace
beforehand to preserve it.
● You can change the tab type from "Selection" to "Analysis" and vice versa.
● You can paste the tabs from the "General" to the "Selection/Analysis" workspace.
● Last "Selection" and "Analysis" tabs cannot be removed.
● "Selection" and "Analysis" tabs are always grouped on one side of the workspace
and cannot be mixed.
Under "Analysis", open the "Call Error Analysis" workspace; this workspace aims at
drilling down through the analysis of failed and dropped calls.
● "Overview" tab. Shows how dropped and failed calls are distributed in general,
and by campaign.
● "Failed Calls" tab. Shows a statistical analysis of failed call setup failures for a
specific network operator.
● "Dropped Calls" tab. Shows a statistical analysis of dropped calls for a specific
network operator.
● "Drill Down" tab. Allows investigating the root cause of individual failed or drop-
ped calls; e.g., you can identify failed handovers by looking at the related tables
and the layer 3 information.
SmartAnalytics Scene analyzes the A-side and the B-side of the call separately, and
provides you, among other things, with the possible reasons for failed and dropped calls.
You can access the call analysis information:
● In the values "Description A Side" and "Description B Side" from "Voice Services >
Call Results > Info":
Under "Analysis", open the "Call Setup Analysis" workspace; this workspace aims at
drilling down through the analysis of call setups with a focus on the call setup time.
● "Call" tab. Shows a statistical analysis of call setup times for a specific network
operator.
● "Drill Down" tab. Allows investigating the root cause of individual call setup time;
e.g., you can look at the time elapsed between different KPIs of the call setup
sequence.
Under "Analysis", open the "Data Validation" workspace; this workspace aims at
reviewing the automatic or manual validation performed on a database, which is often
required as data cleansing before reporting.
Under "Analysis", open the "Throughput Analysis" workspace; this workspace aims at
drilling down through the analysis of various data services that deliver throughput
results, such as capacity or browser tests.
Test-based aggregation values are statistical HTTP capacity test radio values (aver-
age, maximum and minimum), aggregated to a test level.
The aggregated values are calculated for the reporting period based on the service
KPIs:
Data service     | Start trigger (NPS 1.x)             | End trigger (NPS 1.x)             | Start trigger (NPS 2.x)    | End trigger (NPS 2.x)
HTTP DL Capacity | Start trigger of KPI 30465 or 30495 | End trigger of KPI 30465 or 30495 | Start trigger of KPI 30486 | End trigger of KPI 30486
HTTP UL Capacity | Start trigger of KPI 30466 or 30496 | End trigger of KPI 30466 or 30496 | Start trigger of KPI 30485 | End trigger of KPI 30485
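For quick reference, the trigger KPI IDs from the table above can be expressed as a lookup. This is an illustrative Python mapping only; `CAPACITY_TRIGGER_KPIS` is not part of the SmartAnalytics Scene product or API:

```python
# Trigger KPI IDs per data service and NPS version, taken from the table above.
# For NPS 1.x, the start and end triggers come from either of the two listed KPIs.
CAPACITY_TRIGGER_KPIS = {
    ("HTTP DL Capacity", "NPS 1.x"): (30465, 30495),
    ("HTTP DL Capacity", "NPS 2.x"): (30486,),
    ("HTTP UL Capacity", "NPS 1.x"): (30466, 30496),
    ("HTTP UL Capacity", "NPS 2.x"): (30485,),
}

# Example: which KPI provides the triggers for HTTP UL Capacity under NPS 2.x?
print(CAPACITY_TRIGGER_KPIS[("HTTP UL Capacity", "NPS 2.x")])  # (30485,)
```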
Test-based aggregation values are supported in SQL Server 2019 only.
To calculate the test-based aggregation values, a full reprocessing of the database is
needed. To enable it, disable "Skip calculation of Capacity Test Based
Aggregation values in ETL" under "Database > Configuration > Processing":
● The CDR is a data structure that aggregates information based on the Test level.
● The set of values gives information about dates, coordinates, radio levels and qual-
ity, and transfer duration.
● The list of CDR combined values for a time-based table is shown in Figure 9-26.
● The set of CDR combined values is in the value tree items under:
– "CDR > Data Services Combined".
– Move the cursor over the icon to view a description.
● CDR combined is available for NPS services, interactivity and ping tests:
– Capacity UL/DL
– Browsing
– YouTube
– HTTP Transfer UL/DL
– Messaging
– Dropbox
– Interactivity
– Ping
Panels
The CDR combined values can be used in:
● Time-Based Table
● Bubble Chart
● Time Based Map
● Line Chart
● Customizable CDF Chart
● Data-Validation Chart
● Time-based Value List
● The CDR is a data structure that aggregates information based on the Test level.
● The set of values under "Voice Calls" gives information about dates, coordinates,
sessions, radio levels and quality, and listen quality MOS (reported mainly at the
start of a test for A side and B side devices in a voice call).
● The full set of Voice CDR values is available in the value tree items under: "CDR >
Voice Calls".
2. Open the context menu next to the chosen database and select "Process".
3. Select the "Full processing" option in "Process database" and click "OK".
9.4 CM360°
This section helps you familiarize yourself with the CM360° solution and functionality.
Go to the "Signaling" tab and analyze the relevant layer 3 messages using decoded
view.
Use the KPI table to visualize the identified procedures. You can also use the time
synchronization feature to synchronize a specific KPI with the associated protocol
message in the protocol view:
To generate a CMsequencer script or network parameters from the field logs, first
select one or more layer 3 messages.
Each message entry in the protocol view has a selection box, which is enabled only for
the field-to-lab supported messages.
Go through the logs and select appropriate message(s) to reproduce the signaling
behavior:
Select "Clear selection" from the overflow menu, to clear your selection:
Script generation status is available in the notification view. Go to the notifications area
and download the generated script and the network parameters.
9.5 L0 Insights
"L0 Insights" enables you to understand the trends and gain valuable insights into your
network performances.
"L0 Insights" consists of two main groups:
● "Aggregations"
● "Trends"
9.5.1 Aggregations
9.5.2 Trends
"NPS Trends" gives you an overview of the NPS total, voice and data results per oper-
ator with an easy way to filter the main high-level criteria.
The "NPS Trends" panel consists of several tiles. Tiles are displayed as gauge or bullet
charts and are divided per operators. They show:
● Total NPS
● Voice NPS
● Data NPS
Figure 9-31: NPS Trends panel shows you NPS results per operator
You can also filter the data by "Country", "Campaign", "Collection", "Category" and
"Operators".
Click the chart or any of the lines to open a pop-up window with detailed information.
The pop-up window contains four different tabs:
● "OVERVIEW": gives you the NPS points overview based on the comparison crite-
ria.
● "BEST/WORST": shows you the best and worst KPIs of the provider.
The corresponding "L1 NPS" tab opens up where you can perform further analysis:
The Y-axis reports the trend of a statistic measure from the value tree item:
Map panel
In a map panel, you can only add a bin layer:
Compare criteria do not apply to the map panel and filtering can be done only through
the top header panel of the Level 0 workspace.
1. Select "Edit Insights Filter" from the L0 workspaces context menu to open the "Edit
Insights Filter - Aggregations Workspace" dialog.
4. Drag and drop the items from the left-hand side ("Predefined filters" or "Custom fil-
ters") to the empty containers on the right-hand side. Click again to add the item to
the global filters. Note: You can configure up to eight such global filters.
To create a custom filter, click "Select item" under the "Custom filter" items,
then choose the value item from the "Select value" list. Note: You can define up
to three custom filters.
9.6 L1 Statistics
Scanner coverage classification values are based on the signal level for each available
technology (5G SS-RSRP, 4G RSRP, 3G RSCP and 2G RxLev).
Each class value is calculated per map bin level. For the best results, we recommend
setting 100 m as the base bin size. For more information, refer to Map bin settings.
The following classes are available:
● Deep Indoor (> -60 dBm)
● Indoor (from -60 dBm to -75 dBm)
● In car (from -75 dBm to -90 dBm)
● Outdoor (from -90 dBm to -115 dBm)
● Limited or No Service (from -115 dBm to -135 dBm)
● No Coverage (< -135 dBm or no measurements)
● Inconclusive. This class occurs when the information is not reliable due to insuffi-
cient data available in the map bin. To improve results, increase the base bin size.
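As an illustration, the classification above can be expressed as a simple threshold mapping. This is a sketch with assumed boundary handling at the class edges; `classify_coverage` is a hypothetical helper, not part of the SmartAnalytics Scene product:

```python
def classify_coverage(level_dbm):
    """Map a signal level (e.g. 4G RSRP in dBm) to a coverage class, using
    the default thresholds listed above. Boundary values are assumed to fall
    into the stronger class; the product may handle them differently."""
    if level_dbm is None:          # no measurements in the map bin
        return "No Coverage"
    if level_dbm > -60:
        return "Deep Indoor"
    if level_dbm > -75:
        return "Indoor"
    if level_dbm > -90:
        return "In car"
    if level_dbm > -115:
        return "Outdoor"
    if level_dbm >= -135:
        return "Limited or No Service"
    return "No Coverage"

print(classify_coverage(-70))   # Indoor
print(classify_coverage(None))  # No Coverage
```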
To customize the coverage levels, go to "Database > Configuration > Analysis" tab.
You can display scanner coverage classification values on the bar chart, pie chart,
CDF-PDF line chart, maps and in the value list.
Channels/frequencies that are not found on air but are configured for the measurement
are classified as "No coverage".
Without administrator rights, BTS Manager will be visible, but users will only see a
"Permissions required to access BTS Manager" message.
The "List" tab shows the list of cells available in a selected container.
The container can be changed from a drop-down menu on the top right corner.
Click on a cell to show a pop-up window containing information about the following:
● Cell Name / Cell ID
● Technology
● MCC, MNC, TAC
● EARFCN, PCI
● Direction
● Longitude, Latitude
● User data
Selecting a container
1. Select a container from the drop-down list available in the Container field (see Fig-
ure 9-40).
– A "Map overview" dialog box is open, offering a preview of the mapping, indi-
cating column numbers and names. An option to remove unwanted mappings
before the import is also provided.
● Set the default Technology, MCC and MNC.
● Set the time validation for an imported list (valid from now or from a specific date).
● Select or create an Import profile.
Import Settings
● The settings fields allow users to set the default Technology, MCC and MNC.
● The cell list must be valid from a specific date. Users can choose whether to vali-
date from the time of import or from a selected date.
● A toggle switch allows users to outdate cells already present in the container at the
time of the new import or from the selected date.
● It is also possible to create and select an "Import profile" in the drop-down list.
► Toggle "Add header to CSV file" to add the column header to the CSV file.
10 Filtering Capabilities
The following rules apply for filters:
● You can apply filters on all workspaces, including the dashboard.
● The filter affects the entire workspace (global filter), i.e., it persists if you switch
from one tab to another and affects all panels present on a page.
● SmartAnalytics Scene saves the applied filter when you leave the workspace and
reopens this workspace with the same filter applied.
● You can save the filter in the "Filter favorites" and use it across different databases.
Filters are saved per user.
3. Disable "Apply filters immediately" if you do not want to apply the filter immediately
after having defined it. The toggle is enabled by default.
b) Click the value item displayed in the green box to validate your selection.
c) Alternatively, select another value item (in the left part) and replace your initial
selection by clicking in the orange box to validate the change.
If you disable "Apply filters immediately" and reach step 6, SmartAnalytics Scene dis-
plays:
Bars that would have been removed by the immediate application of the filter are still
displayed, but with a hatching. Click to apply the filter, i.e., display Figure 10-5 view.
Note: Current week (month or year) includes the whole current week (month or
year).
5. For the "previous" options, enter the "X" value. The time range starts with the cur-
rent day (week, month or a year) and extends "X" days (weeks, months or years) to
the past.
At the end, the configured dynamic filter is added as a global filter to the work-
space.
You can configure only one dynamic filter. If the dynamic filter is already present the
option "Add dynamic filter" is greyed out.
You can also configure a dynamic filter locally for the single panel.
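The "previous X" time range described above can be sketched as follows. This is an illustrative helper assuming the range is inclusive of the current day; `previous_x_days` is not part of the product:

```python
from datetime import date, timedelta

def previous_x_days(x, today=None):
    """Illustrative sketch of the "previous X days" dynamic-filter range:
    the range starts with the current day and extends X days into the past.
    Inclusive-of-today behavior is an assumption, not the product's spec."""
    today = today or date.today()
    return today - timedelta(days=x), today

# Example: "previous 7 days" evaluated on 2024-01-10.
start, end = previous_x_days(7, date(2024, 1, 10))
print(start, end)  # 2024-01-03 2024-01-10
```

The same idea extends to weeks, months or years by scaling the offset accordingly.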
Alternative
As an alternative to the filtering procedure explained above ("To add and apply a filter"
on page 135), create a filter directly, by clicking:
● Panels' elements like bars, pie charts, and legend items.
● Maps' elements like bins and regions.
● Session overview's elements like technology in use, current test, session status.
● Filter icon next to the dimension's value in the time-based table and the time-
based value list.
– The "Operator" is set to "Equals". Press CTRL before selecting the value, to
create the filter with the "Operator" "Not equals".
– To create several filters at once, press SHIFT beforehand. Releasing the SHIFT
key makes the filters active (the filters' color changes from green to blue).
– Use CTRL + SHIFT to create multiple filters with the "Operator" "Not equals".
To edit a filter
1. Click the edit icon:
3. Click "OK".
To remove a filter
The behavior differs depending on whether you enable or disable "Apply filters immedi-
ately".
► "Apply filters immediately" enabled:
● Click "X" for each filter you want to remove.
"Apply filters immediately" disabled:
● Click "X" for each filter you want to remove. The filter turns red.
Filtering by device
1. Open a workspace/the dashboard.
By moving the yellow line, i.e., the time synchronization line, you update all time-
based views.
To apply the saved filter on the workspace, click the filter name from the "Filter favor-
ites":
3. Click "OK".
Geofence polygon filter acts as a global filter.
11 Data Visualization
On any workspace, click the edit icon and then use the plus icon "+" to add a new
visualization panel. These panels are also referred to as widgets or views.
There are many different widgets available. Some examples:
● Table
● Detail views like protocol view
● Map
● Line charts
● UE Views
Most importantly, there are two types of widgets:
● Time-based: For displaying data as in an NQDI session analysis (e.g. L3, time-
based table, map point layers).
● Statistical: For compiling dashboards as in a report (e.g. bar chart, statistics table,
map tile layers).
The color of the line chart is derived from the color of the split-by value (defined in
"Settings > Colors") and a scope option.
The tooltip of a bar shows, besides the actual value, additional information such as
"Count", "Average", "Minimum" and "Maximum" of all the (possibly filtered) data in the
database.
To configure a bar chart or most other widget types, simply assign values from the
value tree to the highlighted fields in the configuration area, as shown.
Hovering over a value in the tree shows a short explanation of the value at hand. When
you click an already assigned field, the value tree jumps to the location of the value in
the tree structure.
The preview of your chart updates automatically whenever you assign a new value to
the configuration and there is data that can be displayed. This applies to all
widget types.
Clicking "OK" saves the configured panel and adds it to the workspace.
Further customization options for all chart types can be found on the "OPTIONS" tab.
The available options may vary, depending on the selected values (e.g. stacking
requires that a split-by criterion has been defined).
All bar chart options:
● Vertical/horizontal orientation
● Stacking options
● Value labels
● Maximum number of bars shown
● Color by
● Min/max ranges
● Bar ordering
Local filtering
In more recent versions of SmartAnalytics Scene, we introduced local filtering of values
for most widgets. This greatly enhances your possibilities to customize charts for a
specific analysis context, e.g. by restricting the data to only failed sessions for error
analysis.
The possibility to add such filters is accessible from the dedicated tab "Filters". Click
"ADD FILTER" to add a filter that reduces the data shown on the bar chart, e.g. to
show only bars for specific categories of tests.
Please note that these filters are part of the workspace configuration and are always
applied. Read-only users of the workspace are not able to disable or change this filter.
Figure 11-8: A simple statistics table showing the correlation between throughputs and block errors
To configure a statistics value list, simply assign values from the value tree to the high-
lighted fields in the configuration area as shown.
Figure 11-11: Select and assign values to the statistics value list
Figure 11-12: Another example of how the value list can be used
More customization options can be found on the "OPTIONS" tab. All available options:
● Vertical layout
● Group split-by items
● Scope:
– "All"
– "Current UE"
– "Current session". The "S" sign is displayed in the left corner of the panel
header, if the "Show title bar" is disabled in the panel settings).
– "Current test". The "T" sign is displayed if the panel header is hidden.
For the "Current UE" scope, you can filter which devices to show in the panel:
● "S or A & B side"
● "S or A side"
● "B side"
● Min/max ranges
● Bar ordering
11.7 Maps
The map widget in SmartAnalytics Scene supports three different types of map layers:
point layers for displaying time domain data, and bin and region layers for dis-
playing statistical results. Thus, the map is the only widget that can be used to visual-
ize time domain data and aggregated statistical data within the same panel.
2. Select a value from the value tree and assign it to the layer. Depending on the
value type ("Time" or statistical like "Average"), you can choose a different visuali-
zation type.
Note: You cannot change the type of a layer after creation, so you should be clear
about whether a bin, region or point layer is required.
Regarding general settings, the map supports changing the panel title and overriding
the default background map tiles with a different style.
The first layer type to address in detail is the region layer. It displays a statistical value
for a whole area like a country, province or city. By clicking a highlighted area of the
region layer, you can drill into the next lower level of region shapes, e.g., from country
to region. At the same time, a global filter is added to limit the data to the chosen
parent region, therefore aligning the visible child regions with other views like bar
charts.
The customization options for region layers are limited to choosing a dimension value
like "Home Operator" to color the regions. Also, under "OPTIONS" you can specify
whether to use the best or worst value for the shading of the area, e.g., the color of the
operator with the highest or the lowest throughput in a region.
The bin layer type displays a statistical value similar to the region layer. However, in
addition to selecting a highlighted raster bin area for filtering and drilling into the next
lower level, the bin layer also supports switching to a smaller bin size automatically
when you zoom into the panel using the mouse wheel or the "+" and "-" buttons in the
map area. Furthermore, the bin layer comes with some additional customization
options, like the "DELTA MODE".
The last and most complex customization option for bin layers is the delta mode. It
allows you to plot the delta of two values on the map, e.g. the difference between the
average throughput of Operator A and Operator B. Activate this mode by first selecting
the tab "DELTA MODE". Then, choose from further options:
● Use the absolute delta or the delta in percent.
● The theme profile to use for coloring the delta values (note that zero means that
both values are equal).
To plot a delta, you also need to define which two values you would like to compare.
This happens on the "VALUES" tab:
1. First, select the "Delta value" to split by the grid value.
2. Select Value A and Value B from the delta value for comparison.
When this is done, the map shows, for each bin, the difference between Value A
and Value B.
If Value A or Value B is missing in a bin entirely, the bin is colored in light grey. You
can also see it from the bin tooltips.
Figure 11-22: A delta plot with absent values for the second selected PCI
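The per-bin computation behind the delta mode can be sketched as follows. `bin_delta` is a hypothetical helper, not a SmartAnalytics Scene API; it returns None when either value is absent, matching the light-grey rendering described above:

```python
def bin_delta(value_a, value_b, percent=False):
    """Illustrative sketch of the delta shown for one map bin, not the
    product's internal implementation. Returns the absolute delta, or the
    delta in percent of Value B; None means "not comparable" (grey bin)."""
    if value_a is None or value_b is None:
        return None  # one of the values is missing in this bin
    if percent:
        if value_b == 0:
            return None  # assumption: avoid division by zero
        return 100.0 * (value_a - value_b) / value_b
    return value_a - value_b

# Example: Operator A averages 50 Mbit/s in a bin, Operator B 40 Mbit/s.
print(bin_delta(50.0, 40.0))                # 10.0  (absolute delta)
print(bin_delta(50.0, 40.0, percent=True))  # 25.0  (delta in percent)
```

A delta of zero means both values are equal, which is why the theme profile's midpoint is centered on zero.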
All map views in SmartAnalytics Scene automatically display floorplans on the map
for indoor measurement data collected with ROMES4, SmartBenchmarker and QualiPoc.
● It is possible to keep drive test and indoor data in the same database.
● It is highly recommended to use a georeferenced floorplan in the measurement.
Please consult the corresponding data collection tool’s manual on how to create a
georeferenced indoor measurement.
While zoomed out on the map, you see a building indicator that allows you to select
a specific site for which you would like the corresponding floor plan and indoor data.
Once you have zoomed in on a specific building, a floor level indicator allows you to
switch between the different floors or levels of the selected building or site.
The indoor feature is compatible with any type of layer and works with UE and scanner
measurements alike. Furthermore, the indoor measurement data is automatically
aggregated by building and floor, which allows for a statistical comparison between
separate floors or buildings.
Figure 11-25: Example of a point layer with UE SINR shown on top of an indoor floor plan
6. Click "OK".
A configuration panel is opened.
Tip: Move the mouse over the route to display an arrow showing the direction and
the timestamp of the route.
9. Enable "Render layer with predefined offset" .
Offline maps functionality allows you to access the map data without internet connec-
tion.
SmartAnalytics Scene first checks if the offline map server is running; otherwise, the
online maps are used.
2. Select "My Products > Mobile Network Testing" in the drop down menu.
3. Select "SmartAnalytics".
You find the customization options for point layers on the "OPTIONS" tab. The options
allow you to:
● Choose which value to display among "Best", "Worst" and "Average", if there are
multiple values available at the same point - "OPTIONS" / "Value priority".
● Choose a different shape for the points - "OPTIONS" / "Icon".
● Choose whether to display the actual value as a label right next to each point -
"OPTIONS" / "Show labels".
● Choose the size of the point icon - "OPTIONS" / "Size".
● Enable plotting more point layers on the same map with a predefined offset -
"OPTIONS" / "Render layer with predefined offset". In addition, you can customize
the scaling factor of the offset by changing the "Point layer offset factor" in the
"SETTINGS".
● Choose whether to display points in a fixed color instead of the normally assigned
color - "OPTIONS" / "Default color".
When you add the first point layer to a map, two additional layers appear automatically.
● The first of those layers is the "Base Stations" layer. It works in the same way as
the base station layer on the BTS Manager map, but with some extra options. By
default, it only displays cells that are linked to the active measurement data on the
point layer. However, you can choose to display all imported BTS.
● The other layer, "Line to Cell", draws lines between cells and the related measured
points on the point layer. You can configure when to display these lines (showing
all possible lines can be overwhelming) and by which criteria (e.g. the carrier index
or EARFCN of the BTS) to color the lines. Otherwise, lines appear by default in the
color of the point they connect to.
For more information on BTS Manager, refer to
Manual - [Link].
You can export the map points only. The export of map bins is not supported.
If you click a line in the table, all the other time-based views synchronize to the nearest
possible point in time to the timestamp of the selected record. If that row belongs
to a different UE, the selected device in the session overview and any other
device-specific views, like the Layer 3 view, also switch to show data from the
newly selected device.
If you want to show the data from a single device only, configure the chart with
"Depend on device selection" enabled on the "OPTIONS" tab. This is particularly
useful for data like IP Trace or other trace information.
Figure 11-30: Show only data from the currently selected device on a time-based table
Figure 11-31: A value for PDSCH throughput last reported 0.167s before the current time refer-
ence
Figure 11-32: Fade-out behavior for time scoped values as the 3rd carrier vanishes
To configure a value list, simply assign values from the value tree to the highlighted
fields in the configuration area as shown.
Figure 11-33: Select and assign values to the value list monitor
More customization options can be found on the "OPTIONS" tab. All available options:
● Vertical layout
● Group split-by items
The line chart operates with time-based data for session analysis only.
● If you add more than a single value to the chart, time series are shown only for the
currently selected UE.
● If you choose only a single value, the chart shows data from multiple UEs.
For values that need to be split by carrier index, like LTE RSRP, a corresponding
split-by criterion is automatically added and used.
Figure 11-36: A line chart automatically splitting a value by the required criteria
To configure a line chart, simply assign values from the value tree to the highlighted
fields in the configuration area as shown.
● "Style"
● "Shade"
● "Size"
● "Opacity"
● "Span gaps"
● "Stepped line"
● Click the eye icon to hide the value on the chart. This setting is not persistent
on a tab change.
● Link the mouse wheel zooming with the line chart by clicking the link icon.
This setting is persistent.
● Set boundaries for both axes in the chart view with the arrow icons available in the
corner of the chart.
● Chart view also has a zoom function and can be navigated at different zoom sizes.
Panel configuration
● The X axis shows the frequency and the Y axis the power value. Blue and yellow
vertical lines are upper and lower limits.
● To show the data, users must select a test of type "speech".
● The values are preconfigured and cannot be changed. Only local filters, the custom
panel title and the description can be set.
● The "Data Validation chart" is a new panel available in the Line charts group.
● It allows users to display a measure value over an entire day and not just for the
duration of a test or session.
● The Data Validation chart is available in L1 Statistics or Custom workspace.
Data visualization
● The time focus of the chart is a single day, displayed in the chart title.
● The range of time is the largest for that day. It can be adjusted with the slider at the
top of the panel.
● If there are no date filters, the chart will take the data from the latest day visible in
the DB.
● Filters can be configured globally in the workspace or locally in the panel.
● Users can configure the measure for the Y axis, a dimension for Group by and a
dimension for Split by in the "Values" tab of the configuration panel.
– Split by dimension is supported for data bars only, not for line (solid chart type).
● Users can configure the "Chart line type", the "Data decimation type" and "Scale
groups individually along the Y axis, relative to their data" in the "Options" tab.
– In "Chart line type" choose between "Data bars" and "Solid" (lines).
– Reduce data when there is not enough space to display all the data, using the
"Data decimation type" option. Options include "None", "Min/Max" and "Largest
Triangle Three Bucket".
– Check "Scale groups individually along the Y axis, relative to their data" to give
each group an individual height within the Y axis.
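The idea behind the "Min/Max" decimation option can be illustrated with a short sketch: split the series into buckets and keep only each bucket's extreme samples, so peaks and dips survive the downsampling. This is an illustrative approximation, not the product's exact algorithm, and `minmax_decimate` is a hypothetical helper:

```python
def minmax_decimate(points, buckets):
    """Reduce a series of (t, y) points by keeping only the minimum and
    maximum sample of each bucket. Illustrative sketch of "Min/Max"-style
    decimation; not SmartAnalytics Scene's internal implementation."""
    if len(points) <= 2 * buckets:
        return list(points)  # already small enough, keep everything
    out = []
    size = len(points) / buckets
    for i in range(buckets):
        chunk = points[int(i * size):int((i + 1) * size)]
        if not chunk:
            continue
        lo = min(chunk, key=lambda p: p[1])  # lowest value in the bucket
        hi = max(chunk, key=lambda p: p[1])  # highest value in the bucket
        out.extend(sorted({lo, hi}, key=lambda p: p[0]))  # keep time order
    return out

pts = [(0, 0), (1, 5), (2, 1), (3, 9), (4, 2), (5, 8), (6, 3), (7, 7)]
print(minmax_decimate(pts, 2))  # [(0, 0), (3, 9), (4, 2), (5, 8)]
```

The "Largest Triangle Three Bucket" option is a more sophisticated alternative that additionally preserves the visual shape of the line.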
11.13 UE Views
"UE Views" is composed of four panel options:
● "LTE Neighbors"
● "5G NR Neighbors"
● "RLC/MAC"
● "PDCP"
● "5GNR RLC"
● "5GNR PDCP"
The panel gives you the information about LTE neighbors for a specific timestamp.
The panel is already formatted with predefined default information about "Carrier
Index", "DL EARFCN", "PCI LTE", "RSRP", "RSRQ" and "RSSI".
You can add additional information to the "Optional items" from the value tree items.
"SETTINGS" allows you to customize the panel title.
11.13.2 5G NR Neighbors
The panel gives you the information about 5G NR neighbors for a specific timestamp.
The panel contains default predefined items with information about "DL NR-ARFCN",
"PCI 5G NR", "Cell Type", "SSB Index", "SS-RSRP", "SS-RSRQ".
You can add additional information to the "Optional items" from the value tree items.
"SETTINGS" tab allows you to customize the panel title.
11.13.3 RLC/MAC
The RLC/MAC (5GNR RLC) panel shows you LTE RLC/MAC (5GNR RLC) data that is
available for the time-based analysis.
You can also create a value list or line chart with RLC and MAC data from the value
tree items.
Panels are already configured to report the values in time and for the "Current UE" scope.
You can customize "Mode" and "Direction" settings:
● "Mode"
– "RLC Configurations"
– "RLC Statistics"
– "MAC Statistics"
● "Direction"
– "Uplink"
– "Downlink"
– "Uplink & Downlink"
11.13.4 PDCP
The PDCP (5GNR PDCP) panel shows you LTE PDCP (5GNR PDCP) data that is available
for the time-based analysis.
Panels are already configured to report values in time and for the "Current UE" scope.
You can customize "Mode" and "Direction" settings:
● "Mode"
– "PDCP Configurations"
– "PDCP Statistics"
● "Direction"
– "Uplink"
– "Downlink"
– "Uplink & Downlink"
Regarding customization, you have the possibility to add a few additional columns to
the table:
● WB RSRQ and RSSI, for TopN 4G view.
● SS-RSRQ, RSSI and general information about MCC, MNC and Operator, for TopN
5G view.
Figure 11-42: Configuring additional columns for the scanner TopN 5G view
You can add three different types of detail views to your workspace:
● Protocol view with UE layer 3 information.
● Anomaly Detail view for machine learning algorithm results.
● UE Capabilities view with supported UE capabilities.
Protocol view allows you to see the layer 3 message flow of the currently selected
device and decode the selected message. Both areas of the panel can be searched
separately.
The Protocol view panel is already formatted and contains information about the time,
the layer and the message info.
● You can add other columns from the value tree items.
● Resize the left and right panels with the adjustable vertical splitter.
● You can customize, under "OPTIONS", settings such as device side to show, the
aspect ratio and the information levels to expand.
● You can customize the panel title under "SETTINGS".
11.15.3 UE Capabilities
The UE capabilities view displays information about supported RAT, device category,
supported bands / technologies, etc.:
The information is taken from the Layer 3 "Capability Information" message and reported
in a dedicated panel. Apart from adjusting the title and selecting the default device
side, no other customization options are available.
11.16 Divider
The divider is a horizontal bar that groups panels in the workspace.
● The headers of all panels below the divider are colored with a similar color as the
divider itself.
● By collapsing the panels below the divider, you gain some extra space on your
workspace:
11.17 Waveform player
The waveform player replays speech samples in the test scope and call setup recordings in
the session scope:
11.18 ACD
The automatic channel detection (ACD) panel shows information about the channels
detected during the ACD scanning procedure:
● The panel shows frequency, bandwidth, channel number and operator information
of the found channels. Hover with a mouse over the channel to open the tooltip
with basic information.
● The frequency overview shows you all channels per technology on one axis only. You
can zoom in on the area of interest.
To activate it, toggle the switch at the top right corner of the panel.
11.19 Data-selection
The data-selection panel enables you to create and configure a set of filters from the value
items and apply them in one step. You can add the data-selection panel to all work-
space types except the L0 scenario and the Analysis tab of the Selection/Analysis
workspace.
● Click "Apply Filters" to apply the configured filters as global filters for the work-
space. The state of the data-selection filter persists for the current user / work-
space / database.
● Click "Clear Selected Filters" to clear the effect of the data-selection filters.
11.20 Bubble chart
The bucket size of both the X and Y axes is now customizable in the "Bubble chart" panel.
● The "Values" tab of the "Configure Bubble-Chart panel" window has two "bucket
size" fields for X and Y values.
● Users can customize the size and the number of occurrences of each bubble in the
chart by changing the bucket size.
● To customize the color and transparency of the chart, click on the "Options" tab in
the "Configure Bubble-Chart panel".
● In the scatter chart, each point has the same size, and the count information is only
available in the mouse-hover tooltip.
● The scatter chart allows better readability in cases where some of the bubbles are
too large and may block the overall view.
● To enable split by dimension, navigate to the "Values" tab in the "Configure Bubble-
Chart panel" and add a dimension item.
● If a split-by dimension is configured, the chart data is split into multiple datasets,
where each dataset corresponds to a dimension value with its own legend item.
● In this case, the color of the bubbles is the thematic color of the split-by dimension
value and cannot be configured in the color option of the chart.
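The bucketing idea described above can be sketched in a few lines. This is an illustrative model only, not the product's implementation; the function name, bucket sizes and sample points are made up:

```python
from collections import Counter

def bucketize(points, x_bucket, y_bucket):
    """Group (x, y) samples into buckets; the count per bucket
    determines the size of the corresponding bubble."""
    counts = Counter(
        (int(x // x_bucket), int(y // y_bucket)) for x, y in points
    )
    # One bubble per bucket: bucket-center coordinates plus occurrence count.
    return [
        ((bx + 0.5) * x_bucket, (by + 0.5) * y_bucket, n)
        for (bx, by), n in sorted(counts.items())
    ]

# Two nearby samples fall into the same bucket and merge into one bubble.
bubbles = bucketize([(1, 1), (1.4, 1.2), (5, 7)], x_bucket=2, y_bucket=2)
```

A larger bucket size merges more samples per bubble, which is why changing it alters both the number and the size of the bubbles.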
To export to CSV:
1. Navigate to the context menu.
11.21 Text
A new "Text" panel type is available for annotations that are directly visible on the
workspace.
This allows text to appear on screenshots or PDF exports.
To add text:
1. Select the edit icon and click on the "plus" icon to open the "Data Visualiza-
tion" panels.
2. Select "Text".
11.22 Events
Events can be visualized in a new table panel, as markers in a line chart, and in a map
panel.
To view all values click the "Options" tab and activate "Show additional columns".
5. Click "OK".
Tip: Delete any empty rows using the icon, then click "OK".
This color is shown with the chosen event under the value column.
RRC events
RRC events are processed if the option is active in the database profile.
5. Select the group "SA Event Markers", or any of its subgroups, using the check boxes.
Selected events are added to the line chart.
● The markers on the line chart can be used with the Events panel for analysis.
● Users can time-synchronize a marker in the line chart, and the corresponding row
in the event panel is highlighted.
● The coloring rules configured in the Events panel are also applied to the line chart
markers.
3. Click "Save".
5. Activate "Show event markers" toggle to show event markers in the map panel.
8. [Optional] Configure a color for an event (see "Events panel color" on page 209).
The event will be shown on the map with a flag placeholder.
It contains the following subgroups that allow you to limit the processing of data during
the import:
● Chapter [Link], "The Common Subgroup", on page 216
This subgroup contains options for video tests, interactivity, browsing, and IP throughput.
There are additional options to enable the Capacity Test Based Aggregation and the
new CDR Combined data.
The Data/Video Tests subgroup enables/disables the following options:
● Video Interframe Details
● Interactivity Packet Details
● Intermediate Browsing IP Ramp Up
● Intermediate IP Throughput
● Capacity Test Based Aggregation Data
● CDR Combined data
12.2 Analysis
Navigate to "Database > Profiles" click the icon to "Add database profile" and select
"Analysis Options". There you can define the threshold below/above which SmartAna-
lytics Scene reports problems.
Example:
SmartAnalytics Scene reports a base station issue if the RSRP power level is above
the value you define.
We recommend that you adjust these values before importing data. Updating these
values on an already populated database triggers the reprocessing of the database.
The following table lists the configuration groups and the values for which you define
the thresholds.
Table 12-1: Analysis thresholds
Mobile Coverage Analysis
General
● Min Raster Sample: Minimum number of samples per bin to produce an entry in the final analysis result for the serving cell statistics.
GSM Analysis
● Min Coverage Power: Coverage problems are reported if the averaged level of the raster is below this threshold.
UMTS Analysis
● Min Coverage Power: Coverage problems are reported if the averaged level of the raster is below this threshold.
LTE Analysis
● Min Coverage Power: Coverage problems are reported if the averaged level of the raster is below this threshold.
5G NR Analysis
● Min Coverage Power: Coverage problems are reported if the averaged level of the raster is below this threshold.
● PUSCH TxPower Threshold: Mobile coverage problems are reported only if the measured PUSCH TxPower is higher than or equal to this value.
Scanner Coverage Analysis
General
● Report no 2nd in TopN: Report a network problem if there is no second best in TopN.
● Report Problem - Minimum Measurements: Report a problem if there are at least N measurements (minimum) as basis.
● Analysis up to this TopN position: All TopN entries below this position are not taken into account for all analyses.
GSM Analysis
● Perform analysis for this technology - ON/OFF: Enable/disable the aggregation of scanner measurements and problem analysis.
● Coverage - RxLev Limit: Coverage problems are reported if the averaged level of the Top entry in a raster is below this threshold.
● Interference - RxLev Power Minimum: An interference issue is reported only if the power value is above this value and the quality criterion is below the Interference Quality Maximum threshold.
● Interference - CtoI Quality Maximum: An interference issue is reported only if the power value is above the Interference Power Minimum value and the quality criterion is below this threshold.
● Network Problem - First TopN RxLev Max: A network problem is reported if the power value of the first TopN in a cell is below this value and the second below the Second TopN RxLev Max threshold.
● Network Problem - Second TopN RxLev Max: A network problem is reported if the power value of the first TopN in a cell is below the First TopN RxLev Min value and the second below this threshold.
● Pollution - First TopN RxLev Threshold: A pollution issue is detected when the first TopN provides good coverage (checked with this value) and the power of the second best TopN relative to the first is within the delta below.
● Pollution - Second TopN RxLev Delta: A pollution issue is detected when the first TopN provides good coverage and the power of the second best TopN relative to the first is within this delta.
● Low number of measurements: Report a problem spot if the number of cells for this TopN is lower than the threshold.
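The interference rule above (for GSM, and analogously for the other technologies) combines two conditions: the power must be above a minimum and the quality criterion must be below a maximum. A minimal sketch; the default threshold values are illustrative placeholders, not product defaults:

```python
def interference_issue(power_dbm, quality_db, power_min=-85.0, quality_max=6.0):
    """Report an interference issue only when both conditions hold:
    power above the configured minimum AND quality criterion below
    the configured maximum (illustrative thresholds)."""
    return power_dbm > power_min and quality_db < quality_max

# Strong signal but poor quality: reported as interference.
assert interference_issue(-70.0, 3.0)
# Weak signal: treated as a coverage problem, not interference.
assert not interference_issue(-100.0, 3.0)
```

The power minimum prevents weak-coverage bins from being double-counted as interference spots.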
UMTS Analysis
● Perform analysis for this technology - ON/OFF: Enable/disable the aggregation of scanner measurements and problem analysis.
● Coverage - RSCP Limit: Coverage problems are reported if the averaged level of the Top entry in a raster is below this threshold.
● Interference - RSCP Power Minimum: An interference issue is reported only if the power value is above this value and the quality criterion is below the Interference Quality Maximum threshold.
● Interference - Ec/Io Quality Maximum: An interference issue is reported only if the power value is above the Interference Power Minimum value and the quality criterion is below this threshold.
● Network Problem - First TopN RSCP Max: A network problem is reported if the power value of the first TopN in a cell is below this value and the second below the Second TopN RSCP Max threshold.
● Network Problem - Second TopN RSCP Max: A network problem is reported if the power value of the first TopN in a cell is below the First TopN RSCP Min value and the second below this threshold.
● Pollution - First TopN RSCP Threshold: A pollution issue is detected when the first TopN provides good coverage (checked with this value) and the power of the second best TopN relative to the first is within the delta below.
● Pollution - Second TopN RSCP Delta: A pollution issue is detected when the first TopN provides good coverage and the power of the second best TopN relative to the first is within this delta.
LTE Analysis
● Perform analysis for this technology - ON/OFF: Enable/disable the aggregation of scanner measurements and problem analysis.
● Carrier combined analysis - ON/OFF: Enable/disable problem analysis per network operator (all carriers combined).
● Coverage - RSRP Limit: Coverage problems are reported if the averaged level of the Top entry in a raster is below this threshold.
● Interference - RSRP Power Minimum: An interference issue is reported only if the power value is above this value and the quality criterion is below the Interference Quality Maximum threshold.
● Perform signal quality analysis for RSRQ - ON/OFF: Activating this setting enables the signal quality analysis based on RSRQ from LTE Scanner data.
● Interference - RSRQ Narrowband Quality Maximum: An interference issue is reported only if the power value is above the Interference Power Minimum value, the quality criterion is below this threshold, and RSRQ is selected above.
● Perform signal quality analysis for SINR - ON/OFF: Activating this setting enables the signal quality analysis based on SINR from LTE Scanner data.
● Interference - SINR Narrowband Quality Maximum: An interference issue is reported only if the power value is above the Interference Power Minimum value, the quality criterion is below this threshold, and SINR is selected above.
● Perform network problem analysis - ON/OFF: Activating this setting enables the analysis for network problems. In case of a low coverage situation with no suitable second best server for a potential handover, a network problem is reported.
● Network Problem - First TopN RSRP Max: A network problem is reported if the power value of the first TopN in a cell is below this value and the second below the Second TopN RSRP Max threshold.
● Network Problem - Second TopN RSRP Max: A network problem is found if the power value of the first TopN in a cell is below the First TopN RSRP Min value and the second below this threshold.
● Pollution - First TopN RSRP Threshold: A pollution issue is detected when the first TopN provides good coverage (checked with this value) and the power of the second best TopN relative to the first is within the delta below.
● Pollution - Second TopN RSRP Delta: A pollution issue is detected when the first TopN provides good coverage and the power of the second best TopN relative to the first is within this delta.
● Perform Condition Number and Rank Indicator analysis - ON/OFF: Enable/disable Condition Number and Rank Indicator problem analysis.
● Condition Number and Rank Indicator - RSRP Power Minimum: The Condition Number and Rank Indicator problem analysis is executed only if the averaged power level of the top entry in a raster is below this threshold.
● Condition Number and Rank Indicator - Minimum Duration: The time duration of the Condition Number problem or the Rank Indicator problem must be longer than this minimum duration.
● Condition Number 2x2 Maximum: The Condition Number for MIMO mode 2x2 is calculated by the scanner. The calculated value is compared to the configured threshold and, if higher, a problem spot is created.
● Condition Number 4x2 Maximum: The Condition Number for MIMO mode 4x2 is calculated by the scanner. The calculated value is compared to the configured threshold and, if higher, a problem spot is created.
● Condition Number 4x4 Maximum: The Condition Number for MIMO mode 4x4 is calculated by the scanner. The calculated value is compared to the configured threshold and, if higher, a problem spot is created.
● Rank Indicator 2x2 Maximum: The Rank Indicator for MIMO mode 2x2 is calculated by the scanner. The calculated value is compared to the configured threshold and, if lower, a problem spot is created.
● Rank Indicator 4x2 Maximum: The Rank Indicator for MIMO mode 4x2 is calculated by the scanner. The calculated value is compared to the configured threshold and, if lower, a problem spot is created.
● Rank Indicator 4x4 Maximum: The Rank Indicator for MIMO mode 4x4 is calculated by the scanner. The calculated value is compared to the configured threshold and, if lower, a problem spot is created.
LTE Network Plan Analysis
● Perform network plan analysis - ON/OFF: Activating this setting enables the analysis for network plan problems. This algorithm analyzes the signal strength of the best serving cell in relation to other cells.
● Network Plan Analysis - Best Server Minimal RSRP: This parameter allows you to define the signal strength for the best server starting from which a network plan analysis makes sense. If the best signal is very low, this usually simply indicates the end of the coverage for a specific EARFCN, and this type of analysis is not meaningful.
● Network Plan Analysis - Cell Geometry Factor Threshold: High inter-cell interference is reported as a network plan problem if the average inter-cell geometry factor for the best cell of a specific EARFCN is lower than or equal to the value specified.
5G NR Analysis
● Perform coverage analysis - ON/OFF: Activating this setting enables the coverage analysis based on 5G NR Scanner data. Whenever the given threshold for signal strength is not reached on average for a bin, a coverage problem is reported.
● Coverage - SS-RSRP Limit: Coverage problems are reported if the averaged level of the Top entry in a raster bin is below this SS-RSRP threshold.
● Interference - SS-RSRP Power Minimum: This parameter allows you to define the minimal power level at which an interference issue is reported. If the power value is above this value and the quality criterion is below the Interference Quality Maximum threshold, an interference issue is reported for a bin.
● Perform signal quality analysis for SS-SINR - ON/OFF: Activating this setting enables the signal quality analysis based on SS-SINR from 5G NR Scanner data. Whenever the given threshold for SS-SINR is not reached for a bin, a coverage problem is reported, provided that the Interference Power Minimum was not undercut.
● Interference - SS-SINR Narrowband Quality Maximum: An interference issue is reported only if the power value is above the Interference Power Minimum value, the quality criterion is below this threshold, and SS-SINR analysis is activated above.
● Perform signal quality analysis for SS-RSRQ - ON/OFF: Activating this setting enables the signal quality analysis based on SS-RSRQ from 5G NR Scanner data. Whenever the given threshold for SS-RSRQ is not reached for a bin, an interference problem is reported, provided that the Interference Power Minimum was not undercut.
● Perform network problem analysis - ON/OFF: Activating this setting enables the analysis for network problems. In case of a low coverage situation with no suitable second best server for a potential handover, a network problem is reported.
● Network Problem - First TopN SS-RSRP Max: A network problem is found if the power value of the first TopN in a cell is below this value and the second below the Second TopN SS-RSRP Max threshold.
● Network Problem - Second TopN SS-RSRP Max: A network problem is found if the power value of the first TopN in a cell is below the First TopN SS-RSRP Min value and the second below this threshold.
5G NR Network Plan Analysis
● Perform network plan analysis - ON/OFF: Activating this setting enables the analysis for network plan problems. This algorithm analyzes the signal strength of the best serving cell in relation to other cells or beams.
● Network Plan Analysis - Best Server Minimal SS-RSRP: This parameter allows you to define the signal strength for the best server starting from which a network plan analysis makes sense. If the best signal for a channel is very low, this usually simply indicates the end of the coverage for a specific NR-ARFCN, and this type of analysis is not meaningful.
● Network Plan Analysis - Cell Geometry Factor Threshold: High inter-cell interference is reported as a network plan problem if the average inter-cell geometry factor for the best cell of a specific NR-ARFCN is lower than or equal to the value specified.
● Network Plan Analysis - Inter-cell Beam Geometry Factor Maximum: High inter-cell SSB beam interference is reported as a network plan problem if the average inter-cell beam geometry factor for the best beam of a specific NR-ARFCN is lower than or equal to the value specified.
● Network Plan Analysis - Intra-cell Beam Geometry Factor Maximum: Poor intra-cell SSB beamforming gain is reported as a network plan problem if the average intra-cell beam geometry factor for the best beam of a specific NR-ARFCN and PCI is lower than or equal to the value specified.
● Network Plan Analysis - Inter-cell Beam Dominance Minimum: Poor inter-cell SSB beam dominance is reported as a network plan problem if the average number of inter-cell beams within only 3 dB of the best serving beam of the same NR-ARFCN is greater than or equal to the value specified.
Coverage Classification
General
● GPS Speed Limit for Coverage Classification as Inconclusive Bin (km/h): If the recorded GPS speed exceeds the limit specified here, scanner measurements become too blurred with regard to the bin classification and affected bins are classified as [Inconclusive].
GSM
● Deep Indoor - RxLev Minimum: If the RxLev of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Deep Indoor].
● Indoor - RxLev Minimum: If the RxLev of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Indoor].
● Incar - RxLev Minimum: If the RxLev of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Incar].
● Outdoor - RxLev Minimum: If the RxLev of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Outdoor].
● Limited or No Service - RxLev Minimum: If the RxLev of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Limited or No Service]. If the RxLev is smaller than this value, the bin is classified as [No Coverage].
UMTS
● Deep Indoor - RSCP Minimum: If the RSCP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Deep Indoor].
● Indoor - RSCP Minimum: If the RSCP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Indoor].
● Incar - RSCP Minimum: If the RSCP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Incar].
● Outdoor - RSCP Minimum: If the RSCP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Outdoor].
● Limited or No Service - RSCP Minimum: If the RSCP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Limited or No Service]. If the RSCP is smaller than this value, the bin is classified as [No Coverage].
LTE
● Deep Indoor - RSRP Minimum: If the RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Deep Indoor].
● Indoor - RSRP Minimum: If the RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Indoor].
● Incar - RSRP Minimum: If the RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Incar].
● Outdoor - RSRP Minimum: If the RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Outdoor].
● Limited or No Service - RSRP Minimum: If the RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Limited or No Service]. If the RSRP is smaller than this value, the bin is classified as [No Coverage].
5G NR
● Deep Indoor - SS-RSRP Minimum: If the SS-RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Deep Indoor].
● Indoor - SS-RSRP Minimum: If the SS-RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Indoor].
● Incar - SS-RSRP Minimum: If the SS-RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Incar].
● Outdoor - SS-RSRP Minimum: If the SS-RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Outdoor].
● Limited or No Service - SS-RSRP Minimum: If the SS-RSRP of the best server in a bin is greater than or equal to this value, the bin receives coverage class [Limited or No Service]. If the SS-RSRP is smaller than this value, the bin is classified as [No Coverage].
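The coverage classification above amounts to checking the best server's level against a descending list of minimums and taking the first class whose minimum is met. A minimal sketch; the threshold values below are illustrative placeholders, not product defaults:

```python
def classify_bin(rsrp_dbm, thresholds=None):
    """Classify a bin by the best server's level against descending
    minimums, first match wins (illustrative threshold values)."""
    if thresholds is None:
        thresholds = [
            ("Deep Indoor", -80.0),
            ("Indoor", -90.0),
            ("Incar", -100.0),
            ("Outdoor", -110.0),
            ("Limited or No Service", -120.0),
        ]
    for label, minimum in thresholds:
        if rsrp_dbm >= minimum:
            return label
    # Below the last minimum: no usable coverage at all.
    return "No Coverage"
```

The same scheme applies per technology, only the measured quantity changes (RxLev, RSCP, RSRP, SS-RSRP).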
Base Station Evaluation Analysis
General
● Min Power RxLev: A base station issue is reported only if the RxLev power level is above this value.
● Min Power RSCP: A base station issue is reported only if the RSCP power level is above this value.
● Min Power RSRP: A base station issue is reported only if the RSRP power level is above this value.
● Min Power SS-RSRP: A base station issue is reported only if the SS-RSRP power level is above this value.
LTE Wideband Antenna
● RSRP Antenna Difference: A base station issue is reported if the difference of the averaged RSRP between two TX antennas is above this value. You need MIMO measurements to perform this analysis.
Sector Out Of Alignment
● Delta Angle Sweet Spot: A base station issue is reported if the delta angle between the cell antenna direction and the measurement point direction is above this value. You need a BTS list to perform this analysis.
Too High Power Out Of Sector
● Delta Angle High Power: A "too high power out of sector" issue is reported if the delta angle between the cell antenna direction and the measurement point direction is above this value. The normalized power difference must apply too.
● Threshold Power High Power: A "too high power out of sector" issue is reported if the normalized power between the measurement point and the sweet spot is above this value. The delta angle between the cell antenna direction and the measurement point direction must apply too. You need a BTS list to perform this analysis.
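The delta-angle checks above compare the cell antenna's azimuth with the bearing from the antenna to the measurement point. A minimal sketch of that geometry; the sweet-spot threshold of 60 degrees is an illustrative value, not a product default:

```python
def delta_angle(antenna_azimuth_deg, bearing_deg):
    """Smallest angle between the cell antenna direction and the
    direction to the measurement point, in degrees (0..180)."""
    diff = abs(antenna_azimuth_deg - bearing_deg) % 360.0
    return min(diff, 360.0 - diff)

def out_of_alignment(antenna_azimuth_deg, bearing_deg, sweet_spot_deg=60.0):
    # Flag a sector-out-of-alignment issue if the delta angle exceeds
    # the configured sweet-spot threshold (illustrative value).
    return delta_angle(antenna_azimuth_deg, bearing_deg) > sweet_spot_deg
```

Wrapping at 360 degrees matters: an antenna at 350 degrees and a point bearing 10 degrees are only 20 degrees apart, not 340.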
RRC Processing
RRC Triggers
● Reprocess RRC triggers: This option triggers a forced recalculation of RRC triggers for all imported files in SmartAnalytics for the current database, for a single execution. Upon completion of the process, this setting reverts to disabled automatically. This option is intended to repair missing or broken advanced triggers in databases affected by interrupted data processing or other issues. Depending on the database size, this can be a long-running task.
12.3 Colors
To standardize the use of colors among the Smart applications, define color profiles
and associate colors with dimension and measure values, e.g., associate a unique
color with each technology.
For measures:
● Define the "Min". Alternatively, check "No min" box.
● Define the "Max". Alternatively, check "No max" box.
● Define the "Color":
– Either by entering its Hex value.
– Or by clicking the color sample to select the desired predefined color.
● Define the "Category".
e) Click "Apply".
By clicking the icon, you display the list of value items to which the profile is assigned.
13 Operators
To create an operator:
1. Click "+".
2. Enter the operator name under "Display name". Alternatively, look up an operator
from the drop-down list by entering the operator name or by searching for a spe-
cific MCC / MNC.
3. Enter the operator channels for a specific technology. The following rules apply:
● Separate individual channel numbers with a semicolon (;).
● Use a dash (-) for channel ranges.
● You can define a combination of channel numbers and ranges.
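The channel rules above can be illustrated with a small parser sketch. The function name and the sample input are made up for illustration; this is not the product's code:

```python
def parse_channels(spec):
    """Parse an operator channel definition such as "100;200-203;512":
    ';' separates entries, '-' denotes an inclusive range."""
    channels = []
    for part in spec.split(";"):
        part = part.strip()
        if not part:
            continue
        if "-" in part:
            # Range entry: expand "low-high" inclusively.
            low, high = (int(p) for p in part.split("-", 1))
            channels.extend(range(low, high + 1))
        else:
            channels.append(int(part))
    return channels
```

For example, "100;200-203;512" mixes single channels and a range in one definition.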
You can export and import the operator list as a CSV file. To import a CSV file, click
"Import operators" and choose one of the options:
● "Keep existing operators": add only new operators from the import file that are cur-
rently not present.
● "Replace existing operators": add new operators from the import file and update
the existing operators.
● "Replace entire custom operator database": remove all existing custom operator
entries and import all operators from the import file.
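The three import options behave like different merge strategies. A minimal sketch, assuming operators are keyed by name; the mode names and dict representation are illustrative, not the product's internals:

```python
def merge_operators(existing, imported, mode):
    """Sketch of the three CSV import options, with operators
    represented as dicts of name -> operator record."""
    if mode == "keep":            # "Keep existing operators"
        merged = dict(imported)
        merged.update(existing)   # existing entries win on conflict
    elif mode == "replace":       # "Replace existing operators"
        merged = dict(existing)
        merged.update(imported)   # imported entries win on conflict
    elif mode == "replace_all":   # "Replace entire custom operator database"
        merged = dict(imported)   # discard everything not in the file
    else:
        raise ValueError(mode)
    return merged
```

The key difference between the first two modes is only which side wins when an operator exists in both the database and the import file.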
Annex
A NPS Tagging
This section describes how to set the NPS categories using the tagging feature in
SmartAnalytics Scene.
Typical NPS categories are city, road, highway, etc.
There are two ways to categorize the data:
● NPS category is set during data collection.
● NPS category is set using the tagging feature in SmartAnalytics Scene.
3. Add a tag with a pattern NPS:CategoryName (for example "NPS:Road"), and click
"Apply". Note: You can only mark the current set of sessions to a single NPS cate-
gory.
4. Repeat steps from step 1 to step 3 for all data you want to categorize.
5. On the navigation panel, click "Configuration > Network performance score (NPS)".
6. Configure "Weight" for each category. The sum of all "Weights" must be 100%.
7. Click "Save", and then click "OK" on the "Category configuration" pop-up to reproc-
ess the database.
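The weighting in step 6 can be illustrated as follows. The category names, scores and weights below are made up for illustration; only the 100 % constraint comes from the procedure above:

```python
def weighted_nps(category_scores, weights):
    """Combine per-category scores with the configured weights.
    The weights must sum to 100 %, as required in step 6."""
    if sum(weights.values()) != 100:
        raise ValueError("category weights must sum to 100 %")
    return sum(category_scores[c] * w / 100.0 for c, w in weights.items())

# Hypothetical per-category scores and weights.
score = weighted_nps(
    {"City": 85.0, "Road": 70.0, "Highway": 60.0},
    {"City": 50, "Road": 30, "Highway": 20},
)
```

Here the overall score is 85·0.5 + 70·0.3 + 60·0.2 = 75.5.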
In addition, reports created with NQDI KPI Report Generator are supported in SmartA-
nalytics Scene.
7. Copy your SQL query sections SELECT, FROM, WHERE and GROUP BY into the
Report Configuration SQL fields.
9. Click "Edit" to modify the excel report template according to the settings.
10. Modify the columns on the "Data Management" sheet, according to the columns in
the SQL query. Note: The ordering of the columns in the Excel report must be con-
sistent with the SQL query.
11. Add/modify the pivot table on the "Average Report" sheet, with the data source from
the "Data Management" sheet.
12. For every additionally configured page in the Report Configurator, add the data
sheet in the Excel report template with the exact same name. Add new pivot tables
to the sheet "Average Report" with the data source from the new data sheets.
Note: If you create a new report, sheet changes in the VBA code are necessary.
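Conceptually, step 7 splits one SQL query into its SELECT, FROM, WHERE and GROUP BY sections, which the Report Configuration recombines. A sketch of that recombination; this is illustrative string handling only, not the actual configurator logic:

```python
def build_query(select, from_, where="", group_by=""):
    """Reassemble the four SQL sections that step 7 copies into
    separate Report Configuration fields (illustrative only)."""
    parts = ["SELECT " + select, "FROM " + from_]
    if where:
        parts.append("WHERE " + where)
    if group_by:
        parts.append("GROUP BY " + group_by)
    return "\n".join(parts)
```

Keeping the sections separate is what lets the configurator inject them into different fields while the column ordering stays tied to the SELECT list.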
1. Update the version of the report in the DB, using SQL Server Management Studio:
3. Overwrite the SQL script of the original report with the exported script.
3. Make sure that the fields B6 and F3 in the "Comment Sheet" are empty.
4. In this code part, "ReportMain" is the main macro, which is also called from NQDI.
You find this in the NQDI "Report Configurator":
A likely issue is that images in the old report no longer appear in the updated report.
Reason - We can no longer assume that the user's target computer has NQDI or an
Automation Agent installed locally. Therefore, we can no longer rely on referencing
images from an image folder on the local PC.
Solution - Images (e.g. logos) now need to be added within the Excel template before-
hand.
2. A pop-up dialog shows which reports you can create. Click to open the drop-down
menus.
3. After successfully creating the report, a download button inside a new notification
allows you to download the newly generated report.
4. As soon as the downloaded report opens in Excel, the report is formatted and
displayed.
Note: Tabs can be duplicated but tabs with the same name are not allowed.
7. Select values from the drop-down options on the left.
Columns now show statistical values in the exported results.
2. Click "Edit" from the overflow menu of the report template you want to customize.
3. (Optional) Select the workspace and the tab from the autocompleted "Select
Workspace and Tab" dialog:
This option adds a link on the "Comment Sheet" tab of the executed Excel report,
so you can automatically navigate from there to the specified SmartAnalytics
Scene workspace.
C.1 Prerequisites
The following basic prerequisites are required:
● Shape file and corresponding mandatory and optional files
● Active database with the user role that allows the import of the custom regions
● To be able to import the custom regions into the database, the following files are
required:
– SHP -> shapefile with geospatial vector data (mandatory)
– SHX -> shape index format; a positional index of the feature geometry to allow
seeking forwards and backwards quickly (mandatory)
– DBF -> attribute format; columnar attributes for each shape, in dBase IV format
(mandatory)
– PRJ -> projection description, using a well-known text representation of coordi-
nate reference systems (optional; if not given, EPSG:4326 is used)
C.2.1 Regions
World
This is the topmost level for regions; all continent levels are mapped to "World". This
level is present by default.
Continent
The world level contains all continents. The continents are automatically provided by
the application, so there is no need for additional import. Continents are defined as
"Zoom Level 2".
The following continents are defined per default:
● Africa
● South America
● Europe
● North America
● Australia
● Central America
● Asia
Country
A large number of countries are automatically distributed with the SmartAnalytics
Scene application. A manual import is only necessary where a country is missing from
the default list. All countries correspond to "Zoom Level 5".
Sub District
A very granular region definition; in most cases it is not used at all. Defined as "Zoom
Level 9".
All shapes used must have the following attributes defined to be able to import them
into SmartAnalytics Scene:
● Parent
● Name
● Zoomlevel
● Category
The shapefile can have other attributes as well; they are ignored during the import.
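A quick sanity check of one shape's attribute record against the required set above could look like this (the record dictionary is a hypothetical example):

```python
# Attributes every shape must carry, as listed above.
REQUIRED_ATTRIBUTES = {"Parent", "Name", "Zoomlevel", "Category"}

def validate_record(record):
    """Return the required attributes missing from one shape record.
    Extra attributes are tolerated -- they are ignored on import."""
    return sorted(REQUIRED_ATTRIBUTES - record.keys())

# Hypothetical record matching the Switzerland example used below.
record = {"Name": "Switzerland", "Category": "Country",
          "Parent": "Europe", "Zoomlevel": 5, "GID_0": "CHE"}
print(validate_record(record))  # []
```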
2. Search for a country in the search bar. In this case, we use Switzerland as an
example.
3. Click the "Shapefile" link to download the zip file with shapes.
Note: You can also use another appropriate tool; just apply the changes with the corre-
sponding tool options.
Note: To be able to import the shapes into SmartAnalytics Scene later, the attribute
table must be modified prior to import.
3. Open the QGIS application and import the first layer (in this case, we start with the
country level).
Note: Only import country shapes that are not already present in your country
region after initially importing the data.
4. In the menu, navigate to "Layer > Add Layer > Add Vector Layer".
5. Select the file from the shapefile zip folder you downloaded earlier
(gadm36_CHE_0.shp).
Note: The popup dialog has an option to select the encoding; use it if you see
garbled names in the attribute table after adding the layer to the list.
6. Locate the shape under "Layers" after importing.
Note: The downloaded data contains some fields that are not needed (e.g. GID_0)
and some that can be used directly (only a column rename is needed).
2. Open the layer properties: "Layer > Layer Properties".
5. Delete (or rename) the field "GID_0" using the buttons above the table. Close the
window.
6. Open the attribute table again (F6) and change (or add) the following values for the
column headers:
● Name: Switzerland
● Category: Country
● Parent: Europe
● Zoomlevel: 5
After editing, the table should look like this:
3. Set 0.0001 as Tolerance (you can also experiment with values here).
Note: The goal is to have the shapefile at least half the size of the original.
4. Click "OK".
A new layer with a simplified shape is created.
5. Right-click and select "Export > Save features as..." to export the layer.
There are more columns in the attribute table. Some of these can now be renamed and
reused:
7. Insert "7" in the input field. This is the corresponding level for provinces.
8. Apply the value to all rows using the "update all" button.
Note: Use "replace( 'NULL','NULL','Province' )" to replace text or NULL.
This action opens the dialog where the user can either drop the files or select them
from the file system.
The list below the drop zone shows a check symbol for each file that has been
dropped (or selected) by the user. The "OK" button stays disabled as long as the man-
datory files are missing:
The missing files can be dropped afterwards; there is no need to drop all files at once.
As soon as all the mandatory files are selected, the "OK" button is enabled and the
user can start the actual import of the files into the database.
The import process does not automatically trigger processing of the database. The
user needs to start either the incremental or the full processing of the database for the
imported region shapes to become active.
Until the database is processed, the database entry on the database management
page shows the information that there are new regions in the database.
The above table shows how the duration of the ETL from the NQDI database structure
to the SSAS statistical cubes develops according to the accumulated database size.
The increase per import of the same size is linear, but wait times have to be consid-
ered after importing a couple of days' worth of measurements with multiple devices.
As we can see, the incremental processing takes only a fraction of the time a full pro-
cessing requires, but produces the same result. Still, the overall duration of the full
sequence is usually longer than a single full processing. So if you have weeks of
measurements to import at once, consider importing them in a single step using Auto-
mation Agent, rather than importing portion after portion. If, on the other hand, you get
new data on a daily basis and do not want your databases to be blocked by ongoing
processing, import those smaller chunks of daily data every night and you have the
latest data for reporting and analysis ready every morning.
On top of the listed duration for the incremental processing comes the import duration
into the NQDI database structure, which is basically the same as importing data into an
NQDI-only database. The NQDI database import also grows with the amount of data
already present, but far less than the additional processing SmartAnalytics Scene
requires for its statistics.
To make a long story short: import your data when you do not need to work on it with
SmartAnalytics Scene. Usually, the best approach is to have Automation Agent auto-
matically download files and have it import and process the data outside the working
hours. This way, databases (assuming a SmartAnalytics Scene Pro system and exclu-
sive usage of the database server) up to 1 TB will finish by the next morning.
If processing time is critical, Automation Agent lets you configure which level of detail
is retained by the import. For example, you will not require certain types of data for a
pure KPI and reporting database, so it is a good idea not to waste precious resources
(time, disk space, energy) on such records.
For more information on the topic of data processing optimization and detailed recommendations based on your own corporate use cases, please contact our regional support.
Shared application and database server (significantly affects application usage while
processing databases)
64-bit processor, Xeon Server Processor with 8 Cores
64GB RAM, 2TB SSD storage
TASK DURATION
Create Database 48 s
Shared application and database server (significantly affects application usage while
processing databases)
64-bit processor, Intel Core i7 Processor with 4 Cores
32GB RAM, 1TB SSD storage
TASK DURATION
Create Database 48 s
E Value Customization
Under "Settings -> Value customization" you can create custom values, events and
KPIs.
You can also create custom KPIs using SQL scripts. For more information, refer to
Chapter E.4, "Custom KPIs via SQL Scripts", on page 273.
4. Click "OK".
You can also create a custom value from the custom event. Refer to Chapter E.2,
"Custom Events", on page 265.
3. Define event properties such as "Event Name", "Event Description", "Event Type":
4. (Optional) Select "Create Custom Value" to have the custom event also available
as the custom value item.
You can find the created custom event on the "Events" tab:
The "Status" of the custom event is set to "Inactive" by default. The "Status" refers to
the current database.
The "Source" of the custom event is "Protocol view" or "CM360°" (if created with the
CM360° solution).
The "Source" of the standard (default) events is "System". They cannot be altered or
deactivated.
After the database has been reprocessed, the custom event has the "Active" (green)
status.
You can skip the database reprocessing after the custom event creation and do it only
after the custom KPI activation.
SmartAnalytics Scene supports a preconfigured set of ROMES4 events. You cannot
create new ones or edit the existing ones.
3. Name the KPI, then select the start and end events from the available events:
4. (Optional) Select the error code for the detection of the end event:
● "0 - No Error": KPI status successful
● "1 - Reject": KPI status failed
● "2 - Failed": KPI status failed
● "Timeout": If the time difference between the start and end event exceeds the
timeout value, the KPI is marked as "Failed".
● "New start": If the additional start event is reported before the end event, it is
taken as the start event, and the previous start event is discarded.
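The pairing rules above ("Timeout" and "New start") can be sketched as a small state machine. This is our own illustration of the described logic; event names and the time unit are hypothetical:

```python
def evaluate_kpi(events, start_name, end_name, timeout):
    """Pair start/end events into KPI results following the rules above.
    `events` is a list of (timestamp, name) tuples sorted by time."""
    results = []
    start_ts = None
    for ts, name in events:
        if name == start_name:
            # "New start": a later start event replaces the pending one.
            start_ts = ts
        elif name == end_name and start_ts is not None:
            # "Timeout": the pair fails if the gap exceeds the limit.
            status = "Successful" if ts - start_ts <= timeout else "Failed"
            results.append((start_ts, ts, status))
            start_ts = None
    return results

events = [(0, "AttachRequest"), (2, "AttachRequest"), (5, "AttachComplete")]
print(evaluate_kpi(events, "AttachRequest", "AttachComplete", timeout=10))
# [(2, 5, 'Successful')] -- the second start replaced the first
```

Note that this sketch ignores the end-event error codes; in the product, a "Reject" or "Failed" error code also marks the KPI as failed.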
The custom KPI with the status "Inactive" is added to the list:
Your custom KPI is visualized, together with standard KPIs, in the SmartAnalytics
Scene workspace:
● Create a .sql file starting with "CKPI" and a unique index number. "CKPI1000Cus[Link]" would be a valid script name.
● Consider whether you want the script to be active on all your databases. If not,
you have to check for the database name in the script and only continue if the
name matches.
● Make sure that you delete all prior calculations of the same KPI, especially if you
have tried out prior versions of the script on the system.
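The naming convention above (prefix "CKPI" plus a unique index number) can be checked with a simple pattern. The helper and the file names are our own illustration:

```python
import re

# "CKPI" followed by a numeric index, then an arbitrary name, ending in .sql.
CKPI_NAME = re.compile(r"^CKPI\d+\w*\.sql$")

def is_valid_ckpi_name(filename):
    return bool(CKPI_NAME.match(filename))

print(is_valid_ckpi_name("CKPI1000CustomAttach.sql"))  # True
print(is_valid_ckpi_name("MyCustomKpi.sql"))           # False
```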
Figure E-13: An example of a custom SQL script for creating custom KPIs
On the application server host instance paste these scripts into the folder
C:\Program Files\Rohde-Schwarz\SwissQual Smart\PostProcessing\
SmartAnalytics\ProcessingEngine\Scripts\sql\custom.
This step is not required if you have a new database and are only about to import the
first measurements. In general, the import of new files into an existing database trig-
gers the execution of the custom scripts for all data, not just the new data.
● Under "Settings > Database", locate your DB on the database management page.
● Then click the context menu and choose the command "Custom Events re-processing".
● This executes your script and also precalculates some statistics for the new KPIs.
You can turn the generic "Results KPI" value columns like "Value 1" into properly
named measurement values with a unit by heading to "Settings / Custom Values" and
defining new measures in the value tree.
F.1 Scope
This appendix explains how to use custom scripts to create custom categories, which
can then be used in SmartAnalytics Scene as an additional categorical dimension for
values.
Custom categories are an efficient way to enrich the content of SmartAnalytics Scene
with customer-specific elements, allowing for new charts and use cases unavailable in
the standard installation. With custom categories, some proficiency with SQL, and a bit
of imagination, many possibilities unfold.
Custom categories are automatically integrated in the time domain data structure and
the statistical cubes. They can be used like any other dimension once the data has
been processed with your custom script included.
One limitation applies: you may only define one additional custom category per base
value (listening quality LQ in this case).
A custom script is a script written in SQL that manipulates or configures your data
during database processing.
The name of your custom script file is extremely important because it determines when
and how the script is executed.
Names of custom scripts for custom categories have to start with the prefix “Dw10” fol-
lowed by a two-digit number, e.g. “03”, which defines at which place in the ETL work-
flow the script is executed. For example, a valid custom script could be named
[Link].
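Since the two-digit suffix controls ordering within the ETL workflow, scripts can be sorted by it. A small sketch (the script names are hypothetical):

```python
import re

# "Dw10" + two-digit ETL position + descriptive name + ".sql".
DW10_NAME = re.compile(r"^Dw10(\d{2})\w+\.sql$")

def execution_order(filenames):
    """Sort custom category scripts by their two-digit ETL position."""
    matched = [(int(m.group(1)), name)
               for name in filenames
               if (m := DW10_NAME.match(name))]
    return [name for _, name in sorted(matched)]

scripts = ["Dw1007VendorCategory.sql", "Dw1003CoverageCategory.sql"]
print(execution_order(scripts))
# ['Dw1003CoverageCategory.sql', 'Dw1007VendorCategory.sql']
```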
3. The result should look similar to the following screenshot:
4. You may want to execute and test your code on a non-critical DB first. You can
then use the following commands in another query window to verify that you
actually achieved a result:
Now follows a short explanation of the code itself and some insights on how to adapt it
to your own needs in the future.
The code in light blue from the previous example with “FactSpeech” is to be left alone;
you should never be required to change it.
● “FactSpeech” is the name of the table where SmartAnalytics Scene stores LQ test
results. It is essential to understand that all base data used by SmartAnalytics
Scene is stored in the tables starting with “Fact”.
– If your next custom category should account for 5G NR PDSCH throughput
rather than voice service test results, simply use a different table name in the
script, e.g. FactNR5GPDSCHStatisticsInfo, and replace all "FactSpeech"
elements you see in the code above accordingly. There are 6 occurrences of
that text to replace.
– A different fact table will have different column names and types for its data.
Again, referring to the PDSCH example, this would mean that you have to
replace the LQ column with something like
ScheduledPDSCHThroughputKbps.
– Finally, you would have to specify different criteria and names for your cate-
gory. Here is how modified working code could look:
When you place a custom script, it is used for ALL databases that are processed, not
only for a single one.
If the database is processed via an Automation Agent task, the custom script has to
be placed in the “custom” folder of the Automation Agent installation.
Find your Automation Agent installation path and place the script into the “custom”
folder (e.g.
C:\Program Files (x86)\SwissQual\NetQual\
Automation Agent Service\scripts\sql\custom).
If you process your database via the SmartAnalytics Scene application, you have to
place the script into the “_data” folder of your SmartAnalytics Scene installation path.
Find your SmartAnalytics Scene installation path and place the script into the “_data”
folder:
Finally, trigger the following database command on the Database Management page
for your database to have your new categories created:
You only need to run this process once, and only if you need the new categories to be
added to an existing database with data. Whenever you import new files into a DB,
this task is covered automatically.
Execute the following steps to enable an NQDI report to work with SmartAnalytics Scene:
1. Update the version of the report in the database:
2. Export the SQL script with NQDI or using the SwissQual Report Handler.
On how to export SQL script with SwissQual Report Handler, refer to Chapter G.2,
"Import / Export Excel Reports", on page 288.
Note: This adds the correct version to the report script.
3. Overwrite the original report SQL script with the exported script.
4. Open the template in Excel. Make sure that the report has a sheet named "Com-
ment Sheet":
9. In this code part, "ReportMain" is the main macro, which is also called from NQDI.
You can find this in the "Report Configurator" of NQDI:
5. Click "Check connection and list DBs" to verify the connection to the server.
7. After you log in successfully, you can see the list of available reports in the
selected database.
8. To export a report, select the report you want to export, then click the export icon.
Note: Two files get exported: an SQL file and an XLSM file.
9. To import a report, click the import icon, then select the SQL and XLSM files in
the corresponding dialogs.
Each class value is calculated per map bin level. For the best results, we recommend
setting 100 m as the base bin size. For more information, refer to Map bin settings.
The following classes are available:
● Deep Indoor (> -60 dBm)
● Indoor (from -60 dBm to -75 dBm)
● In car (from -75 dBm to -90 dBm)
● Outdoor (from -90 dBm to -115 dBm)
● Limited or No Service (from -115 dBm to -135 dBm)
● No Coverage (< -135 dBm or no measurements)
● Inconclusive. This class occurs when the information is not reliable due to insufficient
data available in the map bin. To improve results, increase the base bin size.
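The class boundaries above translate directly into a threshold lookup. A minimal sketch (the function is our own; it assumes a single signal level in dBm and omits the "Inconclusive" class, which depends on per-bin sample counts rather than the level itself):

```python
def classify_coverage(level_dbm):
    """Map a measured signal level in dBm to the coverage classes above.
    None stands for a bin with no measurements at all."""
    if level_dbm is None or level_dbm < -135:
        return "No Coverage"
    if level_dbm > -60:
        return "Deep Indoor"
    if level_dbm > -75:
        return "Indoor"
    if level_dbm > -90:
        return "In car"
    if level_dbm > -115:
        return "Outdoor"
    return "Limited or No Service"

print(classify_coverage(-70))   # Indoor
print(classify_coverage(None))  # No Coverage
```

How the exact boundary values (-60, -75, -90, -115, -135 dBm) are treated at the edges is our assumption; the product lets you customize the levels as described below.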
To customize the coverage levels, go to "Database > Configuration > Analysis" tab.
You can display scanner coverage classification values on the bar chart, pie chart, cdf-
pdf line chart, maps and in the value list.
Channels/frequencies that are not found on air but are configured for the measurement
are classified as "No Coverage".
I Customer Support
Technical support via GLORIS
Submit your support request via GLORIS. For detailed instructions on how to do this,
refer to Manual - MNT Support Services in [Link].
Up-to-date information and upgrades
To keep your instrument up to date and to be informed about new application notes
related to it, please send an e-mail to customer support stating your instrument and
your request. We will make sure that you receive the right information.
For a direct communication with our support center, refer to Rohde & Schwarz Global
Contact page.
Index
A
At a Glance .... 11

C
Create
  Custom categories .... 276
Custom categories .... 276
Custom regions .... 249

D
Dashboard .... 60
Data
  Mastering the data .... 15
Data Visualization
  Anomaly Detail View .... 196
  Bar Chart .... 148
  Cell-based Table .... 153
  Detail Views .... 194
  Divider .... 197
  Drilldown Views .... 169
  Indoor map .... 165
  Line Chart .... 177
  Map Bins .... 160
  Map Points and Cells .... 170
  Map Regions .... 159
  Maps .... 158
  Pie Chart .... 157
  Protocol View .... 195
  Scanner ACD .... 200
  Statistical View .... 148
  Statistics Table .... 152
  Statistics Value List .... 154
  Time-based Table .... 173
  Time-based Value List .... 175
  UE Capabilities View .... 197
  UE Views .... 188
  Views .... 147
  Waveform Player .... 198
Database
  Add an existing database .... 39
  Add Server .... 43
  Commands .... 41
  Create .... 37
  Delete database .... 43
  Edit .... 39
  Import data .... 44
  Import TEMS files .... 46
  Management .... 37
  Process .... 39
  Processing Performance .... 260
  Remove data .... 46
  Set active .... 39
  SuperCubes .... 47
  Upgrade .... 39

E
Export reports
  SwissQual Report Handler .... 288

F
Filters
  Filtering charts .... 135
  Filtering phones .... 135

I
Import data
  Import TEMS files .... 46
Import reports
  SwissQual Report Handler .... 288

M
Machine Learning
  Anomaly Detection .... 30
  Call Stability Score .... 29
Management
  Database .... 37
Migrate reports to SmartAnalytics Scene
  SwissQual Report Handler .... 285

N
Network Performance Score .... 26
NPS .... 26
NQDI Excel Reports .... 238
  Adjust the template .... 241
  Create new .... 238
  Customize reports .... 247
  Execute reports .... 244
  Migrate reports .... 240
  Troubleshooting .... 243

R
Region
  Import .... 249
  Preparation .... 249

S
Scenarios .... 67
Settings .... 214
  Analysis .... 221
  Colors .... 228
  Database .... 37
  Operators .... 232
SwissQual Report Handler .... 285

T
Thematic Editor .... 228

W
Workspaces .... 67