Follow Hybrismart | SAP hybris under the hood on Feedspot



Hello colleagues! My name is Maxim Bilohay and I would like to tell you about the new developer experience we brought to our SAP Commerce Cloud Eclipse plugin.


Back in 2016, EPAM released an Eclipse plugin that makes it easier for developers to work with SAP Commerce (Hybris at that time) code. The first version was released in January 2017. Recently we introduced version 2.0, and I will explain the new features and improvements in detail, as well as recap the existing functionality.

For those who are not familiar with Eclipse, it is one of the most widely used and well-known IDEs for developing Java applications. It is also very popular among SAP Commerce developers. Hybris, and later SAP, mentioned Eclipse in their documentation as a recommended option for developing SAP Commerce applications.

IDE plugins help developers build apps faster and easier. They automate various facets of routine work and make the IDE work more smoothly with frameworks, libraries, and scripting syntaxes not supported by the IDE out of the box (like SAP Commerce, for instance).

Because of the open nature of the IDE marketplaces, it is common to have more than one plugin available. For SAP Commerce, there are several options. Being developed independently, they have their own pros and cons for a particular task. There is no single answer as to which one fits your tasks best; you should try them and decide on your own. But we at EPAM did our best to combine the best aspects of every solution with our personal experience, which we gathered on a variety of different projects. Our goal was to empower developers (and ourselves) with the best toolset. That’s why this product was built by developers for developers, and we hope it has an “engineering” spirit.

Key features

The plugin lets users work with common SAP Commerce Cloud items and offers numerous features that can be useful in day-to-day SAP Commerce work:

  • Easy import of specific SAP Commerce Cloud extensions/projects, along with resolving their dependencies.
  • A custom ImpEx editor with built-in auto-complete, syntax highlighting, auto-formatting, and the ability to fold/expand ImpEx blocks.
  • Easy navigation in the ImpEx editor using the Eclipse Outline view.
  • Ability to run and validate ImpEx on remote servers.
  • Hyperlinks from ImpEx items to items.xml type/attribute declarations.
  • A FlexibleSearch editor with auto-complete and formatting, including the ability to execute a query right against the remote server and see the results inside your IDE.
  • Solr search console which allows the execution of Solr queries directly from Eclipse (NEW)!
  • SAP Commerce Cloud Log Watcher, which helps you easily troubleshoot your application in real-time (NEW)!
  • Quick creation of simple queries in Flexible Search Query Builder.
  • SAP Commerce extensions/modules generation.
  • Ability to generate SAP Commerce Cloud-specific classes and merge them into spring context.

You can watch a short video about the core features here: YouTube video.


Our plugin is available for download on the Eclipse Marketplace.

New features in detail

The Solr search console enables you to execute Solr queries directly from Eclipse. If you work with the search module in SAP Commerce, you will find it handy to have a developer console close to the code.

The plugin lets you choose one of the available cores (configure the URL of your Solr server in the Config tab; the plugin will then pull the cores from it). You can also configure the query using familiar Solr query parts such as “q” or “fq”. The UI is very similar to what we are used to on the Solr Admin page, so it should be easy to understand for those who have already worked with Solr. Once you have composed your query, you can run the request and see the results in the Output tab.
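
Under the hood, such a console composes standard Solr /select requests from the familiar q/fq parts. Here is a minimal Python sketch of the kind of URL the console might issue (the host and core name are made-up examples):

```python
from urllib.parse import urlencode

def build_solr_select_url(base_url, core, q, fq=None, rows=10):
    """Build a Solr /select URL from the familiar q/fq/rows query parts."""
    params = [("q", q), ("rows", rows), ("wt", "json")]
    for f in fq or []:
        params.append(("fq", f))  # each filter query is a separate fq parameter
    return f"{base_url}/{core}/select?{urlencode(params)}"

url = build_solr_select_url(
    "http://localhost:8983/solr",      # assumed local Solr server
    "master_electronics_Product",      # hypothetical core name
    q="camera",
    fq=["catalogVersion:Online", "inStockFlag:true"],
)
print(url)
```

The Output tab then simply renders the JSON body that Solr returns for such a request.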

The Server Log Watcher/Analyzer was developed to help you quickly identify issues and errors happening on your server during development.

Simply put, it is a Tomcat log watcher that grabs messages from the console log and shows them in your IDE. You can, of course, configure the severity level. Its main feature is that it converts stack trace messages into links to classes, and even to specific lines of code. In other words, if an error occurs while the server is running, you will see it immediately in the IDE and be able to navigate to the exact place where the exception or error occurred.
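
Conceptually, turning stack-trace lines into links comes down to parsing the standard Java frame format out of the log. A rough sketch (the regex and function are illustrative, not the plugin's actual code):

```python
import re

# Matches Java stack-trace frames such as:
#   at de.hybris.platform.core.Registry.getCurrentTenant(Registry.java:120)
FRAME_RE = re.compile(r"\bat\s+([\w.$]+)\.([\w$<>]+)\(([\w.$]+\.java):(\d+)\)")

def frame_to_link(line):
    """Extract (class, method, file, line) so the IDE can hyperlink the frame."""
    m = FRAME_RE.search(line)
    if not m:
        return None
    cls, method, src, lineno = m.groups()
    return cls, method, src, int(lineno)

print(frame_to_link("    at de.hybris.platform.core.Registry.getCurrentTenant(Registry.java:120)"))
# ('de.hybris.platform.core.Registry', 'getCurrentTenant', 'Registry.java', 120)
```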

Just imagine this combined with the power of JRebel and automated test suites! How easy will it be for you to find bugs, fix them, and ensure they do not happen again? You will never miss anything important: you will see all the errors you had, with their severity, understand the root cause, and easily fix them, all right within your IDE.


I hope this article helped you get acquainted with the existing tools for SAP Commerce developers. To those who have been sending suggestions and reports all this time, thank you; they are very useful to us.

Please feel free to contact hybriseclipseplugin@epam.com with any proposition, idea, or problem you have, or email me personally at Maxim_Bilohay@epam.com.

Thank you!


Today SAP officially released a new version of SAP Commerce Cloud, 1905. “05” stands for “May”; today is the 30th, so SAP barely made it on time. What is new?

The most important changes are in SmartEdit and Integration.

SmartEdit now supports Spartacus and content authoring workflows.

The 1905 platform introduces new and enhanced integrations:

  • the recently announced SAP Cloud Platform Extension Factory (Templating for APIs and Events, Redesigned Integration Process, Standardized Event Library, and Automatic Certificate Renewal)
  • SAP S/4HANA cloud for B2B, 
  • SAP Sales Order Simulation module
  • SAP CPQ integration (Image Replication)
  • SAP Cloud for Customer integration (Customer Replication)
  • SAP Marketing Cloud integration (Enhanced replications)
  • SAP Customer Data Cloud (Session Management and Consent Template Replication)
  • External Providers of Personalization Segments (optimized integration flow for better performance)

Some of these integrations were available earlier. In the new version, they were enhanced.

In the new version, SAP introduced a new way of publishing data to external systems, Outbound Sync. It allows you to configure outbound data replication in SAP Commerce Backoffice or via ImpEx and publish only new and modified items, so-called “delta changes”.

It also introduces an Order Management External Consignment Fulfilment Framework for S/4HANA and ERP.

Commerce Web Services API now supports personalization functionality.

SAP introduced a new Polyglot Persistence feature, which helps relieve the load on the main database or provide non-SQL storage for some resource-intensive data, such as shopping carts. To make this possible, developers need to model the data structure and its related types as a single composed structure, or document. Currently, Polyglot Persistence offers a default implementation for the Cart type (the ydocumentcart extension template), which can be used as a reference for other types. Additionally, there is a transaction management and caching subsystem, which allows you to cache all modifications performed on a single item and flush them to the persistent storage when the main operation ends. Reading is realized through a query language similar to FlexibleSearch.

There is an important Generic Audit update. This mechanism keeps track of modifications of attribute values of the auditable set of hybris objects. It now provides a way to configure auditing on a finer level of granularity than type. Such narrowing down prevents unnecessary database size growth.

There are also interesting enhancements in Promotions. By default, Promotion Engine contains a mechanism called Order Entry Consumption. With it, only one promotion can be applied to each product. However, sometimes you might want to allow your customers to get multiple discounts on the same product. With the latest changes, it is now possible.

SAP also added a new “Most Expensive” selection strategy for bundle promotions and partner-product promotions. To exemplify this: a multi-buy promotion offers three film rolls for $10, and the customer has five film rolls in the cart that could qualify for the promotion. With the new “Most Expensive” selection strategy, the three most expensive film rolls are included in the bundle.
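
The difference between selection strategies can be shown with a tiny sketch (an illustration only, not the promotion engine's actual API):

```python
def select_for_bundle(cart_prices, bundle_size, strategy="most_expensive"):
    """Pick which qualifying cart items are consumed by the promotion bundle."""
    ordered = sorted(cart_prices, reverse=(strategy == "most_expensive"))
    return ordered[:bundle_size]

# Five qualifying film rolls in the cart, multi-buy bundle of three for $10:
rolls = [4.99, 7.49, 5.99, 8.99, 3.49]
print(select_for_bundle(rolls, 3))  # [8.99, 7.49, 5.99]: the three most expensive
```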

The platform now supports:

  • MySQL 8 and Oracle 12c Release 2
  • Java 11 and Spring 5
  • Solr 7.7.1

In general, the changes are not dramatic, more evolution than revolution. There are a lot of smaller changes, such as the folder structure, bug fixes, and small enhancements.

Stay tuned!




Digital transformation is happening everywhere, and as a result, enterprises are turning inside out. Formerly closed systems now reside in the cloud, often outside the traditional zone of control of security departments. When a company exposes its systems and internal information to agents, partners, and customers, the security risks grow.

SAP Commerce Cloud is widely used as a platform for taking the B2C experience into the B2B world. You need to be able to give your business partners and customers peace of mind that their sensitive data won’t be compromised. When it comes to opening up such data to anyone outside the company, security becomes a key concern.

In this article, I give a technical overview of hardware security key based authentication and take you through a worked example of SAP Commerce and hardware security key integration.

In the experiments and PoC, I used the hardware keys from Yubico, YubiKey 5 NFC and YubiKey NFC Security Key, multi-protocol key-sized devices supporting FIDO2/WebAuthn.

Let’s start with a short overview of authentication methods, their advantages and drawbacks. What’s wrong with good old passwords?

Passwords Are No Longer Enough

According to a market survey from SecureAuth Corp., 39% of companies use password-only authentication. OneLogin reports that 93% of their respondents said their companies have guidelines around password complexity. Single-factor authentication with passwords is no longer sufficient, and there are several reasons for that:

  • We forget these passwords. Users have problems remembering the passwords used for different web services; these services have different password policies that create password variations which are hard to remember. Easy passwords are guessable.
  • We are lazy. People reuse passwords at many sites and tend to use the same password for many accounts. One hacked website leads to others. The easier a password is for the owner to remember, the easier it is for an attacker to guess.
  • We are gullible. We tend to think that everything around us is built properly, but in fact, the vast majority of websites are built with a virtual duct-tape-and-chewing-gum approach. Cross-site scripting vulnerabilities may expose your password in the browser. Key loggers and other malware snoop passwords. Many websites store your password as plain text.

According to the Market Pulse Survey, 75% of respondents said they reuse passwords across different accounts (compared with 56% in 2014), 47% reuse passwords for both personal and work accounts, and many rarely change passwords for work accounts (23% do this at most twice a year) or personal accounts (67%).

Service providers do not always follow recommendations and security standards, for various reasons. Last March, Facebook confirmed that it had stored millions of user passwords in plain text for years. Although this did not result in a data breach, we see that even giants deviate from best practices, endangering users’ privacy and business reputation. Of course, for millions of smaller services, the situation is even worse: many of them cut corners and go with security-through-obscurity to release early.

In other words, a password is a weak link in this chain. OK, if not passwords, then what?

Phishing

According to Gartner, phishing is the most common targeted method of cyber attack. This method of trying to gather personal information using deceptive e-mails and websites is the fastest-growing sport in the hacker community. Data protection and security are fundamental for corporations today, not only to ensure user privacy but also to potentially save a company from damages worth millions.

Adding a second authentication mechanism is becoming a must in all B2B solutions where the security risk is high.

Passwordless authentication methods

Passwordless methods can be used for primary and/or secondary authentication steps as well as a single-factor authentication method (where only one step is present). Some methods are designed only for second-factor authentication.

The following methods are generally associated with a passwordless login:

  • Methods involving communication devices and services:
    • SMS-based. A one-time password sent as a text message. This approach is used by some mobile apps as the only authentication method. It is also widely used for second-factor authentication.
    • A phone callback. You receive an automated phone call that requires you to press any key to confirm your identity.
    • Outgoing phone call. You are asked to dial out to confirm your identity.
    • E-mail link. You are asked to check the e-mail and click a link. Basically, the link contains a one-time password, but it is usually not human readable.
    • E-mail OTP. One-time password sent as an email message.
  • Methods involving third-party services and special devices:
    • OTP (TOTP/HOTP). One-time passwords cryptographically generated by the remote services (Google Authenticator) or attachable devices (Yubikey OTP).
    • Cryptographic tokens. Hardware USB tokens (RSA SecurID, U2F Security Tokens)

Some of the methods above assume that a phone number is uniquely associated with a user. In fact, it is not so: in many countries, phone numbers are reassigned to a new user when abandoned. Text messages and phone calls can be redirected to another device or service. Also, all the listed methods require you to be connected to the communication network and have the specific communication services enabled. For example, when roaming, many people prefer to turn these off, which makes authentication impossible or expensive.

SMS-based authentication is not safe, especially if your mobile provider doesn’t ensure the required level of security. Because it is tied to cellular services, attackers can use social engineering against phone companies to redirect messages, or even manipulate the Signaling System 7 network.


Hardware security keys are currently the best solution for multi-factor authentication. However, they do not guarantee absolute immunity. In March 2011, RSA Security was hacked, compromising up to 40 million tokens, which RSA agreed to replace. Literally a couple of months ago, Google announced a free replacement for its Titan keys because of a “misconfiguration in the pairing protocols” and the risk of being hacked. The companies learn from their mistakes, and the latest products seem to be very secure and stable.

Passwordless methods can be used in combination with each other and with password-based authentication. Let’s have a closer look at these options.

Single-factor and Multi-factor authentication

Single-factor authentication (SFA) is a process for securing access to a system that identifies the party requesting access through only one category of credentials.

There are two common single-factor authentication methods:

  • Username/password, the most common method
  • One-time password, by e-mail or SMS (or a link with OTP sent by email)

The third pattern, “single-factor authentication with a hardware security key only”, is not supported by main browsers yet. At the time of writing (May 2019) this is only supported by Edge on Windows.

Two-factor authentication, also called two-step verification, adds an extra step, the second factor, to the sign-in process.

The most common form of two-factor authentication mixes password-based authentication with a passwordless approach that uses SMS services. It assumes that the user has a personal device, a mobile phone, nearby each time the system requests a second verification.

There are two common patterns for two-factor authentication:

  • First method: Username and Password authentication
    Second method: Passwordless, such as SMS/phone calls/e-mail links/hardware tokens
  • First method: Passwordless, such as hardware tokens
    Second method: Password (pin code)

So, hardware security keys can be used with ease for the second-factor authentication, as one-time password generators or as containers of private keys.

Multi-factor authentication (MFA) requires two or more authentication methods combined to log in.

With this method, the authentication credentials are a combination of:

  • Something the user knows (a password or a PIN)
  • Something the user has (customer’s device)
  • Something the user is (biometric verification)

The last two components bring hardware devices into play. Let’s have a look at hardware security keys in detail.

Hardware Security keys (tokens)

For many years, one-time-password hardware tokens, along with X.509 browser certificates, seemed unbreakable. Some of the keys, such as Sentinel’s, require installing OS-specific drivers. The need to install drivers and/or use Java applets resulted in a popular backlash.

There are three groups of protocols generally associated with the hardware security keys:

  • TOTP/HOTP. Hardware one-time-password generators. They are becoming less and less popular because everyone has a much more powerful computer in their pocket, a smartphone, which is capable of doing exactly the same thing. “Gartner’s expectation is that the hardware OTP form factor will continue to enjoy modest growth while smartphone OTPs will grow and become the default hardware platform over time.”
    • TOTP: Time-based One Time Passwords (OTP). Generates an OTP by taking uniqueness from the current time (30-second periods). TOTP passwords can be phished, but this requires attackers to proxy the credentials in near real time (within 30 seconds) rather than collect them for later use.
    • HOTP: HMAC-based One Time Passwords. Uses two parameters: a counter and a private key. The token and the server increment the counter independently of each other. Because the passwords are not limited in time, and smartphones can do basically the same at no cost, hardware HOTP generators are falling out of favor.
  • FIDO U2F. Second-factor authentication. This type of token basically consists of a secure microprocessor, a counter that it can increment, and a secret key. For each website, the key creates an HMAC-SHA256 of the domain and the secret key, and uses it to generate a public/private key pair. This pair is used to authenticate. If the user gets phished, the browser would send the phished domain to the security key, and authentication would fail because of the domain mismatch. If an attacker somehow managed to clone the security key (it is generally believed to be impossible, but let’s say), the server would notice that the counter is no longer monotonically increasing, so you would at least be able to detect that the key has been cloned. See more information about U2F in the next chapters.
  • FIDO2/Webauthn. Passwordless authentication. Can be used for single-factor or first-factor authentication and implements the concept of passwordless authentication. See more information in the next chapters.
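
For reference, the HOTP computation itself is small enough to sketch in full; this follows RFC 4226, and TOTP is the same algorithm with the counter derived from the current time:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP per RFC 4226: HMAC-SHA1 over the counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                              # low nibble picks the offset
    code = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: secret "12345678901234567890", counter 0 -> "755224"
print(hotp(b"12345678901234567890", 0))  # 755224
# TOTP would call hotp(secret, int(time.time()) // 30)
```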

In this article, I focus mainly on the last two types of protocols. Both have FIDO in the name. What is that?

FIDO Alliance standards

FIDO is an organization, founded in 2012, that has tasked itself with a mission of making authentication simpler for consumers and easier for service providers to deploy and manage, as well as addressing “the problems users face with creating and remembering multiple usernames and passwords.”

The alliance has published three sets of specifications for simpler, stronger authentication:

  • UAF: FIDO Universal Authentication Framework
  • U2F: FIDO Universal Second Factor
  • FIDO2, which includes
    • WebAuthn: W3C’s Web Authentication specification and
    • CTAP: FIDO Client to Authenticator Protocol

All these protocols are based on public key cryptography and are strongly resistant to phishing (to varying degrees).

Universal Authentication Framework, or UAF

UAF is a standard for passwordless authentication. It involves the use of a client-side authenticator which may sample biometric data, such as a face image, fingerprint, or voice print, to unlock the private key used for signing the authentication response.

(Diagram taken from https://fidoalliance.org/specs/fido-uaf-v1.1-id-20170202/fido-uaf-overview-v1.1-id-20170202.html)

The biometric or PIN authentication in UAF happens locally on the device.

There are three key problems with biometric identification:

  • Biometrics are not private. Your ears, eyes, and face are exposed. You leave fingerprints everywhere you go. Someone can record your voice.
  • Biometrics are hackable. There are plenty of videos on YouTube demonstrating how iris recognition systems or fingerprint readers can be fooled if the biometric data is compromised and falls into the hands of hackers.
  • Biometric characteristics are immutable. Once stolen, they cannot be changed [without being re-born].
Universal Second Factor, or U2F

Universal Second Factor (U2F) is meant to replace non-secure SMS-based second-factor authentication with secure U2F token-based authentication. This standard was developed by Google and Yubico. Currently, U2F seems to be the lightest, safest, and most phishing-resistant multi-factor authentication method known today.

A U2F token like the Yubikey performs an authentication handshake with a relying party, such as a website. It not only proves to a website that it’s your unique key but also requires that the website proves its identity too, preventing lookalike sites from stealing credentials.

With this approach, the user logs in with a username and password as before. The app can also prompt the user to present a second-factor device, immediately after or before the sensitive operation is requested. The strong second factor allows the service to simplify its passwords (e.g. 4-digit PIN) without compromising security.

U2F is comprised of two basic components:

  • FIDO U2F authenticator which is basically a key fob.
  • The U2F JavaScript API for interacting with and using security keys via the browser. There is a more capable extension of this interface, WebAuthn, explained in the next chapter.

These components communicate with each other using the CTAP1 protocol (Client to Authenticator Protocol). CTAP1 tokens cannot be used without the user entering a username. The newer version, CTAP2, is capable of storing the username (see the next section), but this mode is not supported by the majority of browsers yet.

The U2F spec contains three functions:

  • Register creates a new key-pair.
  • Authenticate signs with an existing key-pair, after the user confirms physical presence (on a Yubikey, by tapping).
  • Check confirms whether or not a key-pair is known to a security key.

What is important is that the users can self-provision their own security keys. No need to get them registered upfront by admins as it was with first-generation keys.

Registration flow:

  • The user logs in using a username and password
  • The server hashes the password using the scrypt, Argon2, or bcrypt hashing algorithm and saves the hash into the server’s database
  • The user adds their device. The device generates a private and a public key. The private key resides on the device, while the public key is sent to the server. The server saves the public key into the database and associates it with the user.

Authentication flow:

  • The user logs in using a username and password
  • The server hashes the password and compares the hash with the one in the database
  • The server generates a random challenge and sends it to the client along with the key handle (ID)
  • The browser sends the challenge and ID, as well as the token binding and the origin (domain), to the hardware key using the CTAP1 protocol
  • The hardware key signs it with the private key (the one for this ID) and sends the signature back to the browser
  • The server validates the signature against the public key stored for the user, and initiates the session if everything is in place
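
The clone detection mentioned earlier relies on the signature counter reported by the key: each authentication must report a counter strictly greater than the last one the server accepted. A minimal server-side sketch (class and method names are illustrative):

```python
class U2FCounterCheck:
    """Track the last accepted signature counter per registered key handle."""

    def __init__(self):
        self.last_counter = {}  # key handle -> last accepted counter value

    def accept(self, key_handle, counter):
        """Return True if the counter increased; False suggests a cloned key or replay."""
        if counter <= self.last_counter.get(key_handle, -1):
            return False
        self.last_counter[key_handle] = counter
        return True

check = U2FCounterCheck()
print(check.accept("kh-1", 10))  # True
print(check.accept("kh-1", 11))  # True
print(check.accept("kh-1", 11))  # False: counter did not increase
```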

There is also a software “keyless” implementation of the authenticator, Soft U2F.

The following browsers currently support the use of U2F security keys:

  • Google Chrome, version 38 and later.
  • Opera, version 40 and later.
  • Mozilla Firefox, version 57 and later. Most Firefox versions that currently support U2F do not enable support by default. You need to activate security.webauth.u2f in about:config.
  • Microsoft Edge 11 build 17723 and later.
FIDO2 and WebAuthn

FIDO2 is an evolution of the U2F standard and backward compatible with it. It has two parts:

  • WebAuthn – a JS API for account management based on public keys. It is a W3C standard, which means all browsers should eventually support it.
  • CTAP2 – Client to Authenticator Protocol 2, a standard communication protocol for USB, NFC, and BLE devices. It utilizes the CBOR message format (RFC 7049) for security and performance. Backward compatible with U2F (CTAP1).

WebAuthn has two flows:

  1. U2F flow, also known as Client to Authenticator Protocol v1 (CTAP1), and
  2. Passwordless flow, CTAP2.

The authenticator generates and securely stores credentials. Private keys, PINs, and biometric information never leave the authenticator. It is a write-only device with an onboard app and standard interfaces.

One of the major enhancements of CTAP2/Webauthn devices is that they are able to store the private keys in the persistent memory along with the username and domain. That makes it possible to provide an identity (username) and authenticate it in the same flow.

When a website prompts a user to get authenticated, and the user taps the connected key, the website (browser -> JavaScript -> server) receives a signal that the user can be authenticated, as well as the username and domain.
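
A toy model of a resident-key (CTAP2) authenticator makes this flow concrete: the device keeps a credential per relying party and returns the username together with a signed challenge. All names here are illustrative, and HMAC stands in for the real public-key signature:

```python
import hashlib
import hmac
import os

class ResidentAuthenticator:
    """Toy CTAP2-style authenticator storing one credential per relying party."""

    def __init__(self):
        self._store = {}  # rpId -> (username, per-site secret key)

    def register(self, rp_id, username):
        self._store[rp_id] = (username, os.urandom(32))

    def get_assertion(self, rp_id, challenge: bytes):
        """Return the stored username and a signature over challenge + rpId."""
        username, key = self._store[rp_id]
        sig = hmac.new(key, challenge + rp_id.encode(), hashlib.sha256).hexdigest()
        return username, sig

auth = ResidentAuthenticator()
auth.register("shop.example.com", "alice")
user, sig = auth.get_assertion("shop.example.com", b"server-challenge")
print(user)  # alice
```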

Because the standard is so young, Chrome and Firefox don’t yet support passwordless authentication. If you want to try it, use Windows 10 v.1809+ with Microsoft Edge. Yubico says that the latest MacOS with Safari Technology..


Document-level access control ensures that search results contain only those products that a logged-in customer is authorized to see. This is a common request for B2B solutions with large and sophisticated product and customer models. Many manufacturers and suppliers want to provide exclusive or restricted access to products for particular partners. Such an approach reduces the number of incorrect or incomplete orders and makes navigation easier.

In this article, we discuss in detail how to support product whitelisting/blacklisting per customer in SAP Commerce Cloud for large product sets and a large customer base. We also present our solution and possible alternatives. Our tests showed that the solution is capable of processing millions of documents, tens of thousands of customers, and millions of access rules defining which products are blacklisted/whitelisted for which customers.

This article is a collaborative effort of EPAM solution architects.

Problem definition

SAP Commerce has limited support for fine-grained product access control on large amounts of data. It is represented by Product Visibility, configured at the category level, which works only through the persistence layer and not through search and Apache Solr. Another approach is item-level access control. Both work fine when the amounts of data involved are relatively small. When it comes to millions of products and customers, the out-of-the-box solutions won’t fit.

There are many facets and specific details in the original task that we needed to take into account for the solution design. In this article, we discuss only one particular problem in isolation.

  • The Access Control Lists (ACL), the rules saying what product is whitelisted/blacklisted for what customers, are provided by the external system via data integration. The integration details are out of scope in the context of the article.
  • The key challenge is how to implement product search. For other components, the solution is trivial.
  • There are 1,000,000 products (P) in the product catalog.
  • There are 1,000 product groups (PG)
  • There are 30,000 customers (C)
  • There are 5,000 customer groups (CG).
  • There are 2,000,000 rules (C<->P, CG<->P, C<->PG, CG<->PG)

The goal is to work out an efficient way to store and handle the visibility rules so that a full reload completes as quickly as possible.


As you know, SAP Commerce Cloud is tightly integrated with Apache Solr. This search engine is used not only for full-text search but also for populating product listing pages. There is no easy way to implement the required functionality by reconfiguring SAP Commerce or Apache Solr.

Additionally, because of the cloud nature of the new SAP Commerce and the limitations that come with it, adding new third-party software capable of supporting document-level access control is also not an option.

Apache Solr, like many other search engines, uses an inverted index, a central component of almost all search engines and a key concept in Information Retrieval. Both full-text search and facets are built on top of the inverted index, and the limitations of search engines originate from the limitations of the inverted index.

The simplest and most straightforward approach is to list the relevant customers or customer groups in a designated product attribute and use it for facet filtering by putting a customer ID or customer group ID into a hidden facet. At the indexing phase, these IDs are treated by Solr as terms. However, it was obvious to us that such a straightforward approach wouldn’t work with millions of products and tens of thousands of customers and customer groups.
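
For illustration, the hidden-facet idea boils down to appending a filter query built from the current customer's identifiers; a sketch (the field name is a made-up example):

```python
def acl_filter_query(field, customer_id, group_ids):
    """Build a Solr filter query matching documents whose ACL field
    contains the customer ID or one of the customer's group IDs."""
    terms = [customer_id] + list(group_ids)
    return f"{field}:({' OR '.join(terms)})"

print(acl_filter_query("aclAllow_text_mv", "C12", ["CG23", "CG45"]))
# aclAllow_text_mv:(C12 OR CG23 OR CG45)
```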

In this document, we’ll use the abbreviation ACL (Access Control List) to represent a list of customers and customer groups that can access a product or product group. Products have an ACL associated with them. The list is non-ordered. There are separate lists for allow and disallow rule groups.

There are four topics we needed to study:

  1. ACL format
  • ACL items, their order, and format.
  2. Where to store the ACLs
  • Should we store the ACL field along with other product information?
  3. How to store the ACLs
  • What changes should we make to the Solr configuration?
  • How should the field type be configured in the Solr schema.xml for performance and scalability?
  4. What changes should we make in SAP Commerce? How scalable is the solution after making these changes?
ACL format

An ACL specifies allowed and disallowed customers as well as allowed and disallowed customer groups. Each customer or group is represented by a unique ID, up to eight characters in length.

The order of the items doesn’t matter.

There are two types of ACL:

  • whitelist
  • blacklist

So the simplest list can look like “C12,CG23,C45” (comma-separated) or “C12 CG23 C45” (whitespace-separated). In our tests, we used whitespace-separated strings.

How to store ACL

We experimented with two methods of storing ACLs for products:

  • in Apache Solr
  • in Redis

Both are elaborated below.

Apache Solr: StrField type vs TextField type

To store ACLs in Solr, we need to find the field type that would be best to store a simple list of IDs.

Apache SOLR out of the box provides two field types for text: solr.StrField and solr.TextField.

The major difference between them is that solr.StrField cannot have any tokenization, analysis or filters applied, and will only give results for exact matches.

SAP Commerce schema.xml defines two field types with those classes:

<fieldType name="string" class="solr.StrField" docValues="true" sortMissingLast="true"/>

<fieldType name="text" class="solr.TextField" positionIncrementGap="100">
  <analyzer type="index">
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <filter class="solr.RemoveDuplicatesTokenFilterFactory"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.SynonymFilterFactory" synonyms="synonyms.txt" ignoreCase="true" expand="true"/>
    <filter class="solr.StopFilterFactory" ignoreCase="true" words="stopwords.txt"/>
    <filter class="solr.LowerCaseFilterFactory"/>
    <filter class="solr.RemoveDuplicatesTokenFilterFactory"/>
  </analyzer>
</fieldType>

<dynamicField name="*_string" type="string" indexed="true" stored="true"/>
<dynamicField name="*_string_mv" type="string" indexed="true" stored="true" multiValued="true"/>
<dynamicField name="*_text" type="text" indexed="true" stored="true"/>
<dynamicField name="*_text_mv" type="text" indexed="true" stored="true" multiValued="true"/>
<dynamicField name="*_text_en" type="text_en" indexed="true" stored="true"/>
<dynamicField name="*_text_en_mv" type="text_en" indexed="true" stored="true" multiValued="true"/>
It is critical to draw attention to the attribute docValues="true" in the string type definition. According to the Apache SOLR documentation, docValues indicates whether SOLR should use a column-oriented approach, with a document-to-value mapping built at index time, instead of the standard row-oriented one. In other words, values of docValues fields are densely packed into columns instead of being sparsely stored like they are with stored fields.

This feature was added to Lucene 4.0 to improve performance for faceting, sorting and highlighting. The faceting engine, for example, needs to look up each term that appears in each document that makes up the result set and pull the document IDs in order to build a list of facets. Of course, DocValues consumes significantly more memory than the regular inverted indexed type.
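The difference can be illustrated in miniature: the inverted index maps term to documents, while docValues keeps a document-to-values column, which is what the faceting engine walks when counting facet values over a result set. A toy sketch (illustrative only, not Lucene's actual data structures):

```python
from collections import Counter

# Inverted index: term -> doc ids (good for finding matching documents).
inverted = {"red": {1, 3}, "blue": {2, 3}}

# DocValues view of the same data: doc id -> values, densely packed
# per document (good for faceting: look up each doc's values directly).
doc_values = {1: ["red"], 2: ["blue"], 3: ["red", "blue"]}

def facet_counts(result_set):
    """Count facet values over a result set using the doc->values column."""
    counts = Counter()
    for doc in result_set:
        counts.update(doc_values[doc])
    return dict(counts)

print(facet_counts({1, 2, 3}))
```

Without docValues, the same counting would require walking every term in the inverted index and intersecting its posting list with the result set, which is what the column-oriented layout was introduced to avoid.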

Before saying how we used DocValues in our final solution, let’s have a look at the tests and experiments we conducted to get more inputs and insights.

Load Tests

The purpose of this test was to get a ballpark estimate of SOLR indexing performance on a large data set with different text field types. We generated a test set with random ACL field values and indexed it using curl (see Uploading data with index handlers) on a regular MacBook. We needed a rough estimate and relative numbers. We used the standard SAP Commerce schema and server configuration (including JVM settings).

The setup includes:

  • 2.2 GHz Intel Core i7, 16Gb, MacOS X
  • 1,000,000 products
  • SOLR attribute containing a comma-separated list of customers/customer groups
    • Allowed to access the item
    • Not allowed to access the item

// Technically, for the simplified task, we need only one list, either allowed or not allowed. We used both of them in our tests because the actual business need of the client is more complex than the task explained above. The challenge originates from the fact that the same customer, customer group, or their subgroups can appear in both lists, which creates a new layer of complexity. One rule needs to be prioritized over another, and constraints against both lists need to be applied at the query phase using the full power of the SOLR filtering engine. We'll come back to this point later in the article.
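At the query phase, such constraints map naturally onto Solr filter queries: require the current user's ids in the allow list and exclude documents where they appear in the deny list. A hedged sketch for a customer C12 belonging to group CG23 (the field names are illustrative, not the actual schema):

```text
fq=aclallowed_acl:(C12 OR CG23)
fq=-acldenied_acl:(C12 OR CG23)
```

The first filter keeps only products whose allow list mentions the user or one of their groups; the second drops products where the deny list mentions them. Prioritizing one rule type over the other is a matter of how the two filters are combined.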

Conceptually, we decided to generate and load the following structure:

Product ID   Customers and groups allowed   Customers and groups not allowed
Product1     C3, C5, CG1, CG6               C2, C4
Product2     C1, CG3, C5                    C2, CG6
Product3     CG1, C2                        C3

  • The number of items in each list is random, from 0 to 1000.
  • The customer and customer group IDs are random, from 0 to 10000.
  • Product names and codes are random and unique.
  • Product IDs are random and unique.

Each element of the dataset was a JSON document carrying the product code and name plus the two ACL fields.
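The generation step can be sketched as follows (plain Python; the field names are assumptions for illustration, since the original listing was not recoverable):

```python
import json
import random

def generate_product(i, max_acl_items=1000, max_id=10000):
    """Generate one product document with random allow/deny ACL strings."""
    def random_acl():
        # Random list length (0..1000) of mixed customer (C) and
        # customer-group (CG) ids, whitespace-separated as in the tests.
        n = random.randint(0, max_acl_items)
        return " ".join(
            f"{random.choice(['C', 'CG'])}{random.randint(0, max_id)}"
            for _ in range(n)
        )
    return {
        "id": f"product{i}",                 # unique product code
        "name_text_en": f"Product {i}",      # unique product name
        "aclallowed_acl": random_acl(),      # customers/groups allowed
        "acldenied_acl": random_acl(),       # customers/groups not allowed
    }

dataset = [generate_product(i) for i in range(1, 4)]
print(json.dumps(dataset[0], indent=2))
```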

The tests showed the following results:

Field type                                                  Solr.TextField     Solr.StrField
Loading the whole dataset                                   1388 items/sec     Out-of-memory
Loading dataset in 20000-item chunks into an empty index    1333 items/sec     Initially 2000-2500 items/sec, then out of memory!

As we expected, the initial processing might be slightly faster with docValues, but it consumes a significantly larger amount of memory and even ended up with an out-of-memory error.

‘Multivalued’ vs space-delimited field

The next challenge is how to represent a list of user/user group IDs. As Apache SOLR documentation says, there are two major options:

With the ‘multivalued’ option, the list is represented explicitly, as a list of string items. No tokenizers are needed, because each item contains an ID and nothing more.

With the Text option, during indexing Solr splits the list-as-a-string into a list of terms using a tokenizer and then runs it through the configured filters (stemmers, duplicate removers, etc.). After that, the list of terms is turned into inverted indexes (or column-oriented structures, depending on the field type configuration). In effect, with this option SOLR transforms one string value into a list of string values on behalf of the application logic.
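To make the difference concrete, the same ACL can be shaped in a document either way: the multivalued field is an explicit list of items, while the text field is one string that the analyzer splits into the same terms (field names are illustrative):

```json
{
  "id": "product1",
  "acl_string_mv": ["C12", "CG23", "C45"],
  "acl_text": "C12 CG23 C45"
}
```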

Based on the fact that SOLR uses Apache Lucene under the hood, and that Lucene doesn't support multivalued fields natively (but allows multiple fields with the same name), we assumed that the multivalued approach is slower. However, with the multivalued type you can add or remove a single user/user group ID in the ACL field without listing the others in the SOLR update request (see Atomic Updates). Of course, this doesn't mean that SOLR re-indexes only the new values; SOLR needs to update its index for the whole field. We assumed that using Atomic Updates would help us simplify the update operation. It is a matter of convenience, not performance.

It is worth pointing out that Atomic Updates can still be applied to the TextField type too. When the whole list (a field value) is provided, there is a way of reindexing a single field rather than the whole document (see Updating parts of documents).
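For reference, the corresponding atomic update payloads look roughly as follows (field names are illustrative): `add` and `remove` operate per item on a multivalued field, while a plain text field can only be replaced wholesale with `set`:

```json
[
  {"id": "product1", "acl_string_mv": {"add": "C99"}},
  {"id": "product1", "acl_string_mv": {"remove": "C12"}},
  {"id": "product1", "acl_text": {"set": "CG23 C45 C99"}}
]
```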

Load and Update tests

This experiment was aimed at putting the arguments above to the test. Additionally, we added a new field type: Solr.StrField with multiValued on and docValues disabled.

The setup was the same as in the previous experiment.

Field type                                                         Space-delimited   Multivalued Solr.TextField   Multivalued Solr.StrField (docValues=false)
Loading the whole dataset                                          1388 items/sec    Out-of-memory                980 items/sec
Loading dataset in 20000-item chunks into an empty index           1333 items/sec    500 items/sec                1111 items/sec
Atomic update: removing items from the list (even groups, ~50%)    N/A               444 items/sec                645-700 items/sec
Atomic update: adding one item to the list                         N/A               606 items/sec                1333 items/sec
Atomic update: removing one item from the list                     N/A               476 items/sec                1333 items/sec
Atomic update: replacing the list with a 50% shorter version       1300 items/sec    N/A                          1333 items/sec
As the results show, replacing the whole ACL field takes roughly the same time as adding or removing items via the Atomic Updates functionality. The ‘multivalued’ option is also two times slower for the initial data load. We saw no clear benefits of the multivalued type over the space-delimited text. The “convenience factor” seems to be the only reason to use multivalued, but performance was more important to us.

Optimizing SOLR and OOTB Search Server Configuration

In the previous test, we used the OOTB configuration, which is expectedly not optimized for such volumes and data model.

From the whole list of properties on the Field type definitions and properties page, the following ones deserve attention:

  • Stored. “If true, the actual value of the field can be retrieved by queries.”

If this parameter is set to true, you will be able to retrieve the field data when you search. “Stored” field values are available for display or return with the Solr response while “Not stored” exist only as a term. Changing to “not stored” decreases memory consumption and I/O operations both during indexing and query time. However, troubleshooting and debugging will be more challenging.

  • OmitTermFreqAndPositions. “If true, omits term frequency, positions, and payloads from postings for this field. This can be a performance boost for fields that don’t require that information. It also reduces the storage space required for the index. Queries that rely on a position that is issued on a field with this option will silently fail to find documents.”.
  • omitNorms. “If true, omits the norms associated with this field (this disables length normalization for the field, and saves some memory). Defaults to true for all primitive (non-analyzed) field types, such as int, float, data, bool, and string. Only full-text fields or fields that need an index-time boost need norms.”

Since we need just to check a specific user or user group ID against the ACL lists, there is no need to store frequency and position. This should also improve memory consumption as well as indexing time.

Additionally, the index pipeline should be simplified as much as possible. This can be done by specifying analyzers in the type definition. If users and user groups are specified as items in a whitespace-delimited string, you need only one tokenizer to split the string into parts: the WhitespaceTokenizer.

Based on the ideas above, we came up with the following SOLR field type configuration:

<fieldType name="acl_text" class="solr.TextField" positionIncrementGap="0" omitNorms="true" omitTermFreqAndPositions="true">
  <analyzer>
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.RemoveDuplicatesTokenFilterFactory"/>
  </analyzer>
</fieldType>

<dynamicField name="*_acl_nonstored" type="acl_text" indexed="true" stored="false"/>
<dynamicField name="*_acl" type="acl_text" indexed="true" stored="true"/>

Load tests

This time, the tests were run in AWS (r5x.large, 4 vCPU, 32Gb) using the SolrJ library. SolrJ provides multi-threaded data load and uses the CPU more efficiently than the simple curl-based loader. We used a dockerized Apache SOLR 7.2.1 with -Xmx=2Gb. The enhanced field type was added to SAP Commerce's out-of-the-box SOLR configuration.
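The tests used SolrJ; purely for illustration, the same chunked, multi-threaded loading pattern can be sketched in Python with the standard library (the Solr URL, core name, and chunk/thread counts are assumptions mirroring the setup described above):

```python
import json
import urllib.request
from concurrent.futures import ThreadPoolExecutor

SOLR_UPDATE_URL = "http://localhost:8983/solr/acltest/update"  # assumption

def chunks(docs, size):
    """Split the document list into fixed-size chunks."""
    for i in range(0, len(docs), size):
        yield docs[i:i + size]

def post_chunk(chunk):
    """POST one chunk of documents to the Solr update handler."""
    req = urllib.request.Request(
        SOLR_UPDATE_URL,
        data=json.dumps(chunk).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def load(docs, threads=4, chunk_size=20000):
    """Load all chunks in parallel, mirroring the 4-thread setup."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        return list(pool.map(post_chunk, chunks(docs, chunk_size)))
```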

The size of the generated JSON file with 1M products and up to 3,000 ACL elements per product was about 8Gb.

Dataset (1M products)   *_text_en, curl, 1 thread   *_text_en, solrj, 4 threads   *_acl_nonstored, solrj, 4 threads
500 ACL elements        42 min                      31 min                        5.3 min
1000 ACL elements       86 min                      57 min                        6.5 min
2000 ACL elements       163 min                     111 min                       9 min
3000 ACL elements       252 min                     168 min                       12 min

The results showed that:

  • The enhancements help a lot: the load time is up to 20x faster with the non-stored _acl fields.
  • Multi-threaded data load is about 1.5x faster than single-threaded.
  • Load time grows with the number of terms (ACL elements) per document, but the growth is smallest for the optimized field.

Last week, the 30th annual SAP SAPPHIRE NOW conference took place May 7-9 in sunny Orlando, FL. This year, it was three in one: SAPPHIRE NOW, the ASUG conference, and CX LIVE. Together they brought no fewer than 30,000 attendees to the Orange County Convention Center in Orlando. More than 2,400 sessions were held during the three days, spread throughout more than 1.5 million square feet of exhibition space, meeting rooms, and halls. Here's a look at a few of the conference highlights.

In 2019, for the first time, SAP CX Live took place concurrently with the SAPPHIRE NOW/ASUG Annual Conference. The events were hosted in different corners of the Orange County Convention Center in Orlando‎, the second largest convention facility in the United States. That is a big venue, but that's even better: it helped me hit my goal of 12k steps a day. However, a number of 15-minute walks with a heavy backpack made me think twice about whether I really wanted to cross the convention center to join sessions at the other end.

These conferences were completely different in terms of atmosphere and tempo. SAPPHIRE NOW was traditionally overcrowded, while CX LIVE was cozy and quiet.

Hundreds of exhibitors were segregated between the events on the basis of proximity to the Customer Experience topic. Everything that is related to the CX portfolio was grouped in the same location.

The event has attracted 30,000 people onsite from 100 countries, and over one million online. Even in far corners of the city, you could see the local bars with branded signs welcoming the guests of SAP.

  • Xs and Os. The main leitmotif and throughline of the conference. How important it is to combine both experience data and operational data (x+o) to have full situational awareness in your company
  • SAP Qualtrics. Ten new offerings to measure and improve the four core experiences of business – customer, employee, product, and brand.
  • SAP C/4HANA Foundation – a new Kyma-based orchestration layer that enables administrators and developers to flexibly use enterprise applications and extensions from SAP.
  • SAP HANA Cloud Services and Data Warehouse Cloud. The single gateway that allows you to bring all SAP and non-SAP data together in a single logical data fabric residing across distributed data environments.
  • SAP Intelligent Robotic Process Automation. The automation suite where bots imitate humans by replacing mouse clicks, interpret text-heavy communications or make process suggestions to end-users for definable and repeatable business processes.
  • SAP Data Intelligence. The comprehensive cloud solution to scale AI & ML across the enterprise. The product is used for coordination and cooperation between business users and their customers, data engineers, and data science teams.
  • Partnership with Apple. Apple and SAP join efforts to make mobile experience even better. CORE ML will be used natively by SAP Cloud Platform SDK.
  • Partnership with OpenText. The OpenText content services will be delivered through SAP Cloud Platform and other SAP solutions.
  • SAP Customer Data Platform. SAP unveiled plans to evolve and complement SAP Customer Data Cloud (ex-Gigya) and SAP Marketing Cloud to deliver a full-scale enterprise-grade customer data platform.
  • Project “Embrace”. Microsoft and SAP join efforts in “Project Embrace”, a collaboration program with public cloud providers and Global Strategic Service Partners (GSSPs)
  • Training Cloud for SAP Litmos. The training cloud for the SAP Litmos is now part of the CX portfolio.
  • Start-ups
    • From SAP CX Labs. KOI, a prototype of decentralized authentication based on blockchain. PoCs from SAP CX Labs: Codename ANTWERP, Yo!AvatART, Hands-Free Field Service Maintenance.
    • Constructor.io. The search-as-a-service start-up that uses AI & ML to increase usability and efficiency of the product search and make it personalized based on such data as customer behavior, product catalogs, product-related metadata, price changes, and purchases.
    • OTO.ai. The technology to measure the emotion behind words and get a much better understanding of the intent without performing a resource-intensive voice to text transcription
    • Rapitag. The security tag that unlocks automatically once the payment gets through
    • Ruum for SAP. Project management and collaboration for SAP
    • ThreeKit. This start-up provides a way to embed realistic 3D images within the commerce platform that allows buyers to rotate, flip, scale and look at a product from different angles.
    • Adverity. The platform for marketing data integration, a set of pre-built connectors to marketing data sources, such as social networks and web analytics.
  • Celebrity Guests. Karlie Kloss, Sandra Bullock, and Lady Gaga.
Xs and Os

This year, the main leitmotif and throughline of the conference was the concept of X-data and O-data.

The ‘X’ stands for Experience data and brings data on how someone feels and what is the context, the emotional component, and the ‘O’ stands for Operational data, such as costs, accounting, and sales.

“Experience is now the organizing principle of the global economy,” said Bill McDermott, SAP CEO, in his day-one keynote at SAPPHIRE NOW. “Every CEO I meet is trying to solve the experience gap… the difference between what people expect and what they actually receive. […] SAP was already the richest source of operational O-data, but we didn’t have experience X-data. The SAP must be the platform that combines X-data and O-data. X- plus O- [data] gives you the ability to deliver true personalization at mass scale so you can bridge the experience gap.”

Actually, this simple idea of bringing both experience data and operational data together resonates with me. It is a right and timely message and strategy for SAP. Putting the customer at the center of any solution is not a new strategy, but a strong foundation for success.

“We want everyone to remember their Xs and Os,” said SAP CEO Bill McDermott in his inspirational keynote. “Experience Management is the new frontier for the world’s best-run businesses. I have never seen SAP more fired up to help our customers be a driving force for growth, innovation, and optimism.”

This movement to a more user-focused approach is supported by SAP's recent $8 billion acquisition of Qualtrics, the company famous for its solutions for tracking consumer sentiment. It is the most important and far-reaching announcement.

SAP Qualtrics

Qualtrics, founded 17 years ago, started as a survey company. Today it provides product research and customer surveys to more than 9,000 customers globally, including over 75% of the Fortune 100 companies. The deal became the largest acquisition of a VC-backed company globally and the second largest in SAP's history (CBInsights).

With survey automation at its core, the company has enriched its services with powerful analytics and visualization capabilities and machine-learning techniques that automatically highlight key drivers and give recommendations. At the core, Qualtrics has a platform for building complex omnichannel surveys, integrating them into the business process, executing them when events occur, collecting the results, analyzing the feedback, generating insights and recommendations, and visualizing the results in different forms to let analysts discover insights on their own based on the data.

You can measure and collect NPS (Net Promoter Score) over time and, based on that, make predictions about customer behavior, such as whether they are at risk of churn. The system can measure and track brand awareness, brand loyalty, and brand associations, and predict sales growth based on the collected data. Ad testing tools help you validate advertising concepts and creative execution by testing them with a sample of the target audience.

SAP announced 10 new offerings that combine experience data (X-data) with operational data (O-data) to measure and improve the four core experiences of business – customer, employee, product, and brand.

Among the solutions announced, the following from SAP and Qualtrics seem most interesting for the readers of Hybrismart:

  • SAP Qualtrics Employee Engagement – analyzes open-text responses and runs statistical analyses to predict the biggest engagement and impact drivers within an organization, such as company benefits or manager-employee relationships.
  • SAP Qualtrics Employee Lifecycle – creates and helps understand personalized employee experiences by triggering surveys at specific milestones, such as first-day onboarding, training, promotion, exit and more.
  • SAP Qualtrics Employee Benefits Optimizer – simulates the benefits trade-offs employees are willing to make, offers guided configurability, and delivers real-time trends and reports on the most impactful and optimal benefits packages — all without the need for a data scientist or the constant remodeling required by current solutions.

McDermott introduced co-founder and CEO of Qualtrics, Ryan Smith. After the acquisition, Ryan will continue to run the company within SAP’s larger cloud business group. “We live in the experience economy where organizations are either intentionally racing to the top or unknowingly racing to the bottom,” said Ryan. “The difference between the companies that will win is they understand how X-data and O-data work together to tell the story of what is happening in an organization, why it’s happening and how to act in real time to deliver breakthrough business results.”

SAP C/4HANA Foundation

SAP announced SAP C/4HANA Foundation, an orchestration layer that enables administrators and developers to flexibly use enterprise applications and extensions from SAP. It provides observability and application-management tools, helps to profoundly understand customers' businesses in order to offer individually customized products and services, offers a powerful extensibility framework, and, eventually, makes it convenient and quick to implement cloud solutions from SAP.

The C/4HANA Foundation product management group nominated three projects to be released in the first cycle of 2019:

  • SAP C/4HANA cockpit
  • Konduit, Event-Based Data Distribution for SAP C/4HANA 
  • Cloud Acceleration

For example, in the SAP C/4HANA cockpit (https://cockpit.cx.cloud.sap/), a part of C/4HANA Foundation available today, you can keep track of your entitlements, view implemented solutions and subscribed/purchased applications from the C/4HANA stack, manage user authorizations. By the end of the month, SAP C/4HANA Foundation will be automatically deployed for 1700+ SAP C/4HANA customers free of charge.

SAP C/4HANA Foundation is based on Kyma, an open-source project designed natively on Kubernetes. It is built natively on Google Cloud Platform. Over the next few months, SAP plans to extend its integration to other IaaS providers as well as its own SAP Cloud Platform.

As a side note: Project Kyma is now freely available in the Google Cloud Platform Marketplace.

Useful links:

SAP HANA Cloud Services and Data Warehouse Cloud

SAP announced new SAP HANA Cloud Services, a single gateway for all enterprise data. The idea behind the system is to bring all SAP and non-SAP data together in a single logical data fabric residing across distributed data environments. The goal is to provide one environment to manage data both on-premises and in the cloud.

This approach opens the HANA database for more widespread use. “SAP HANA is too good of a database just to lock it inside SAP enterprise applications,”  SAP founder and chairman Hasso Plattner said during the keynote. “HANA is for everyone. Now, with SAP HANA Cloud Services, by making it low TCO, delivered as a service, virtually everyone can get started immediately with their SAP HANA journey,” added Gerrit Kazmaier,  senior VP of SAP HANA and Analytics.

Hasso Plattner shared the latest numbers on SAP HANA:

  • 50,000+ customer licenses sold
  • 200+ peer-reviewed academic publications by SAP and HPI on in-memory databases for OLTP and OLAP
  • 72 terabytes – the largest customer on scale-out
  • 48 terabytes – the largest customer on a single node
  • 100+ million transactions processed daily for a single customer
  • 7 petabytes of SAP HANA licensed productively

As Kazmaier noted in his article, “With SAP HANA at the core, the new solution will be built on the fastest in-memory computing engine. By making this engine elastic, it will be possible to scale out indefinitely, enabling customers to add more resources as they require — and reduce them when they don’t need them anymore. Building on technologies such as SAP IQ, SAP HANA Cloud Services will also offer an SQL data lake as a storage layer at minimum cost.”

SAP Data Warehouse Cloud is the first application planned to be built with HANA Cloud Services. It unites heterogeneous data in a single solution while maintaining security and trust.

Find out more on SAP Data Warehouse Cloud at http://www.sapdatawarehouse.cloud/

Useful links:

SAP Intelligent Robotic Process Automation

SAP Intelligent Robotic Process Automation (IRPA) is an automation suite where bots imitate humans by replacing mouse clicks, interpreting text-heavy communications, or making process suggestions to end users for definable and repeatable business processes. It helps to automate recurring work and dramatically save costs and time. You can create a chatbot and train it to ask questions and process natural-language replies. The results can be posted by the bot automatically into the SAP system by filling out forms and entering the data collected from the interaction.

The solution is based on the robotic process automation software from Contextor SAS, acquired by SAP at the end of 2018.

Find out more:

SAP Data Intelligence

SAP Data Intelligence is a comprehensive cloud solution to scale AI & ML across the enterprise. The product is used for coordination and cooperation between business users and their customers, data engineers, and data science teams. Using SAP Data Intelligence, customers will be able to manage their data; data scientists — algorithms with machine learning models, tools, and frameworks; and engineers — deployments of AI & ML tools in the cloud.

The product also allows them to adapt the latest trends and technology by combining the SAP Data Hub solution (don’t confuse that with 


When it comes to overall website usability and providing a good user experience to your site visitors, the login component plays a very important role. This article explains how it should be built to provide the best customer experience.

The topic is purposely narrow. I am focusing on how to implement login forms and the authentication process in e-commerce systems built on top of SAP Commerce Cloud. In fact, the general recommendations and findings are applicable to other platforms too.

What makes a good sign-in for your customers? There are two pillars: good security and good user experience. Backed by the platform's and accelerator's built-in features, many developers and product owners prefer to minimize the number of customizations and stick to the out-of-the-box implementation. This strategy works well only if you need to release your product in the shortest time.

From the authentication perspective, best practices recommend offering users the ability to check out as a guest and/or forcing or inviting them to register before or after completing the purchase. Fundamentally, customer authentication matters for a customer's subsequent visits. Very often, customers don't see any benefits in being authenticated. Many of them prefer to check out as a guest and minimize the information they share about themselves. That is why it is getting more and more important for merchants to respect and protect customer data.

For B2C websites, forcing users to register first may lead to a decrease in conversion rates, because it breaks the buying process. In B2B solutions, it is a primary and widely used option because normally guest users are not involved in the B2B process at all.

The login form is a very important piece of the interface, and implementing it properly is crucial for success.

You may ask: how can one make a mistake in such a simple thing? In this research, I analyzed different implementations and technologies, common mistakes, and strategies.

  • Overview
    • Session Handling Methods used with Authentication
      • Session-based authentication (stateful)
      • Token-based authentication (stateless)
    • Authentication methods
      • Certificate-Based Authentication
      • Password-based authentication
      • Passwordless authentication
    • Authentication stacking methods
      • Single Factor Authentication (SFA)
      • Two Factor Authentication (2FA)
      • Multi-Factor Authentication
    • Authentication in distributed systems
      • Single Sign-On (SSO)
  • Implementation
    • Login Form in SAP Commerce
    • Authentication Flow in SAP Commerce Cloud
    • Remaining Logged In / Remember Me
    • More details on j_spring_security_check implementation in hybris
      • Csrf Filter
      • Logout Filter
      • Authentication Processing filter
      • RememberMe Authentication Filter
      • Anonymous Authentication Filter
    • Importance of HTTPS
    • Feedback Messages
    • Password Strength
    • Brute Force Attack Protection
    • Storing Passwords
    • Third-party Identity Providers and Identity Federation
    • Browser APIs for authentication
Overview

Session Handling Methods used with Authentication

Let’s start with a short overview of authentication methods from the perspective of session handling methods on the Web.

Session-based authentication (stateful)

A session is defined as the period during which a user is actively logged onto a server, tracked with a cookie. In session-based authentication, the user state is stored in the server's memory; that is why it is also called stateful.

The typical flow is the following:

  • User enters the login and password.
  • Server authenticates the user, generates a random token, and sends it to the browser/client app.
  • Server saves the token in the database or in memory.
  • Browser/client app receives the token and stores it in cookies/local storage. It sends this token with each subsequent request, along with other relevant cookies.
  • Server uses the token to authenticate the user and returns the requested data to the browser/client app.
  • When the user logs out, the token is removed from cookies/local storage; subsequent requests won't carry it, and the server won't return protected information until the user is authenticated again.
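The flow above can be sketched as a minimal server-side session store (plain Python; the credential store and field names are illustrative):

```python
import secrets

USERS = {"alice": "s3cret"}   # demo credential store (illustrative)
SESSIONS = {}                 # server-side state: token -> session data

def login(username, password):
    """Authenticate and create a server-side session; return the token."""
    if USERS.get(username) != password:
        raise PermissionError("bad credentials")
    token = secrets.token_hex(16)   # random, opaque session id
    SESSIONS[token] = {"user": username}
    return token

def authenticate(token):
    """Every request: resolve the cookie token against server memory."""
    return SESSIONS.get(token)

def logout(token):
    """Revocation is trivial: drop the server-side entry."""
    SESSIONS.pop(token, None)
```

Note that the token itself carries no information; all state lives in `SESSIONS`, which is exactly why revocation is easy and why scaling this dictionary across a cluster becomes the hard part.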

The advantages of this approach are

  • The session can be revoked anytime
  • Session parameters (such as user permissions) can be changed anytime
  • The solution is easy to implement, especially for a single-server setup

However, this approach doesn’t work well for a very large number of concurrent sessions. As the number of logged-in users grows, you need more and more server resources. Scaling and distributing the sessions over a cluster introduces new challenges, such as supporting sticky or distributed session management. This type of authentication has been elaborated well, and for every case there is a solution, but every complication creates a bunch of new challenges.

SAP Commerce Cloud uses session based authentication for customer sign in and session management.

Token-based authentication (stateless)

Token-based authentication addresses the disadvantages of session-based authentication. In brief, the user session data is stored on the client side, in the browser. The data is signed with the identity provider's (IdP) key to ensure the integrity and authenticity of the session data.

The process is the following:

  • User enters the login and password.
  • Server authenticates the user, generates a signed token (usually a JWT, pronounced ‘jot’), and sends it to the browser/client app. Unlike in session-based authentication, the token contains additional information, such as user_id and permissions. Importantly, the token is cryptographically signed, so changes made to it by a third party are easily detectable.
  • Browser/client app receives the token and stores it in cookies/local storage. It sends the token with each subsequent request in the Authorization header, in the form Bearer {JWT}.
  • Server decodes the token and uses the information from it to authenticate the user and return the requested data to the client application.
  • When the user logs out, the token is removed from the browser/client app.

With this approach, the session data is effectively distributed. It allows storing more information about the user on the client side. Token generation is decoupled from token verification, which allows you to handle the signing of tokens on a separate server or service.
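The sign-and-verify mechanism described above can be sketched with plain HMAC-SHA256 (hypothetical class name; production code should use a JWT library rather than hand-rolling this):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

// Sketch of issuing and verifying an HMAC-signed token, the idea behind JWT.
public class TokenSketch {

    // Issue a token: base64url(payload) + "." + base64url(HMAC-SHA256 signature).
    public static String issue(String payload, String key) {
        String body = encode(payload.getBytes(StandardCharsets.UTF_8));
        return body + "." + sign(body, key);
    }

    // Verification recomputes the signature over the payload part; any change
    // made to the payload by a third party produces a different signature.
    public static boolean verify(String token, String key) {
        int dot = token.lastIndexOf('.');
        if (dot < 0) return false;
        String body = token.substring(0, dot);
        String sig = token.substring(dot + 1);
        // Constant-time comparison to avoid timing side channels.
        return MessageDigest.isEqual(sig.getBytes(StandardCharsets.UTF_8),
                                     sign(body, key).getBytes(StandardCharsets.UTF_8));
    }

    private static String sign(String data, String key) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(key.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
            return encode(mac.doFinal(data.getBytes(StandardCharsets.UTF_8)));
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    private static String encode(byte[] bytes) {
        return Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
    }
}
```

Note that the payload is only encoded, not encrypted: anyone can read it, but nobody without the key can forge a valid signature.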

However, you can’t easily revoke a session with token-based authentication. Changing user data is challenging too. For example, suppose you want to add a new user session property: all new tokens will include this property, but the existing tokens will keep carrying the obsolete set until the session is re-established.

In SAP Commerce Cloud, token-based authentication is not used.

Authentication methods

The methods explained below can be used in different combinations. When the user is authenticated, the server initiates a session using the approaches explained above.

Certificate-Based Authentication

This authentication method is based on SSL certificates.

  • User enters private-key password
  • Browser signs the request with a digital signature
  • Browser sends certificate and digital signature across network
  • Server uses certificate and digital signature to authenticate the user’s identity
  • Server authorizes access if the signature is correct and the certificate is valid

Password-based authentication

This authentication method is based on the ‘secret’ pair of the login and password.

  • Browser’s dialog box requests the user’s name and password from the user.
  • User provides a name and password
  • Browser sends the name and password across the network
  • Server looks up the name and password in its local password database and, if they match, accepts them as evidence authenticating the user’s identity.
  • Server allows access if the identified user is permitted to access the requested resource.

Passwordless authentication

With passwordless authentication, the user’s identity is verified without asking the user to provide a password. This method assumes that only the user has access to a personal physical device that is uniquely addressable and can communicate securely with the server over a distinct communications channel, referred to as the secondary channel.

  • User enters an e-mail address, phone number, or id in the authentication app
  • Server sends a one-time-use link or code to that e-mail address, phone number, or authentication app. The user clicks the link or enters the code received from the system; essentially, the trigger is the code returned back to the app. This operation can be done only if the user confirms the authentication intent on the personal device or in the mail client. The system assumes that you will get the login link from your inbox, or a code from the text message, only if the e-mail address provided is indeed yours.
  • User is automatically logged in to the website/application.
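The one-time code sent over the secondary channel can be sketched as follows (hypothetical class name; real systems also store the code server-side, rate-limit attempts, and invalidate a code after a single use):

```java
import java.security.SecureRandom;

// Sketch of the one-time code used in passwordless authentication.
public class OneTimeCode {
    private static final SecureRandom RANDOM = new SecureRandom();

    // Generate a 6-digit code, zero-padded ("042137" is valid).
    public static String generate() {
        return String.format("%06d", RANDOM.nextInt(1_000_000));
    }

    // The code is accepted only if it matches and is still within its
    // validity window (ttlMillis after it was issued).
    public static boolean accept(String expected, String provided,
                                 long issuedAtMillis, long nowMillis, long ttlMillis) {
        return expected.equals(provided) && nowMillis - issuedAtMillis <= ttlMillis;
    }
}
```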

Authentication stacking methods

Single-Factor Authentication (SFA)

The most common form of single-factor authentication is the password-based authentication explained above.

Two Factor Authentication (2FA)

Two-factor authentication, also called two-step verification, is an upgrade to password-based authentication that adds an extra step, the second factor, to the sign-in process.

The most common form of two-factor authentication mixes password-based authentication with a passwordless approach using SMS. This assumes that the user has a personal device, a mobile phone, nearby each time the system requests the second verification.

Multi-Factor Authentication

With this method, the authentication credentials are combined from

  • Something the user knows (a password or a PIN)
  • Something the user has (customer’s device)
  • Something the user is (biometric verification)

Authentication in distributed systems

Single Sign-On (SSO)

The idea of Single Sign-On is the ability to use a single authentication to gain access to a number of websites. Users are authenticated in a single location, and tokens are then created which downstream web applications can use for authorization.

In this method, there is a central service which orchestrates the single sign-on between multiple clients. When a user first logs in, the central service creates a token, which persists with the user as they navigate to the various websites integrated with the same SSO service.

  • User enters the login and password
  • Server authenticates the user and generates a token, but the token is stored on the SSO domain, which is different from the app domain.
  • Server uses the token from the SSO domain to authenticate the user trying to access a particular app.

Social Sign-in is a form of SSO that uses existing account information from social networks such as Facebook, Twitter, or Google+ to sign in to a third-party website instead of creating a new account specifically for that website.

There is no SSO implementation for customers in the default SAP Commerce Cloud. The out-of-the-box support for SAML-based SSO was designed for Backoffice users; the extension for that is provided with Commerce Cloud.

By the way, I published an article on Okta/Hybris Integration to support customer SSO: https://hybrismart.com/2016/06/15/hybrisokta-sso-integration/

Implementation

Login Form in SAP Commerce

The login form in the default SAP Commerce is very basic. It contains three essential form controls:

  • Input box j_username for a customer id which is a customer’s e-mail
  • Input box j_password for a password
  • a submit button

The form action is “/j_spring_security_check”. This login processing URL is specified in the configuration.

There is also a “Forgotten password” link. The form doesn’t contain a “Remember Me” checkbox.


  • Consider adding a required attribute to the input fields. This way, a customer won’t be able to submit the form if the value of an input is empty.
  • Consider adding a minlength attribute to the password field to improve in-browser validation.
  • Consider adding an aria-describedby attribute to provide an accessible description for the username and password fields.
  • Consider adding type="email" if the login is supposed to be an e-mail address. With this type, the on-screen keyboard will be automatically adjusted for entering an e-mail address.
  • Consider adding an autocomplete attribute to both fields so the password manager will offer autocompletion.
  • If the username is an e-mail address, tell that to the customer in a label next to the input.
  • Make sure a focused element is always visibly highlighted. Although browsers outline focusable elements by default, you might want to apply your own custom CSS style using the :focus selector.
  • Keep the keyboard navigation order logical and intuitive. It should follow the visual flow of the page: left to right, top to bottom.
  • Use only elements which can receive keyboard focus. See https://www.w3.org/TR/wai-aria-practices/#keyboard and https://webaim.org/techniques/keyboard/ for details.
  • To avoid mistyping:
    • Tell users if caps lock is on.
    • Let users see their password by providing a “show password” checkbox.
    • Show password constraints upfront and update them in real time.

Authentication Flow in SAP Commerce Cloud

The default SAP Commerce storefront is built on top of Java Server Pages and the Spring Framework. Almost everything a developer should care about regarding login forms, authentication, and authorization of system activity involves Spring Security. SAP Commerce comes as a preconfigured solution implementing a common scenario of customer authentication.

For the sake of consistency, let’s overview the basic authentication flow.

In the SAP Accelerator, “/login” is configured as the login page URL. The website has a number of protected pages, listed in the configuration. When an anonymous user attempts to access a protected page and Spring Security finds the user not authenticated, the system redirects them to the login page. Of course, the login page can also be accessed directly by visiting “/login”, the link behind the “Sign in” button.

With the first request to the server, the system initiates a session. The session is stored at the application server level. The application server, Apache Tomcat, sends the unique identifier of the session back to the browser. This identifier is a hash key into the server’s map of active sessions and is stored in the JSESSIONID cookie.

On the login page, the user is challenged to provide login credentials. When the form is submitted, the server verifies the login and hashed password against the database. If a match is found, the system saves the information into the HTTP session and returns a cookie, acceleratorSecureGUID, with a random value. This cookie indicates that the customer is logged in.

The next time the user requests a page, the browser sends JSESSIONID and acceleratorSecureGUID in the request headers. The server identifies the session, extracts the acceleratorSecureGUID from it, and compares that value with the one received in the request. If there is a match, the system checks whether the logged-in user from the session has enough permissions to access the page. Session variables are used to store the user-specific session parameters.

The login processing URL is specified in the configuration. By default, it is “/j_spring_security_check”. This URL is mapped to UsernamePasswordAuthenticationFilter to serve the requests. The filter reads two parameters from the form; in the SAP Commerce configuration, these are j_username and j_password. Depending on the authentication status, the Spring Framework executes one of two methods:

  • Failure: …yacceleratorstorefront.security.LoginAuthenticationFailureHandler.onAuthenticationFailure(req, resp, exception)
  • Success: …acceleratorstorefrontcommons.security.GUIDAuthenticationSuccessHandler.onAuthenticationSuccess(req, resp, authentication), which:
    • Sets a cookie acceleratorSecureGUID with a random GUID.
    • Calls the default StorefrontAuthenticationSuccessHandler, which sets up the session customer, merges the carts and session parameters, and publishes the login event. The user is redirected to the target URL (or the default one if none).

For logging out, there is a dedicated URL, “/logout”. It is mapped to StorefrontLogoutSuccessHandler. When logging out from a restricted page, the user is returned to a home page.

Read more on Spring Security in SAP Commerce: https://help.sap.com/viewer/4c33bf189ab9409e84e589295c36d96e/1811/en-US/8aef3efe8669101481a0ffe871a2f84c.html

Remaining Logged In / Remember Me

The concept of the “remember me” feature is that the authenticated state persists beyond the immediate scope of use. Customers can close the browser, turn off their PC, then come back later, and the web shop still knows who they are and offers them all the same features they had when they left. In the login form, this functionality is represented by the “Keep me logged in” checkbox.

In SAP Commerce Cloud, the default login form doesn’t have a “remember me” checkbox, but the functionality is present and on by default. The Accelerator storefronts support the “Soft Login” concept, which automatically logs a customer into the storefront based on a *rememberMe cookie (for example, “yb2bacceleratorstorefrontRememberMe” for the OOTB B2B). It uses Spring’s TokenBasedRememberMeServices implementation.

After the user is logged in, the system sends this *rememberMe cookie to the browser; the value of the cookie contains the username along with the hashed password (and some other information). When the session is over, the authentication mechanism uses the rememberMe cookie to re-authenticate the customer automatically. The rememberMe cookie is Base64 encoded, so the username can be extracted easily with a Base64 decoder.
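To illustrate why this cookie must be treated as sensitive, here is a small sketch (hypothetical class name; the cookie layout username:expiry:signature follows Spring’s token-based remember-me scheme):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// The rememberMe cookie is only Base64-encoded, not encrypted: anyone
// who can read the cookie can read the username inside it.
public class RememberMeCookie {
    public static String extractUsername(String cookieValue) {
        String decoded = new String(Base64.getDecoder().decode(cookieValue),
                                    StandardCharsets.UTF_8);
        return decoded.split(":")[0]; // first token is the username
    }
}
```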

The rememberMe cookie-based authentication is called “soft” to distinguish it from the hard authentication based on the password. A soft-authenticated customer needs to provide a password and log in fully to access account-related functionality or proceed through checkout.


Signing out removes the rememberMe cookie completely from the browser.


More details on j_spring_security_check implementation in hybris

The login and password are sent as a POST request.


The SAP Commerce Cloud Spring configuration tells us that “/j_spring_security_check” is mapped to springSecurityFilterChain.

There are dozens of standard storefront filters in the chain. I would like to highlight some of them which are important in our context:

  • CSRF token check filter (CsrfFilter)
  • Logout Filter
  • Authentication Processing filter (UsernamePasswordAuthenticationFilter which extends AbstractAuthenticationProcessingFilter)
  • RememberMe Authentication Filter
  • Anonymous Authentication Filter

Csrf Filter

The filter checks whether the URL is in the list of URLs where the CSRF check shouldn’t be applied. The properties csrf.allowed.url.patterns and csrfAllowedUrlPatternsList specify URL patterns that bypass CSRF token validation. These are the URLs ending with “…sop/response”, “…merchant_callback”, “…hop/response”, “…language”, and “…currency”. Since “/j_spring_security_check” is not in the list, the system compares the value from the HTTP session with the value from the form. If they differ, or the CSRF token is not present, the system stops the request processing with the corresponding exception.
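A rough sketch of the two decisions the filter makes (hypothetical class and an illustrative subset of the patterns; the real filter is Spring Security’s CsrfFilter, configured through csrf.allowed.url.patterns):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.List;

// Sketch of the CSRF filter's logic: exemption check, then token comparison.
public class CsrfCheckSketch {
    // Illustrative subset of the allowed URL patterns.
    private static final List<String> ALLOWED_PATTERNS =
            List.of(".*/language", ".*/currency");

    // 1. Is the URL exempt from CSRF validation?
    public static boolean isExempt(String url) {
        return ALLOWED_PATTERNS.stream().anyMatch(url::matches);
    }

    // 2. Otherwise, does the form token match the session token?
    // Constant-time comparison avoids leaking information through timing.
    public static boolean tokensMatch(String sessionToken, String formToken) {
        if (sessionToken == null || formToken == null) return false;
        return MessageDigest.isEqual(sessionToken.getBytes(StandardCharsets.UTF_8),
                                     formToken.getBytes(StandardCharsets.UTF_8));
    }
}
```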

Logout Filter

This filter logs the user out if the request URL matches a particular pattern (“/logout” by default).

Authentication Processing filter

This filter intercepts the processing of the authentication URL. By default, it is “/j_spring_security_check”. For the username and password mode, it retrieves the credentials from the form data and passes them to the authentication provider, which performs the following checks:

  • Admin authority? Reject, if yes.
  • Outside the allowed user types? Reject, if yes.
  • B2B: Outside the authorized groups for B2B? Reject, if yes.
  • B2C: Outside the customer group? Reject, if yes.
  • User is disabled? Reject, if yes.
  • Username doesn’t exist? Reject, if yes.
  • Credentials are not valid? Reject, if yes.

These checks are provided by B2B Accelerator Authentication Provider (b2b accelerator add-on) and Core Authentication Provider (platform core).

The Abstract Accelerator Authentication Provider increments the counter of failed attempts and disables the account after the configured number of attempts (protection against brute-force attacks).
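The failed-attempts counter can be sketched like this (hypothetical class; the real logic lives in the Abstract Accelerator Authentication Provider and persists the counter on the user model):

```java
import java.util.concurrent.ConcurrentHashMap;

// Sketch of brute-force protection: count consecutive failed logins
// per user and treat the account as disabled once the maximum is reached.
public class LoginAttemptCounter {
    private final ConcurrentHashMap<String, Integer> failures = new ConcurrentHashMap<>();
    private final int maxAttempts;

    public LoginAttemptCounter(int maxAttempts) {
        this.maxAttempts = maxAttempts;
    }

    public void recordFailure(String userId) {
        failures.merge(userId, 1, Integer::sum);
    }

    // A successful login resets the counter.
    public void recordSuccess(String userId) {
        failures.remove(userId);
    }

    public boolean isDisabled(String userId) {
        return failures.getOrDefault(userId, 0) >= maxAttempts;
    }
}
```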

RememberMe Authentication Filter

The concept of RememberMe was explained above. The filter is responsible for activating this feature in the solution.

This filter is part of the Spring Framework. SAP Commerce uses it as is but configures the filter with a custom RememberMe Services class (AcceleratorRememberMeServices).

When a registered user logs in to the storefront, a cookie <storefront>RememberMe is created by the Remember Me services to identify the user. When the user logs out, this cookie is cleared. However, if an authenticated user closes the browser without logging out and later requests a secure page in the storefront again, the RememberMe Authentication Filter injects the RememberMe authentication token into the session when the storefront loads.

Anonymous Authentication Filter

This is a Spring Framework filter. It creates an authentication token with a principal named “anonymousUser” and the single authority ROLE_ANONYMOUS. SAP Commerce uses it as is.


Will you be attending the SAPPHIRE or SAP CX Live events in Orlando next week?

As usual, I am highlighting some of the sessions that seem interesting to me and, hopefully, to you, the reader.

This year, there are no technical sessions; the focus is mainly on sharing experience and best practices from SAP and partners. At CX LIVE, SAP Litmos, CallidusCloud’s learning management solution, is going to be one of the major topics; dozens of sessions are dedicated to it.


“SAP Marketing Cloud and CDS” (presumably) —

CX LIVE: The Future of Customer Experience with AI and Machine Learning by

  • Charles Barton (Senior Product Manager, SAP CX) and
  • Rusty Belsinger (ex-Head of Sales of SeeWhy and currently – Account Executive, SAP CX):
  • “Artificial intelligence (AI) and machine learning are making the SAP Commerce Cloud solution smarter. Join this session to learn how we are embedding intelligence into SAP Commerce Cloud to support real-time personalization, conversational commerce, category suggestions, and search capabilities. Look at the overall vision for planned intelligent commerce scenarios. Understand how it will enhance the shopping experience of customers and boost productivity for business users.”
    – Wed 11:30 a.m. – 11:50 a.m. Hall D, OCCC West, Level 2, Theater 1.

“Upscale Commerce” —

CX LIVE: Looking into the Future of Commerce by

  • Charles Nicholls (SVP, SAP Upscale Commerce)
  • Jerrel Fielder (Product Development, SAP Upscale Commerce)
  • In this forward-looking presentation, we’ll examine how the buying experience is changing – spanning mobile, the Internet of Things, voice, and messaging, embedded in devices and stores – and what this means for merchants. We’ll also explore what future customer experiences look like and the roles of consent, mobile, AI, personalization, and distributed order management.
  • — Wed 12:00 p.m. – 12:20 p.m. Hall D, OCCC West, Level 2, SAP Customer Experience Stadium

“SAP Product Content Hub” —

CX LIVE: Create Great Product Experience with SAP Product Content Hub by

  • Christian Schoenauer (Senior Director Solution & Strategy, SAP CX)
  • Christian Muench (Principal Product Manager, SAP CX)
  • Want to create a great product content experience for your customers? Need control over product content regarding multiple consumer touchpoints? Tired of inconsistent and poor product data quality? Find answers with our new SAP Product Content Hub – a product experience management solution holding the golden record of your product content, enabling you to publish product content to multiple channels. Join us for an outlook into the future of product experience management and how we can help you.
  • — Tue 05:00 p.m. – 05:20 p.m. Hall D, OCCC West, Level 2, Theater 1


CX LIVE: Find Out What’s Cooking in SAP Customer Experience Labs by

  • Anja Wilbert (Lead of SAP CX Design)
  • “Discover the latest exploration of emerging technologies and how they apply to different aspects of the customer journey. Deep dive into innovation prototypes including conversational AI, workplace safety, digitalized identity, bot negotiation, mixed reality, decentralized master data enrichment, probabilistic programming, and much more.”
  • — Tue 12:00 p.m. – 12:20 p.m. Hall D, OCCC West, Level 2, Theater 3

Other sessions seem interesting to me, and I hope to you too:

CX LIVE: An E-Commerce Marketplace in Three Months! Mission Impossible, or Is There a Recipe for Success?

  • Geert Chielens (eCommerce Director Brussels Airport Company)
  • “Airports, like any other businesses, have changed radically over the past decade, attempting to provide relevant customer experiences. Companies must rethink their entire way of working, ensuring both acceleration of digital innovation and a guaranteed fast go-to-market. Brussels Airport had to build and deliver an e-commerce marketplace within three months to ensure relevance as a digital host to the present retail partners and to provide innovative services and products for its passengers.”
  • — Tue 03:00 p.m. – 03:20 p.m. Hall D, OCCC West, Level 2, Theater 1

CX LIVE: Discover Five Trends That Can Affect Your Customer Experience

  • David Jonker (VP, SAP Insights research center)
  • “Get ready for a world where customers precisely specify the terms of your relationship, and where customer service is provided by intelligent chatbots. Find out how customer engagement is being disrupted, and explore five trends of the future of customer experience.”
  • — Wed 10:45 a.m. – 11:05 a.m. Hall D, OCCC West, Level 2, Community Area: SAP Customer Experience Community Theater

CX LIVE: Look Ahead at the Road Map for SAP Commerce Cloud by

  • Riad Hijal (Global VP, Commerce Strategy and Solution Mgmt, SAP CX)
  • Lisa de Souza (Lead, Product Mgmt, SAP Commerce Cloud, SAP CX)
  • Receive an overview of the guiding principles and strategy for innovating the SAP Commerce Cloud solution. Gain insight into the short- and long-term road map, as well as key focus areas such as machine learning, customer experience, product content management, and integration.
  • — Wed 02:00 p.m. – 02:40 p.m. Hall D, OCCC West, Level 2, SAP Customer Experience Stadium

The next sessions are going to be held in the SAP SAPPHIRE areas:

SAPPHIRE: Demystify Cloud Migration for Your Commerce Solutions

  • Bradley Bushell (Head of Practice, SAP Customer Experience Services COE, SAP)
  • Discover the benefits of migrating to the latest cloud offering, whether you are currently running an on-premise or legacy cloud commerce solution. Learn about SAP Customer Experience Expert Services offerings, incorporating best-practices that are based on migrations to SAP Commerce Cloud solutions.
  • —  Tue 12:30 p.m. – 12:50 p.m. Customer Experience CU202

SAPPHIRE: Run Best-In-Class Cloud Commerce Deployments

  • Bradley Bushell (Head of Practice, SAP Customer Experience Services COE, SAP)
  • Keep complex cloud commerce implementations on track and realize your project goals. Gain valuable insights and access proven best-practice guidance from SAP experts.
  • — Wed 12:00 p.m. – 12:20 p.m. Customer Experience CU224

SAPPHIRE: Lessons Learned From Customer Experience Innovations across B2B & B2C

  • Mathi Natarajan (Director Consulting Services, CGI Canada)
  • Amit Kaul (Partner, SAP Head, CGI Canada)
  • “Join CGI Canada to learn how Datahub and SAP Gateway are leveraged to integrate Hybris and IS-Retail, SOLR performance tuning resulting in 2000% improvement, single Hybris instance supporting B2B /B2C landscapes, and Hybris powering customer experience across brands and channels.”
  • — Thu 02:00 p.m. – 02:40 p.m. S330D (South Concourse, Level 3)

SAPPHIRE: Get More from Your Cloud Commerce Software Using the Power of Microservices

  • Roman Frid (Principal Consultant, CX Cloud Strategy, SAP)
  • Develop new cloud commerce capabilities using a microservices-based approach. Learn how to extend functionality within SAP Commerce Cloud solutions. Tour a cloud-services portal used to provision, deploy, and manage SAP Commerce Cloud on Microsoft Azure.
  • — Tue 03:30 p.m. – 03:50 p.m. Customer Experience CU201

Non-product content management has for many years been the weakest link of the SAP Commerce platform. Formally, the suite had a WCMS module from the very first version, but it was obvious to everybody that the solution was terribly old-fashioned and outdated.

Already back in 2016, SAP Hybris Commerce was extended with a new solution, SmartEdit. On the one hand, SAP released it too early. Many commented that the product was too underbaked to replace the WCMS Cockpit; even two years later, Smartedit had drawn much criticism from users and developers. On the other hand, it was long overdue. We had been waiting for a replacement for years.

In search of the truth, in 2018 I examined Smartedit in detail and shared the findings here on Hybrismart. Not much water has flowed under the bridge since 2018, but the situation with Smartedit has taken a definite turn for the better. However, being a “non-mandatory” component for two years, it was generally ignored by the community. After all, why use Smartedit if there is the good old WCMS Cockpit, time-tested and proven, albeit with known issues and poor customizability? This is why only a small percentage of developers know Smartedit from the technical perspective. I hope this article will help them move forward.

It should be noted that starting from version 1811, we are out of options. The cockpit framework, and with it the WCMS Cockpit and BTG personalization, was completely removed from the platform. So, Smartedit is the only option for content management and personalization.

Content Management Systems

To better understand the concepts of Smartedit and its place in the market, let’s look back at how content management systems have evolved over time.

The first CMSes were developed to support only websites; mobile, desktop, and kiosk versions were considered separate channels. As a result, many of the solutions were designed as monolithic applications in which the user interface and data access code are combined into a single program on a single platform. When it comes to innovation, most CMS solutions are constrained by their legacy architecture. This was one of the reasons why the CMS Cockpit had not evolved for years at SAP. With every year of dealing with the deprecated software, the developer experience remained a key pain point, boiling down to lack of vendor support, poor documentation, and problematic troubleshooting and debugging.

Traditional monolithic CMSes provide both content management and content rendering. More specifically, they have been developed with the frontend and backend designed into a single platform. In Hybris, we had Accelerator templates built in JSP and page controllers built in Java, and all of this had to be aligned with the data model and architecture of the CMS. The WCMS was designed mainly for developers, which limited the use of the system by non-technical users. Many concepts require an understanding of the underlying data model.

A headless CMS has no default front-end system to determine how the content is presented to the end user. It provides all the capabilities of the backend of a traditional CMS (i.e., the “body”) while giving the responsibility of content presentation to the delivery channels (i.e., the “head”). It acts as a content repository. The data can be requested or pulled from the CMS by any channel, such as a mobile or kiosk app, by way of a RESTful API. Each individual channel takes advantage of its own unique presentation capabilities. Some would also call such content management systems “API-driven CMS”.

The problem with most headless CMSes is that you get one more layer, the presentation layer, which needs to be managed somehow. The ability to have a page-oriented approach with layouts or grids is important, but it is generally not provided by headless CMS solutions.


Smartedit was designed to bring the best of both worlds. Its architecture combines the flexibility and adaptability of the API-first approach with the user-friendliness of a traditional CMS. Thanks to the stable set of API interfaces, you can create more than one storefront and even more than one content management client. For example, on top of the CMS API, you can develop a module that regularly checks the content and performs corrective actions if it doesn’t meet requirements.

Smartedit is built as a single-page app interacting with the CMS API. Like the storefront, Smartedit is therefore headless too, so that, theoretically, you can have an alternative client app for CMS operations. This makes Smartedit clear and understandable in terms of architecture, customizability, and data flows.

However, Smartedit introduces a bunch of new technologies and tools and imposes stricter requirements on the project team. It is very unlikely you will use the old and obsolete AngularJS 1.6 for the storefront too.

The reason for that is clear. Being a single-page app, the essential part of Smartedit belongs to frontend development. The frontend world is crazy: it has changed fundamentally in the last 4-5 years. Multiple tools and libraries rose up to meet the challenge, and the best ones have slowly floated to the top. Years ago, AngularJS was a promising toolset.

As we can see from the code, roadmap, and Smartedit releases and updates, the product has been making great headway in refactoring and redesign. The extendable components are now in TypeScript, and the interfaces have been evolving to meet market standards.

Key concepts of Smartedit

Smartedit maintains content in a “page-oriented” manner. Content is generally tied to the components it is used in.

The page editor is integrated into the website UI. It makes Smartedit look like a transparent layer on top of the storefront.

(Screenshots: the storefront without Smartedit and with Smartedit)

This layer can be presented in different functional modes, from the basic one to advanced. Each mode provides a set of available content operations and can be restricted to a content management role. There are five modes out of the box: Preview, Basic Edit, Advanced Edit, Versioning and Personalization.

The content delivery layer is completely decoupled from the content management. Smartedit is responsible for the content management part only. Content delivery and rendering are based on the good old Spring MVC.

For the pages, components and templates, the data model is generally the same we had with WCMS. 

Smartedit is designed to work with any storefront supporting the well-documented Smartedit storefront integration contract. Currently, only two storefronts support the contract: the built-in JSP-based SAP Commerce Accelerator and the brand-new JavaScript storefront, Spartacus.

The modular structure helps to extend almost everything. The extensibility capabilities are demonstrated with a personalization module, which can also be used as a reference implementation for your own custom modules.

Key components

There are three functional components: Navigation Management, Page Management, and Personalization. In the navigation, there is a fourth component, Versioning, but functionally it is part of Page Management.

As the name of a CMS feature, “Versioning” was really confusing to me. It is not Undo/Redo functionality, as I would have thought. Functionally, Smartedit’s versioning is closer to Backup/Restore: you can save the current version before making changes, then revert to the saved version to discard the changes and restore the saved state.

Currently, versioning in SmartEdit is only by page. Content slot and component versioning are not available yet but promised to be added in the future.

Navigation management is based on concepts similar to those found in the good old CMS Cockpit. You can add, move, edit, and delete navigation nodes and change their order using drag and drop. You can associate pages, media, and components with a navigation node.

Smartedit’s UI is faster, sleeker, and generally more convenient than what we had in the WCMS Cockpit. However, a few hiccups still trip up newbies. For example, you can’t publish changes in navigation directly from the page where these changes were made; you need to synchronize a navigation component which sits in a completely different corner. The navigation data model is also difficult for people who are not familiar with handling complex data structures. How do you explain the different types of navigation entries to a non-specialist? However, such cases are few.

Smartedit’s personalization module replaced the old component with a similar purpose, Behaviour Target Groups (BTG), also known as Advanced Personalization. The legacy solution triggered actions or showed CMS content to different groups of customers based on their behavior, interests, and historical data. You could use customer-specific promotions, suggest products, or adapt the content and products of the website to customer interests. It looked great in demonstrations, but due to its significant impact on performance, this module was generally avoided. The new personalization module is promising.

Architecture of SmartEdit Personalization

Personalization module is based on two pillars:

  • User/segment assignments. A segment groups users with similar behavior or attributes.
    • For example, all users who come from an e-mail ad can be grouped into the “Email ad” segment.
    • For example, all users aged from 40 to 50 can be grouped into the “40-50 y.o.” segment.
  • Page or component actions, variations, and triggers that activate a group of variations.
    • For example, for users assigned to the combination of the segments “Email ad” and “40-50 y.o.”, the banner on the main page is different.
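Put together, the trigger logic above can be sketched in a few lines of JavaScript. This is an illustrative model only; the names and data shapes are my assumptions, not SAP Commerce APIs:

```javascript
// Illustrative sketch: matching personalization variations to a
// customer's segment assignments (names/shapes are assumptions).
function activeVariations(customerSegments, variations) {
  const owned = new Set(customerSegments);
  // A variation fires when the customer belongs to ALL segments in its trigger.
  return variations.filter(v => v.triggerSegments.every(s => owned.has(s)));
}

const variations = [
  { name: 'alt-main-banner', triggerSegments: ['Email ad', '40-50 y.o.'] },
  { name: 'default-banner',  triggerSegments: [] }
];

// A customer in both segments also matches the alternative banner variation.
const matched = activeVariations(['Email ad', '40-50 y.o.'], variations);
```

A customer who belongs only to the “Email ad” segment would match just the default banner, because the alternative one requires the combination of both segments.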

It is important to note that SAP Commerce can’t assign customers to segments automatically based on their behavior or attributes. This is a deliberate separation of responsibilities: segmentation is not part of the e-commerce platform’s purpose. You need to integrate SAP Commerce with a segment data provider, such as SAP Marketing or Context-Driven Services. There is an out-of-the-box integration package from SAP to make the process smooth and fast.

Also, you can’t edit or remove segments or user-to-segment assignments in Smartedit. This data is considered external to the e-commerce platform.

Smartedit allows you to create variations and connect them to triggers. A variation is a set of changes; each change, an action, can be applied to the CMS, promotion, or search configuration. For example, you can replace a CMS component with another version of it and save that as a variation. A trigger enables a variation under specific conditions; for example, a segment trigger fires when the customer belongs to a particular segment.

The legacy BTG personalization engine was unpopular mainly because of the performance impact it created. That module was capable of assigning users to segments dynamically and activating the changes based on a combination of rules in near real time. Both operations are expensive in terms of resources. In the new implementation, user segmentation is no longer the e-commerce platform’s responsibility, and the rules are checked only when particular events (such as user login) occur. After a variation is calculated for the customer session, the system caches the findings in the session and uses them for further requests.
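The caching idea can be illustrated with a small sketch. This is a simplified model, not the DefaultCxRecalculationService code: the expensive calculation runs once on a login event, and every later request in the session reuses the cached result.

```javascript
// Simplified illustration of session-scoped caching of personalization results.
function makeSessionPersonalization(calculate) {
  const sessionCache = new Map(); // sessionId -> cached variation list
  return {
    onLogin(sessionId, user) {
      sessionCache.set(sessionId, calculate(user)); // expensive, runs once
    },
    variationsFor(sessionId) {
      return sessionCache.get(sessionId) || []; // cheap on every request
    }
  };
}

let calculations = 0;
const personalization = makeSessionPersonalization(user => {
  calculations += 1; // count how often the expensive path is taken
  return ['variation-for-' + user];
});

personalization.onLogin('session-1', 'john');
personalization.variationsFor('session-1');
personalization.variationsFor('session-1');
// the calculation ran only once, on the login event
```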

The following actions are used in the Smartedit configuration (DefaultCxRecalculationService):

  • RECALCULATE internally calculates the variations and caches the results in the session for further calls,
    • code: calculateAndLoadInSession(user)
  • UPDATE pulls the user/segment assignment updates from the external system,
    • code: updateSegments(user, configData.getUpdateProviders())
  • ASYNC_RECALCULATE initiates the asynchronous variation recalculation process,
    • code: asyncRecalculate(user, configData.getAsyncProcessProviders())
  • LOAD loads the personalization results from the database and saves them, serialized, as a session variable.
    • code: loadResult(user)

These actions can be specified for a fixed (OOTB) set of events, such as user login or consent given.

The key difference between SmartEdit personalization and legacy BTG personalization is the way the calculation is performed. BTG calculated in real time, while Smartedit calculates asynchronously: the results may arrive seconds after the customer has logged in. The asynchronous calculation makes this approach fast and manageable in terms of performance. There are tiny delays, but they don’t affect the customer experience because they occur in the background.

When the events are expected to be too frequent, there is a way to perform the recalculation on a schedule: CxDefaultPersonalizationCalculationJob executes CxService.calculateAndStoreDefaultPersonalization().

Customizing Smartedit

There are two main techniques (used together) for customizing the business logic and interfaces:

  • Extending APIs
    • Configuring existing
    • Rewriting existing
    • Creating new APIs
  • Extending frontend business logic and presentation
    • Menu items, popups, buttons…

The following APIs are considered Smartedit-related. Some of them are purely frontend mechanisms, such as Angular services; some are RESTful web services, or a combination of the two.

Permission Service and Permission API. It is used to determine what functional elements are available to a user and what the user can view or edit. The permissionswebservices extension exposes the RESTful API for global, type, attribute, and catalog permission checks. Typical requests for global and catalog permissions:

  • Global permissions:
    https://localhost:9002/permissionswebservices/v1/permissions/principals/admin/global?permissionNames=smartedit.configurationcenter.read
  • Catalog permissions:
    https://localhost:9002/permissionswebservices/v1/permissions/principals/admin/catalogs?catalogId=electronicsContentCatalog&catalogVersion=Staged
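For illustration, the two request URLs above can be assembled with a small helper. The helpers themselves are hypothetical; only the endpoint paths and parameters are taken from the examples:

```javascript
// Hypothetical helpers that build the permission-check URLs shown above.
function globalPermissionUrl(host, principal, permissionName) {
  return host + '/permissionswebservices/v1/permissions/principals/' +
    principal + '/global?permissionNames=' + encodeURIComponent(permissionName);
}

function catalogPermissionUrl(host, principal, catalogId, catalogVersion) {
  return host + '/permissionswebservices/v1/permissions/principals/' +
    principal + '/catalogs?catalogId=' + catalogId +
    '&catalogVersion=' + catalogVersion;
}

const host = 'https://localhost:9002';
const globalUrl = globalPermissionUrl(host, 'admin', 'smartedit.configurationcenter.read');
const catalogUrl = catalogPermissionUrl(host, 'admin', 'electronicsContentCatalog', 'Staged');
```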


Example code:

/* Copyright (c) 2017 SAP SE or an SAP affiliate company. All rights reserved. */
angular.module('personalizationsmarteditRulesAndPermissionsRegistrationModule', [])
    .run(function($q, permissionService, personalizationsmarteditRestService, personalizationsmarteditContextService) {

        var getCustomizationFilter = function() {
            return {
                currentPage: 0,
                currentSize: 1
            };
        };

        // Rules
        permissionService.registerRule({
            names: ['se.access.personalization'],
            verify: function() {
                return personalizationsmarteditContextService.refreshExperienceData().then(function() {
                    return personalizationsmarteditRestService.getCustomizations(getCustomizationFilter()).then(function() {
                        return $q.when(true);
                    }, function(errorResp) {
                        if (errorResp.status === 403) {
                            // Forbidden status on GET /customizations - user doesn't have permission to personalization perspective
                            return $q.when(false);
                        } else {
                            // other errors will be handled with personalization perspective turned on
                            return $q.when(true);
                        }
                    });
                });
            }
        });

        // Permissions
        permissionService.registerPermission({
            aliases: ['se.personalization.open'],
            rules: ['se.read.page', 'se.access.personalization']
        });
    });
  • SmartEdit side:
    • Define a rule “se.access.personalization” => REST call to /customizations. If 403, the rule result is NO ACCESS
    • Define a rule “se.read.page” => catalogVersionPermissionService.hasReadPermissionOnCurrent()
    • Define a permission
      • “se.personalization.open” = “se.read.page” and “se.access.personalization”
    • perspectiveService creates a personalization perspective with a constraint: se.personalization.open
    • The Combined View toolbar item is created with a constraint:
      • se.read.page
  • SAP Commerce side:
    • /customizations access is driven by Spring Security (user roles) and OAuth scopes

Catalog API. Returns the catalogs and catalog versions you have access to, namely: catalog name, catalog version name, thumbnail of the homepage, and page display condition details.

CMS SmartEdit Structure API. Returns metadata about a specific CMS component type, used to determine which of the component’s attributes are editable so that editable and non-editable attributes can be handled properly. The response also contains the field types, which are used for rendering the UI controls.

It provides information about

  • CMS component types (code, name, exposed attributes)
  • CMS component type attributes (qualifier, type, localization info)

Configuration API. CRUD operations with the configuration attributes.

Drag-and-drop service. Angular API for implementing drag-and-drop services.

Translation API. Provides the functionality to retrieve a SAP Commerce resource bundle for specific locales.

Gateway Factory. A gateway system between the iframe (where the storefront is injected) and the SmartEdit container (which contains the iframe). It establishes communication between the frames (outer/inner) using a pub/sub pattern.
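The pub/sub idea can be sketched as follows. This is a toy model for illustration; the real Gateway Factory bridges the two frames via postMessage:

```javascript
// Toy pub/sub gateway: publishers and subscribers exchange messages by event id.
function createGateway() {
  const listeners = new Map(); // eventId -> array of listener callbacks
  return {
    subscribe(eventId, listener) {
      if (!listeners.has(eventId)) listeners.set(eventId, []);
      listeners.get(eventId).push(listener);
    },
    publish(eventId, data) {
      // deliver to every subscriber and collect their replies
      return (listeners.get(eventId) || []).map(listener => listener(data));
    }
  };
}

const gateway = createGateway();
// e.g. the container reacts when the storefront iframe reports a page change
gateway.subscribe('PAGE_CHANGE', pageId => 'container saw ' + pageId);
const replies = gateway.publish('PAGE_CHANGE', 'homepage');
```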

HTTP Interceptor Service. This service is used in the preprocessing and the postprocessing of $http requests and responses in AngularJS. There are four predefined interceptors:

  • httpAuthInterceptor – adds an authentication token to all REST requests
  • httpErrorInterceptor – handles all 401 (unauthorized access) or OCC validation errors
  • i18nInterceptor – appends a locale to the request URL and postprocesses the …

Today I would like to look into the following case: a customer who has items in their account-linked shopping cart browses the site anonymously, as a guest, creates a new shopping cart, and then logs in. The question is what should happen to the items in the account-linked cart versus the items in the anonymous cart.

For merchants, the reason for merging carts is that they want customers not to forget the products they intended to buy earlier. From the customer’s perspective, having extra items in the cart after logging in may be confusing and undesirable, especially during the checkout process and especially if the shopping carts are large and complex.

There are five possible solutions:

  • Non-interactive:
    • “Session cart priority”: if a customer has an anonymous cart, the account-linked shopping cart will be ignored after the login. If the anonymous cart is empty, the shopping cart contains the items from the account-linked cart.
    • “Cart Archive”: replacing the old cart with the latest cart, moving the contents of the old cart to the wish list or archived carts list, and informing the user about the actions done.
    • “Merge”: combining the shopping carts together so the user still has a single shopping cart; the merged cart replaces the old account-linked cart, and the user is informed about the changes.
    • “Multiple carts”: merging carts by default, but letting the customer undo the operation and restore one of the two saved carts, “anonymous/guest” or “old account-linked”.
  • Interactive:
    • “Interactive”: explaining the situation to the customer and letting them decide what to do, namely which cart to keep or whether to merge them:
      • You have items in your present cart, but you also have items stored in a previous cart. Would you like to:
        • continue with your present items, or
        • merge your saved items into your present purchase?

I find “Merge” and “Multiple carts” to be the best options. The second strategy is explained at the end of this article as an “experimental solution”. It is not enough to explain how additional items appeared in the cart; you should also let the customer decide which of the shopping carts they want to keep or merge.

In SAP Commerce, the default strategy is “Restoring an account-linked cart only if the anonymous cart is empty”, titled “Session cart priority” in the list above. The idea is simple: SAP Commerce ignores an account-linked cart if a session cart is present.

The alternative out-of-the-box strategy merges the account-linked and anonymous carts so that the resulting account-linked cart contains all their unique items combined. However, it is not designed to be interactive: you can’t pause the operation to let the customer decide.

Online stores such as Amazon, eBay, Target, Etsy, Walmart, AliExpress, and Barnes and Noble use this approach too. More specifically, they use “a silent cart merge”: the session cart is combined with the account cart after the customer logs in, without any messages or popups.

Merging carts raises new questions and challenges.

For example, if you have the same SKU in both carts, how many products should be in the final cart after merging? Common sense tells us that the right answer is one line item with a quantity of two. That is exactly how I ordered two iPhone 6 cases instead of the one I needed a month ago.
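The quantity rule can be sketched like this (an illustrative model, not the SAP Commerce implementation): entries with the same SKU collapse into one line item with the quantities summed.

```javascript
// Illustrative cart merge: same SKU -> one line item, quantities summed.
function mergeCarts(accountCart, sessionCart) {
  const merged = new Map(); // sku -> merged entry
  for (const entry of [...accountCart, ...sessionCart]) {
    const existing = merged.get(entry.sku);
    if (existing) {
      existing.qty += entry.qty; // duplicate SKU: add quantities together
    } else {
      merged.set(entry.sku, { ...entry });
    }
  }
  return [...merged.values()];
}

const accountCart = [{ sku: 'iphone6-case', qty: 1 }];
const sessionCart = [{ sku: 'iphone6-case', qty: 1 }, { sku: 'charger', qty: 1 }];
const mergedCart = mergeCarts(accountCart, sessionCart);
// mergedCart: one 'iphone6-case' line with qty 2, plus the 'charger' line
```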

If one or more products are not available at the time of cart recalculation, SAP Commerce simply removes them from the shopping cart and informs the user. This simple strategy is used by default. Is it correct? While it works well for most cases, for some businesses and setups, such as online grocery stores or B2B portals, you may find it inappropriate. For such cases, consider an alternative approach in which the products are marked as temporarily unavailable and the system offers substitute products instead of the unavailable ones. Marking products as unavailable instead of removing them from the shopping cart has another advantage: the unavailable products can be back in stock the next time the customer visits the website.

Also, you need to check the case when incompatible (mutually exclusive) products exist in the basket after the merge. The products should remain in the basket, but your shop should notify the customer and prevent the order from being completed until the incompatible product has been removed.

If anonymous customers are able to change the state of their shopping cart by adding a coupon or specifying delivery details, the merge process may become challenging if the session and account-linked cart configurations conflict. For example, if the anonymous cart already has a coupon code, the coupon code of the other cart should be discarded.

Yet another question is often raised: should we restore the anonymous cart after the customer signs out? Of course, the cart must be emptied at the end of the session. The rationale is the following:

  • leaving the account-linked cart in place doesn’t seem right because of security and privacy considerations, and
  • restoring the anonymous cart is not correct either, because one of the products from the original cart, or the whole cart, may already have been purchased by the end of the customer session.

SAP Commerce Out-of-the-box Merging Carts Strategy: Architecture and implementation details

To use mergingCartRestorationStrategy:

  • create a custom addon extension (e.g. your-customized-addon)
  • add the following code to the addon’s web-spring.xml:
    <alias name="mergingCartRestorationStrategy" alias="cartRestorationStrategy"/>
  • install the addon via:
    ant addoninstall -Daddonnames="your-customized-addon" -DaddonStorefront.yacceleratorstorefront="youracceleratorstorefront"

Let’s look closer at how it is implemented under the hood.

A customer logs in to the system. The Spring Security config defines the bean responsible for the successful login scenario, loginGuidAuthenticationSuccessHandler. This bean is linked to GUIDAuthenticationSuccessHandler.onAuthenticationSuccess(request, response, auth). In this class, the system sets up cookies and passes the baton to StorefrontAuthenticationSuccessHandler.onAuthenticationSuccess(request, response, auth). If the user is not from the admin group, the system attempts to restore a cart by calling CartRestorationStrategy.restoreCart(request).

So, to activate the merging cart strategy, you need to change the cartRestorationStrategy parameter of the defaultLoginAuthenticationSuccessHandler bean in <yourstorefront>/web/webroot/WEB-INF/config/spring-security-config.xml.

Scenario 1. DefaultCartRestorationStrategy (“Restoring an account-linked cart only if the anonymous cart is empty”).

The restoreCart method of this strategy checks whether the current shopping cart is empty. If it is, the latest account-linked cart is restored. Otherwise, nothing changes: the session cart remains as is.
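The decision can be sketched like this. This is a simplified JavaScript model of the logic described above, not the actual Java implementation:

```javascript
// Simplified model of the "session cart priority" restoration decision.
function restoreCart(sessionCart, accountCarts) {
  if (sessionCart.length > 0) {
    return sessionCart; // non-empty session cart wins; account carts are ignored
  }
  // accountCarts assumed sorted newest-first; restore the latest one
  return accountCarts.length > 0 ? accountCarts[0] : [];
}

const accountCarts = [[{ sku: 'camera', qty: 1 }]];
const restoredEmpty = restoreCart([], accountCarts);                        // account cart restored
const restoredBusy  = restoreCart([{ sku: 'case', qty: 1 }], accountCarts); // session cart kept
```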

Scenario 2. MergingCartRestorationStrategy (“Merging the account-linked and session carts”).

The case when both a session cart and account-linked cart exist is processed by restoreCart in the mergingCartRestorationStrategy.

The system finds the most recent cart (let’s name it “A”) for the current user and the current website and merges it with the session cart (let’s name it “B”). The process has the following steps:

  • Recreates a new session cart (let’s name it “B*”) from the current session cart (B):
    • Removes all payment transactions linked to cart B
    • Regenerates the cart GUID for B*
    • Recalculates cart B* (entries, totals, discounts, taxes…) via commerceCartCalculationStrategy
      • If any entries are not available, they won’t be added
    • Updates promotions on B*
    • Executes cart calculation hooks (“before” and “after”) on B*
  • Recalculates cart B* again (entries, totals, discounts, taxes…) via commerceCartCalculationStrategy, this time for the new cart object, which is a duplicate of the original session cart
  • Updates promotions on B* again, for the same new cart object
  • Calculates external taxes if needed
  • Merges the recreated session cart (B*) with the account-linked cart (A, the most recent one if there is more than one):
    • Special processing for entry groups (if any)
    • Moves all cart entries from the account-linked cart (A) to the current session cart (B*)
    • Removes the old session cart (B)
  • The merged cart is a union of A and B*: C = A + B*
  • Recalculates the resulting cart (A + B*)
  • Calculates the (A + B*) cart [again] via commerceCartCalculationStrategy
  • Updates promotions on A + B* again

In terms of performance, merging carts can be slow and resource-intensive, because your shopping cart is recalculated against different sets of products several times within a single session.

As you can see from the process above, the shopping cart is recalculated three times. Moreover, there are another three recalculations outside this process, which makes six per session. In all six calls, cart recalculation is followed by applying promotions, which is also slow and CPU- and memory-intensive, especially for large carts and a big number of promotions. I recommend including this scenario in your performance test plan and implementing changes if necessary.

How to reproduce:

  • Prerequisites:
    • Account-linked cart – 6 items
    • Anonymous cart – 1 item
  • Scenario:
    • A customer signs in having 6 items in the account-linked cart and 1 item in the session cart.
  • In total, after the customer logs in, the following methods are called six times (!):
    • calculateEntries, calculateTotals, updatePromotions – 6 times
      • First – From onAuthenticationSuccess →  loginSuccess →  updateSessionCurrency → setSessionCurrency -> recalculateCart → calculateEntries
      • Second –  From onAuthenticationSuccess → loginSuccess -> recalculateCart → calculateEntries
      • Third – From onAuthenticationSuccess → restoreCart (mergingCartRestorationStrategy) → restoreCartAndMerge → restoreCart (service) → restoreCart (DefaultCommerceCartRestorationStrategy) → recalculateCart  → calculateEntries
      • Fourth – From onAuthenticationSuccess → restoreCart (mergingCartRestorationStrategy) → restoreCartAndMerge → restoreCart (service) → restoreCart (DefaultCommerceCartRestorationStrategy) → calculateCart  → calculateEntries
      • Fifth – From onAuthenticationSuccess → restoreCart (mergingCartRestorationStrategy) → restoreCartAndMerge → restoreCart (service) → restoreCart (DefaultCommerceCartRestorationStrategy) → recalculateCart  → recalculate → calculateEntries
      • Sixth – From onAuthenticationSuccess → restoreCart (mergingCartRestorationStrategy) → restoreCartAndMerge → restoreCart (service) → restoreCart (DefaultCommerceCartRestorationStrategy) → recalculateCart  → calculate → calculateEntries

Alternative solution: Undo/Multiple carts

I consider this solution experimental. I haven’t seen any live shops using this approach. If you know of any, please let me know.

In this solution, the carts are automatically merged too, but the difference is in what the customer can do afterwards. After the carts are merged, the customer is redirected to the shopping cart page, where they can undo the operation and restore one of the two saved carts, “anonymous/guest” or “old account-linked”, or leave things as they are.

If the customer decides to merge the carts, clear out the anonymous cart and move the items to the account-linked cart. If the customer chooses to keep the carts separate, let them continue shopping with the account-linked cart. Once the customer signs out (or is automatically signed out), revert to the anonymous cart.

If implemented right, this could be a really slick feature unique to SAP Commerce.

1. Introduction

The purpose of this work is to describe and demonstrate how Vue Storefront can be integrated with SAP Commerce as well as identify the weaknesses, pitfalls and bottlenecks of the solution.

1.1. What is Vue Storefront?

Vue Storefront is an open-source, mobile-first Progressive Web App storefront, one of two available on the market. Initially developed for Magento, it was later integrated with a number of backend platforms, such as Pimcore, PrestaShop, BigCommerce, WooCommerce, and Shopware.

1.2. What makes it unique?
  • Platform-agnostic. Vue Storefront has a layer which is in charge of translating the platform-specific API calls and formats into the Vue Storefront general data abstraction.
  • Mobile first. The product is consistent with the principles of progressive enhancement: designing for the smallest screen and a poor connection, and working the way up. People spend more and more time on the internet from mobile devices, and mobile devices are becoming the main screen in their lives.
  • Offline first. With some exceptions, many components of Vue Storefront remain usable even if the customer is temporarily offline. That is achieved by the “offline first” or “cache first” pattern: if a resource is cached and available offline, it is returned first, before trying to download it from the server. If it isn’t in the cache already, it is downloaded and cached for future use.
  • Server-Side Rendering. The SSR is used for handling the render when a user or search engine crawler requests the first page. When the server receives the request, it renders the required component into the HTML, and then sends it as a response to the client. Pre-fetching of some frequently used pages, such as Homepage, Category page, Product page, and Checkout page helps to make it even more universal. Read more: https://medium.freecodecamp.org/what-exactly-is-client-side-rendering-and-hows-it-different-from-server-side-rendering-bd5c786b340d 
  • Modern technology stack. Being built with Node.js, the storefront allows developers to write JavaScript on both the server and the client side, making JavaScript a natural part of your e-commerce development process. Besides that, this architecture makes it easy to send and synchronize data between client and server components automatically. The system is built to be customizable via extensions and themes, and easy to update, with a separated core, when new versions are released.
  • Frequent updates, great support, and a clear roadmap. The storefront was introduced last summer. The developers keep us on our toes with new releases almost every week. The official slack channel is also very responsive. There is also a planning board available publicly on the GitHub.
  • Open source and MIT License. The license allows programmers to use the code in proprietary software (on the condition that the license text is shipped with that software). The MIT license is GPL-compatible, meaning that the GPL permits programmers to combine and redistribute MIT-licensed code with GPL-licensed software. If you want to use it in commercial software, all you need to do is include a copyright notice for the work used. It doesn’t mean the entire commercial work is then licensed under the MIT license.
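The “cache first” pattern mentioned above can be sketched in a few lines. This is a simplified synchronous model; Vue Storefront’s real implementation uses Service Workers and the Cache API asynchronously:

```javascript
// Simplified cache-first lookup: serve from cache when possible,
// otherwise fall back to the "network" and remember the result.
function makeCacheFirst(fetchResource) {
  const cache = new Map(); // url -> cached resource
  return function get(url) {
    if (cache.has(url)) {
      return cache.get(url); // offline-friendly: no network round trip
    }
    const resource = fetchResource(url); // network fallback
    cache.set(url, resource);
    return resource;
  };
}

let networkCalls = 0;
const get = makeCacheFirst(url => {
  networkCalls += 1;
  return 'body of ' + url;
});

get('/product/42'); // first call goes to the network
get('/product/42'); // second call is served from the cache
```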

There are many interesting features of the smaller scale, such as

  • Lazy loading of images. The idea behind lazy-loading images is that you wait until the user scrolls further down the page and an image comes into view before making the network request for it. If your web page contains multiple images, which is the case for product lists, you’ll save bandwidth as well as ensure that the page loads quicker.
  • Native app features, such as “Install on Home Screen” supported by iOS, Android, and Chrome. Push notifications are not supported in full, but there is some foundation. Read more: https://developers.google.com/web/fundamentals/app-install-banners/ 
  • Supports both browser’s offline storages (IndexedDB, WebSQL, or localStorage) and ServiceWorker native caching (Cache API). Each caching approach has its pros and cons, and you can map the services to the caching approach in the configuration.
1.3. Challenges

1.3.1 Is it production ready?

Of course, there are risks we need to take into account:

  • The product is still new and immature, but promising. Because of the relatively large codebase, it cannot be learned in a day; the product requires effort before you are able to master it. You may need to hire a trained person to make things easier, but this will incur additional costs. The technologies used are not common for a typical SAP Commerce project.
  • Vue Storefront’s architecture is still changing. Some components might not be backward compatible because of inevitable architectural changes.

As an example, in the last version the developers got rid of IndexedDB for the offline cache because of network-related issues and problems with handling network errors.

However, we need to recognize that in its first six months of existence, Vue Storefront was integrated with many backend platforms and became a foundation for at least a dozen commercial projects.

1.3.2 SAP Commerce Cloud

Despite its great integration layer and tooling, SAP Commerce Cloud is not on the list of platforms Vue Storefront has connectors to. As we found in our research, the SAP Commerce Cloud integration interfaces and data model are not fully compatible with the Vue Storefront API calls and data model, and require customization, at least on the SAP Commerce side.

Also, you need to understand that many SAP Commerce OOTB backoffice capabilities won’t work with Vue Storefront in full without touching the Vue Storefront codebase.

Some examples of the pieces of functionality of SAP Commerce which won’t work with Vue Storefront without customization:

  • AdaptiveSearch and Search configuration won’t work in full because of the different implementations of the facet and search UIs.
  • SmartEdit and WCMS won’t work because they are built around a page builder; data-driven, component-based dynamic page composition is not supported by Vue Storefront.
  • Assisted Service Module, Promotions, and so on.

It’s also worth noting that the listed drawbacks are not specific to Vue Storefront; they are common for any setup where a custom storefront is involved. A big advantage of the Spartacus storefront is that this support is on its roadmap.

Other Vue-Storefront-specific challenges we faced are listed below in the Challenges section.

2. Objective

The objective of the work is to

  • Understand the maturity and readiness of the product, limitations, and capabilities.
  • Design and implement a proof-of-concept of Vue Storefront/SAP Commerce integration, document the architecture decisions and rationale behind those decisions, challenges, and pitfalls.
  • Create a foundation for the full-fledged solution.

3. Authors

I would like to thank Marshall Levin for his useful advice and for his effort in reading this long report.

4. Approach

4.1. Vendor-recommended Solution for Integration

As the official documentation says, Vue Storefront is platform-agnostic, which is explained as the ability to connect to any CMS/e-commerce platform. In fact, it can easily be integrated only with products that share a set of concepts with Vue Storefront. For the others, “you need an adapter”.

The out-of-the-box integration mechanism involves the use of ElasticSearch as a backend for all catalog operations and direct calls to the e-commerce platform for all cart/user/order related operations (See https://github.com/DivanteLtd/vue-storefront-api) . Both sets of interfaces are standardized and documented.  

Figure 1. Vendor recommended integration approach. 

The out-of-the-box API connector works in two phases:

  • Data pump. This component pulls static data, such as the catalog or orders, from the e-commerce platform into Vue Storefront’s ElasticSearch, converting it to the format consumed by the integration layer, vue-storefront-api. After pulling the data you can display a product catalog in Vue Storefront. After the initial pump, ElasticSearch stays in sync with changes on the backend platform side.
  • Task queue / Worker pool APIs. These handle so-called dynamic calls (user sessions, cart rules, etc.) that can’t be stored in the database and need to be made by vue-storefront-api directly against the backend platform.
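The data-pump conversion step can be illustrated with a tiny transformer. The field names below are assumptions for illustration, not the actual vue-storefront-api schema:

```javascript
// Illustrative data-pump step: reshape a platform product record into
// the document form to be indexed in ElasticSearch.
function toElasticDocument(platformProduct) {
  return {
    id: platformProduct.code,                 // platform identifier
    name: platformProduct.name,
    price: platformProduct.price.value,       // flatten the nested price
    category_ids: platformProduct.categories.map(c => c.code)
  };
}

const doc = toElasticDocument({
  code: '300938',
  name: 'Photosmart E317 Digital Camera',
  price: { value: 114.12, currency: 'USD' },
  categories: [{ code: 'cameras' }]
});
```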

Once these components are adapted for a particular platform, Vue Storefront will work with them as is, without any customizations in its code.

Some backend platforms already have their integrations (Magento 2, Magento 1, CoreShop, BigCommerce, WooCommerce), but you can easily make your own with the integration boilerplate.

This is how the recommended approach looks.

4.2. Alternative Solutions

Along with the recommended approach, we have developed a couple of project-specific solutions to consider.

  • Vendor-recommended:
  • Alternatives:
    • Using vanilla OOTB SAP Commerce API (OCC v.2) directly from the storefront or from an extension for the Vue Storefront API.
    • Extending SAP Commerce only. Zero customizations in the API layer, Vue Storefront, and the boilerplate.

4.3. Analysis of the Possible Solutions

Table 1. Integration approaches to consider and resolutions

After some analysis, we came to believe that using custom middleware only for a handful of simple requests adds an extra burden and a thick layer of complexity, so we decided to get rid of it as well.

So, option #4 turned out to be the best choice. For option #4, we need to parse the Magento and ElasticSearch APIs and generate compatible responses.

Figure 2. Chosen integration approach (Option #4)

The Vue Storefront connector (“SAP Commerce CS API” in the diagram above) exposes all required interfaces, mimicking the API interfaces of ElasticSearch and the Vue Storefront API layer. To emulate the interfaces in full, we used the following approaches:

  • Reconstructing business logic from the source code of Vue Storefront.
  • A logging reverse proxy to record all requests and responses, as well as POST payloads, for further analysis.

In order to be able to see the results as the new API endpoints came into play, we created a reverse proxy with conditional URL routing capability.

Figure 3. Using a reverse proxy for the first phases of development

The development was structured into the following four phases:

  • Phase 1. The reverse proxy redirects all requests to demo.vuestorefront.io and delivers the results back to Vue Storefront. Vue Storefront is configured to work only with the proxy.
  • Phase 2. Some of the requests are redirected to the SAP Commerce module. The module delivers static JSONs only.
  • Phase 3. All requests are redirected to the SAP Commerce module by proxy. We remove the proxy from the request flow and connect SAP Commerce module directly to Vue Storefront.
  • Phase 4. Static JSONs are replaced with a response generator with real business logic, one entity at a time: categories, then products, then reviews, etc.

The reverse proxy was developed in Python with werkzeug (a WSGI server library). After phase 3, this server was excluded from the project and used only occasionally for troubleshooting protocol-related issues.
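The conditional-routing idea can be sketched in a few lines of WSGI. This is a simplified illustration, not the project's actual proxy: the route table, the SAP Commerce module URL, and the port are invented, and the standard library's `wsgiref` stands in for werkzeug's `run_simple` to keep the sketch dependency-free:

```python
import urllib.request
from wsgiref.simple_server import make_server

# Hypothetical route table: prefixes matched here go to the SAP Commerce
# module (phase 2+); everything else falls through to the demo backend (phase 1).
ROUTES = {"/api/catalog": "http://localhost:9002/csapi"}
DEFAULT_BACKEND = "https://demo.vuestorefront.io"

def choose_backend(path):
    """Pick a backend by longest matching URL prefix; default otherwise."""
    for prefix in sorted(ROUTES, key=len, reverse=True):
        if path.startswith(prefix):
            return ROUTES[prefix]
    return DEFAULT_BACKEND

def proxy(environ, start_response):
    path = environ["PATH_INFO"]
    query = environ.get("QUERY_STRING", "")
    url = choose_backend(path) + path + ("?" + query if query else "")
    size = int(environ.get("CONTENT_LENGTH") or 0)
    body = environ["wsgi.input"].read(size) if size else None
    # Log the request and POST payload for later protocol analysis.
    print("->", environ["REQUEST_METHOD"], url, (body or b"")[:200])
    upstream = urllib.request.Request(url, data=body,
                                      method=environ["REQUEST_METHOD"])
    with urllib.request.urlopen(upstream) as resp:
        payload = resp.read()
        print("<-", resp.status, len(payload), "bytes")  # log the response
        start_response(f"{resp.status} {resp.reason}",
                       [("Content-Type",
                         resp.headers.get("Content-Type", "application/json"))])
        return [payload]

# To serve: make_server("localhost", 8080, proxy).serve_forever()
```

Moving an endpoint from phase 1 to phase 2 is then just a matter of adding its prefix to `ROUTES`, and phase 3 corresponds to routing everything to the module and retiring the proxy.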

5. Challenges

The details of these challenges, as well as their solutions, are explained in the “Architecture/Challenges” section below.

  • All unique identifiers are numbers in Vue Storefront, not strings! Not a big deal, of course, but it made us come up with workarounds.
  • On-the-fly image resizing is part of the reference architecture and of the core functionality, but SAP Commerce can’t resize images on the fly. Also not a big deal.
  • Documentation. The principle “Good code is its own best documentation” doesn’t work with Vue Storefront, primarily because of the prototype-based nature of JavaScript. At run time, you can read, change, add, or remove properties of any object, which makes it difficult to collect all possible attributes and separate them from the attributes not used in the code.
    • We need to parse the ElasticSearch Query DSL and to generate/process nested documents. The ElasticSearch protocol is documented, but only a subset (a finite number of use cases) is actually used by Vue Storefront.
    • The API is poorly documented. For example, only an undocumented subset of product and category properties is actually used by Vue Storefront.
  • Internationalization. We found that it is not 100% compatible with the concepts used in SAP Commerce. For example, a product or category in hybris is the same business object for all language versions, but some specific product or category attributes, ‘localized attributes’, may have more than one language version. Vue Storefront, in contrast, works with separate sets of products or categories for different language versions.
  • Shopping cart and checkout are implemented completely on the Vue Storefront side and are relatively independent of the e-commerce platform. The good side of this approach is that there is a mechanism to keep the data in sync with the e-commerce platform. In Vue Storefront, both the shopping cart page and checkout are implemented differently than in SAP Commerce.
  • Facets. The facet search setup is not platform- or data-driven but configuration-driven, and it lives completely on the Vue Storefront side. Useful tools such as SAP AdaptiveSearch won’t work with Vue Storefront facets the way they do in the SAP Commerce reference stores. Ranges are not implemented well; currently, the range facet functionality is designed/used primarily to implement a price group facet.
  • Performance. In hybris, some services are backed by the relatively slow database layer, while in Vue Storefront they are backed by a super-fast NoSQL layer (ElasticSearch). For example, categories and facet definitions are stored in the database in hybris; in Vue Storefront/Magento, they are indexed in ElasticSearch.
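The first challenge above, numeric identifiers, admits a simple workaround. One possible approach (a sketch of the general technique, not necessarily what the PoC did) is to derive a stable number from each SAP Commerce string code and keep a reverse map so incoming numeric ids can be resolved back:

```python
import zlib

class NumericIdMapper:
    """Map SAP Commerce string codes (e.g. category codes) to the numeric
    identifiers Vue Storefront expects, with a reverse lookup for resolving
    incoming numeric ids back to codes."""

    def __init__(self):
        self._code_by_number = {}

    def to_number(self, code):
        # crc32 gives a deterministic unsigned 32-bit number per code.
        number = zlib.crc32(code.encode("utf-8"))
        existing = self._code_by_number.setdefault(number, code)
        if existing != code:
            # A 32-bit collision is extremely unlikely for catalog-sized
            # data, but we fail loudly rather than silently mix entities.
            raise ValueError(f"id collision: {code!r} vs {existing!r}")
        return number

    def to_code(self, number):
        return self._code_by_number[number]

mapper = NumericIdMapper()
electronics_id = mapper.to_number("electronics")
print(electronics_id, "->", mapper.to_code(electronics_id))
```

A plain lookup table persisted in the database would work just as well; the hash merely avoids a round trip when generating ids on the fly.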
6. Results

We developed a custom module for SAP Commerce whose sole objective is to convert the requests from Vue Storefront into hybris service calls and to convert the responses into the format Vue Storefront expects.

Video: Integrating VueStorefront and SAP Commerce (Hybris) (YouTube)

As of today, we don’t share the source code of the PoC.

7. Architecture and technical details

Let’s start with a quick overview of the Vue Storefront architecture, followed by the details of the integration with SAP Commerce.

7.1 Architecture of Vue Storefront in a Nutshell

In this section, you will learn about the technologies and key architectural decisions behind the storefront solution.

7.1.1 Vue Storefront Technology Stack Overview

Two-tier, client-server, separate deployment

Vue Storefront is built upon the client-server pattern and can be installed separately from the e-commerce platform.

The storefront can be deployed separately, with its own release cadence. You can use more than one storefront instance.


NodeJS

Node.js is a way to develop web applications using one language for the whole web app, rather than two – and this language is JavaScript. In some tests, NodeJS shows better performance for backend tasks than languages such as Python, Ruby, or PHP.

VueJS and VueX

Vue.js is a popular JavaScript framework with various optional tools for building user interfaces. One of its greatest advantages is its small size: the framework weighs only 18–21 KB, so it takes almost no time for the user to download it. In this respect, it beats bulkier frameworks like React.js, Angular.js, and Ember.js.

Vue.js makes use of a virtual DOM, which is also used by other frameworks such as React, Ember, etc. Changes are not made to the real DOM directly; instead, a replica of the DOM is maintained in the form of JavaScript data structures. Whenever changes need to be made, they are applied to these data structures, which are then compared with the original. Only the resulting differences are applied to the real DOM, which is what the user sees changing. This is good in terms of optimization: it is less expensive, and the changes can be made at a faster rate.

VueX is a state management pattern and library for Vue.js applications. It serves as a centralized store for all the components in an application, with rules ensuring that the state can only be mutated in a predictable fashion. Unlike similar patterns such as Flux and Redux, Vuex is also a library implementation tailored specifically for Vue.js to take advantage of its granular reactivity system for efficient updates.

In Vue Storefront, NodeJS and VueX are core technologies you need to know to understand the architecture and plan customizations.

7.1.2 Overall architecture diagram

Figure 4. Vue Storefront. The overall architecture diagram.

7.1.3 Concepts

Service Worker

A service worker is a script that your browser runs in the background, separate from a web page. Today, service workers already include features like push notifications and background sync. Vue Storefront uses this concept for caching static and dynamic data feeds, making them available offline, and running offline data sync.

Caching and IndexedDB

For version 1.7, the Vue Storefront development team decided to stop using the in-browser IndexedDB in favour of ServiceWorker caching. Explaining the reason, they reported that IndexedDB caching proved to be problematic because of issues with network-error handling, and that IndexedDB sometimes has really strange response times (>800 ms) and sometimes even effectively deadlocks.

Task Queues

Task Queues are used to manage background work that must be executed outside the usual HTTP request-response cycle. Tasks are handled asynchronously either because they are not initiated by an HTTP request or because they are long-running jobs that would dramatically reduce the performance of an HTTP response.

The task queue in Vue Storefront supports API authentication mechanisms and re-requests a token if the session has expired.

The task queue has two interfaces, “offline mode friendly” and “normal”.

Some types of network calls shouldn’t be queued, for example, shopping cart synchronization or other tasks that operate on volatile data/state. This is where the normal mode is used.
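The token re-request behaviour described above can be sketched generically. This is a hedged Python illustration of the pattern, not Vue Storefront’s actual JavaScript implementation; the `sync_cart` task and the token values are invented:

```python
class TaskQueue:
    """Sketch of a task queue that re-requests an API token when the
    session has expired and retries the failed task once."""

    def __init__(self, fetch_token):
        self._fetch_token = fetch_token
        self._token = fetch_token()   # obtain an initial token
        self.pending = []

    def enqueue(self, task):
        self.pending.append(task)

    def drain(self):
        results = []
        while self.pending:
            task = self.pending.pop(0)
            try:
                results.append(task(self._token))
            except PermissionError:                   # stand-in for HTTP 401
                self._token = self._fetch_token()     # re-request the token
                results.append(task(self._token))     # retry once
        return results

# Demo with a stub task that fails on the initial (expired) token:
tokens = iter(["expired-token", "fresh-token"])
queue = TaskQueue(fetch_token=lambda: next(tokens))

def sync_cart(token):
    if token == "expired-token":
        raise PermissionError("401 Unauthorized")     # simulated expired session
    return f"cart synced with {token}"

queue.enqueue(sync_cart)
print(queue.drain())  # -> ['cart synced with fresh-token']
```

In the real storefront this logic runs asynchronously in the browser; the sketch only captures the refresh-and-retry control flow.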
