Looking at S-Core today feels a bit like walking into a German factory that has just announced a “major transformation initiative.” Before anything actually starts to change, there are:
- discussions about processes
- and documents about these discussions
- and meetings about these documents
- and calls to align on how these documents about those processes should be structured
- and how such a document structure will impact work 20 years from now
The intention is good, but as a colleague of mine pointed out some time ago: "bureaucracy expands to meet the needs of the expanding bureaucracy". It may sound funny, but this is how European automotive software has been built for years.
For a long time, this approach worked surprisingly well. German - and more broadly European - OEMs optimized for correctness, documentation and process, not for speed or innovation. That slowness was not a bug, it was a feature and part of the mindset. The market was stable, competition was predictable and even strong players like the Japanese or Korean manufacturers did not disrupt that golden setup. Nobody was releasing cars like software updates because nobody really had to. Building slowly, reviewing thoroughly and certifying carefully was acceptable when everyone played the same game.
However, when Chinese manufacturers entered the market at full speed, they started producing new models at a pace that European processes simply cannot match. Not because correctness no longer matters, but because speed and innovation suddenly do.
This is the tension S-Core seems to be trying to address. Unlike many open-source projects that start out fast and experimental and only later attempt to “add automotive”, S-Core approaches the topic the way we like it in Europe.
We don't like it when things break, especially when those things are 2-ton SUVs doing 140 km/h on a highway, and especially not in an undocumented way, so S-Core starts strict. Process, structure and ASIL-certifiability are part of the baseline, not something to care about later. After all, automotive software engineers know that writing code is only part of the job; the other part is making that code safe, compliant and legally sellable in a heavily regulated market.
At the same time, S-Core wants to be open source, so that the code evolves dynamically, powered by multiple contributors. If this combination works (and doesn't just combine the disadvantages of both approaches), it could be a revolution for the entire automotive industry.
So let's give it a chance and try to implement a minimal S-Core publisher and subscriber using the latest release v0.5.
S-Core development environment setup
Before writing any code, let's set up the development environment. The prerequisites are:
- install Docker: https://docs.docker.com/engine/install/ubuntu/
- install Bazel: https://bazel.build/install/ubuntu
S-Core provides its development environment as a container in this GitHub repo. To use it, create a .devcontainer/devcontainer.json file in the root of your project with the following content:
{
"name": "eclipse-s-core",
"image": "ghcr.io/eclipse-score/devcontainer:v1.1.0"
}
Now, open the project with VS Code and click the "Reopen in Container" button in the pop-up window.
Minimal S-Core publisher/subscriber implementation
With the environment in place, we can finally write some code. The goal here is intentionally simple: a minimal publisher and subscriber where the publisher broadcasts a vehicle position as an event and the subscriber reads that data. No optimizations, no abstractions - just the smallest example that shows how the pieces fit together in S-Core and how it differs from (or resembles) Adaptive AUTOSAR.
Note: although the namespaces coming from S-Core are long, I don't use any using namespace to shorten them and keep all S-Core types explicitly prefixed with their namespaces. This is on purpose, to make it clear in the example which S-Core type comes from which namespace.
Communication modelling
Before writing publisher or subscriber code, the communication must be modeled. S-Core's documentation says explicitly that S-Core relies only on source code and does not use any domain-specific modelling language for modelling the communication. So why do I call this section "modelling"?
It's because LoLa (the implementation of S-Core's IPC communication), although it uses pure C++, requires a very specific way of defining interfaces. If you come from Adaptive AUTOSAR, thinking of it as modelling will make it easier to understand the purpose of the slightly bizarre C++ code that I will write in the next step.
Let's then define our interface between publisher and subscriber. Create model/position_interface.h:
#ifndef MODEL_POSITION_INTERFACE_H
#define MODEL_POSITION_INTERFACE_H
#include "score/mw/com/types.h"
struct Point
{
std::uint32_t x;
std::uint32_t y;
};
template <typename Trait>
class PositionInterface : public Trait::Base
{
public:
using Trait::Base::Base;
typename Trait::template Event<Point> center{*this, "center"};
};
using PositionInterfaceProxy = score::mw::com::AsProxy<PositionInterface>;
using PositionInterfaceSkeleton = score::mw::com::AsSkeleton<PositionInterface>;
#endif // MODEL_POSITION_INTERFACE_H
Let's now break it down:
- Point represents the structure of the data being exchanged between the publisher and subscriber - in this case it's just a 2D point representing the coordinates.
- PositionInterface represents the interface which connects the publisher and subscriber - both of them will have access to its elements.
- Event<Point> center defines an element of the interface. S-Core promises support for events, fields and methods, but at the time of writing this article, methods are not supported yet. In our example, we define center as an Event with Point as its underlying data structure. A single interface may have multiple events (see the sketch after this list).
- PositionInterfaceProxy - the type of the object via which the subscriber will access the center event to read the data.
- PositionInterfaceSkeleton - the type of the object via which the publisher will access the center event to write the data.
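For illustration, an interface with more than one event follows exactly the same pattern. The sketch below is hypothetical - the Velocity struct, the MotionInterface name and the velocity event are made up for this example and are not part of S-Core:
// Hypothetical example: one interface exposing two events.
// It would live in the same header, next to PositionInterface.
struct Velocity
{
    std::uint32_t vx;
    std::uint32_t vy;
};
template <typename Trait>
class MotionInterface : public Trait::Base
{
public:
    using Trait::Base::Base;
    // Each event gets its own member with a unique name string.
    typename Trait::template Event<Point> center{*this, "center"};
    typename Trait::template Event<Velocity> velocity{*this, "velocity"};
};
using MotionInterfaceProxy = score::mw::com::AsProxy<MotionInterface>;
using MotionInterfaceSkeleton = score::mw::com::AsSkeleton<MotionInterface>;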
Publisher implementation
Create src/publisher/main.cpp in your project. First add the necessary S-Core headers:
#include "score/mw/com/impl/instance_specifier.h"
#include "score/mw/com/runtime.h"
And the header with our interface definition, plus the standard library headers used by the snippets below:
#include "model/position_interface.h"
#include <chrono>
#include <cstdint>
#include <iostream>
#include <string>
#include <thread>
Define constant values to use. Don't worry about the kPathToComConfig for now - I will explain it later.
static constexpr const char* kPathToComConfig {"config/communication_config.json"};
static constexpr const char* kInstanceSpecifier {"SomeInstanceSpecifier"};
Now let's implement the main function (all of the snippets below go inside int main()). First, set the path to the communication configuration defined above:
score::StringLiteral runtime_args[2U] = {"--service_instance_manifest", kPathToComConfig};
score::mw::com::runtime::InitializeRuntime(2, runtime_args);
Next create the service instance specifier, unique for each service:
const score::Result<score::mw::com::InstanceSpecifier> instance_specifier_result = score::mw::com::InstanceSpecifier::Create(std::string{kInstanceSpecifier});
if (!instance_specifier_result.has_value())
{
std::cerr << "Error: failed to create instance specifier!" << std::endl;
return 1;
}
const score::mw::com::InstanceSpecifier instance_specifier = instance_specifier_result.value();
Use the created instance_specifier to build the skeleton via which we will send the data from the publisher. Notice that PositionInterfaceSkeleton is the type alias declared in model/position_interface.h:
score::Result<PositionInterfaceSkeleton> skeleton_result = PositionInterfaceSkeleton::Create(instance_specifier);
if (!skeleton_result.has_value())
{
std::cerr << "Failed to create skeleton for instance specifier "
<< instance_specifier.ToString() << std::endl;
return 1;
}
auto& position_skeleton = skeleton_result.value();
Service offering must be done explicitly with OfferService():
const auto offer_result = position_skeleton.OfferService();
if (!offer_result.has_value())
{
std::cerr << "Failed to offer service "
<< instance_specifier.ToString() << std::endl;
return 1;
}
Now it's time for the actual work - create the data and send it to the world. I add a 1 second delay between each send to make it easier to see what's going on in the terminal later.
for (std::uint32_t i=0U; i<50U; i++)
{
Point data {
.x = i,
.y = i * 2U
};
std::cout << "Sending data ("
<< data.x << ", " << data.y << ")" << std::endl;
position_skeleton.center.Send(data);
std::this_thread::sleep_for(std::chrono::seconds(1U));
}
After sending the data 50 times, stop offering the service and exit:
position_skeleton.StopOfferService();
return 0;
Subscriber implementation
The subscriber begins similarly to the publisher, so first we add the necessary headers in src/subscriber/main.cpp:
#include "score/mw/com/impl/instance_specifier.h"
#include "score/mw/com/runtime.h"
#include "model/position_interface.h"
And define the constants - here there is one more than in the publisher: the size of the buffer for the incoming data (once again - don't worry about the config path for now):
static constexpr const char* kPathToComConfig {"config/communication_config.json"};
static constexpr const char* kInstanceSpecifier {"SomeInstanceSpecifier"};
static constexpr std::size_t kBufferSize {3U};
In the main function, we start by setting the config file path and creating the instance specifier:
score::StringLiteral runtime_args[2U] = {"--service_instance_manifest", kPathToComConfig};
score::mw::com::runtime::InitializeRuntime(2, runtime_args);
const score::Result<score::mw::com::InstanceSpecifier> instance_specifier_result = score::mw::com::InstanceSpecifier::Create(std::string{kInstanceSpecifier});
if (!instance_specifier_result.has_value())
{
std::cerr << "Error: failed to create instance specifier!" << std::endl;
return 1;
}
const score::mw::com::InstanceSpecifier instance_specifier = instance_specifier_result.value();
Now comes the important part - actually finding the service that our publisher offers. In general, there are 2 ways of finding the service:
- synchronous, with the FindService function
- asynchronous (non-blocking), with the StartFindService function (a hedged sketch of this variant follows the code below)
In this example I will use FindService to keep things as explicit as possible, polling in a loop and waiting 1 second after each failed attempt:
score::mw::com::ServiceHandleContainer<score::mw::com::impl::HandleType> services{};
do
{
const auto services_result = PositionInterfaceProxy::FindService(instance_specifier);
if (!services_result.has_value())
{
std::cerr << "Error: failed to find services for specifier "
<< instance_specifier.ToString() << ": "
<< services_result.error() << std::endl;
return 1;
}
services = services_result.value();
if (services.size() == 0U)
{
std::this_thread::sleep_for(std::chrono::seconds(1U));
}
} while (services.size() == 0);
const auto service = services.front();
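For comparison, the asynchronous variant registers a callback instead of polling. The sketch below is an assumption about the API shape - it follows the usual ara::com-style convention of a handler receiving the found handles plus a FindServiceHandle for stopping the search - so verify the exact types against the S-Core headers before relying on it:
// Sketch only - the signatures of StartFindService / StopFindService below
// are assumptions, not verified against the S-Core headers.
const auto start_find_result = PositionInterfaceProxy::StartFindService(
    [](score::mw::com::ServiceHandleContainer<score::mw::com::impl::HandleType> handles,
       score::mw::com::FindServiceHandle find_handle) {
        if (!handles.empty())
        {
            std::cout << "Service found asynchronously" << std::endl;
            // Create the proxy from handles.front() here, then stop searching.
            PositionInterfaceProxy::StopFindService(find_handle);
        }
    },
    instance_specifier);
if (!start_find_result.has_value())
{
    std::cerr << "Error: StartFindService failed!" << std::endl;
    return 1;
}
Either way, once a service handle is available, proxy creation proceeds the same way as shown next.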
When the service is found, create the proxy via which the subscriber will receive the data sent by the publisher:
auto proxy_result = PositionInterfaceProxy::Create(service);
if (!proxy_result.has_value())
{
std::cerr << "Failed to create proxy for the found service!" << std::endl;
return 1;
}
auto& position_proxy = proxy_result.value();
Now the most important part - the implementation of receiving the data. You do that by setting a receive handler on the center event. When the receive handler is called, it means that new data is ready to be read. Inside that receive handler, you call GetNewSamples, which lets you actually access the received data:
position_proxy.center.SetReceiveHandler([&position_proxy](){
score::Result<std::size_t> num_samples_received = position_proxy.center.GetNewSamples(
[](score::mw::com::SamplePtr<Point> point) noexcept {
if (!point) {
std::cerr << "Received data is invalid" << std::endl;
return;
}
std::cout << "Received data ("
<< point->x << ", " << point->y << ")" << std::endl;
},
kBufferSize);
if (!num_samples_received.has_value()) {
std::cerr << "num_samples_received does not have a value!" << std::endl;
return;
}
std::cout << "Received " << num_samples_received.value()
<< " new samples" << std::endl;
});
After that, S-Core knows how you want to read the data, but to actually start receiving it, you must subscribe to the center event:
position_proxy.center.Subscribe(kBufferSize);
Because the receive handler is called asynchronously in the background, add a loop preventing the application from exiting. You can base it on a signal handler, but for the purpose of this simple example, I'll just go with:
while (true) {}
If you however go with proper signal handling, you may want to add cleanup after the loop which unsubscribes from the center event (a sketch of such a loop follows this snippet):
position_proxy.center.Unsubscribe();
return 0;
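For reference, here is a minimal sketch of such a signal-based wait loop. It uses only the standard library (<csignal>, <atomic>, <chrono>, <thread>); g_keep_running is a hypothetical global flag introduced for this example, and the S-Core-specific calls are indicated as comments:
#include <atomic>
#include <chrono>
#include <csignal>
#include <thread>

// Hypothetical global flag flipped by the SIGINT handler (Ctrl+C).
// std::atomic<bool> is lock-free on typical targets, so it is safe to
// write from a signal handler.
std::atomic<bool> g_keep_running{true};

void HandleSigint(int)
{
    g_keep_running = false;
}

int main()
{
    std::signal(SIGINT, HandleSigint);
    // ... runtime initialization, proxy creation, SetReceiveHandler and
    // Subscribe exactly as shown above ...
    while (g_keep_running)
    {
        // Sleep instead of spinning so the loop does not burn a CPU core.
        std::this_thread::sleep_for(std::chrono::milliseconds(100));
    }
    // Cleanup once the loop has been interrupted:
    // position_proxy.center.Unsubscribe();
    return 0;
}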
Communication configuration
OK, now it's time to start bothering about the line which both publisher and subscriber share and which I earlier told you not to worry about:
static constexpr const char* kPathToComConfig {"config/communication_config.json"};
The implementation of the publisher and subscriber is not enough - the communication must still be configured in a JSON file providing at least the information presented below. Create the config/communication_config.json file with the following content:
Note: I keep the instance specifier as SomeInstanceSpecifier and the service type name as SomeServiceTypeName instead of something like PositionService or Point to show that the names in the configuration file do not need to be the same as, or even related to, the names in the interface model. They do, however, need to be in sync inside the configuration file, so if you set serviceTypeName to SomeServiceTypeName in the serviceTypes list, you must use the same serviceTypeName in the serviceInstances list.
{
"serviceTypes": [
{
"serviceTypeName": "SomeServiceTypeName",
"version": {
"major": 1,
"minor": 0
},
"bindings": [
{
"binding": "SHM",
"serviceId": 1,
"events": [
{
"eventName": "center",
"eventId": 1
}
]
}
]
}
],
"serviceInstances": [
{
"instanceSpecifier": "SomeInstanceSpecifier",
"serviceTypeName": "SomeServiceTypeName",
"version": {
"major": 1,
"minor": 0
},
"instances": [
{
"instanceId": 1,
"asil-level": "QM",
"binding": "SHM",
"events": [
{
"eventName": "center",
"numberOfSampleSlots": 3,
"maxSubscribers": 10
}
]
}
]
}
]
}
Building the project
General project setup
Create an empty WORKSPACE file in the root of your project. Then create the Bazel runtime configuration file .bazelrc with the following content:
common --@score_baselibs//score/mw/log/detail/flags:KUse_Stub_Implementation_Only=False
common --@score_baselibs//score/mw/log/flags:KRemote_Logging=False
common --@score_baselibs//score/json:base_library=nlohmann
common --@score_communication//score/mw/com/flags:tracing_library=stub
common --registry=https://raw.githubusercontent.com/eclipse-score/bazel_registry/v0.5.0-beta/
common --registry=https://bcr.bazel.build
After that, configure the third-party modules in the MODULE.bazel file:
bazel_dep(name = "score_toolchains_gcc", version = "0.5", dev_dependency=True)
gcc = use_extension("@score_toolchains_gcc//extensions:gcc.bzl", "gcc", dev_dependency=True)
gcc.toolchain(
url = "https://github.com/eclipse-score/toolchains_gcc_packages/releases/download/0.0.1/x86_64-unknown-linux-gnu_gcc12.tar.gz",
sha256 = "457f5f20f57528033cb840d708b507050d711ae93e009388847e113b11bf3600",
strip_prefix = "x86_64-unknown-linux-gnu",
)
use_repo(gcc, "gcc_toolchain", "gcc_toolchain_gcc")
bazel_dep(name = "rules_boost", repo_name = "com_github_nelhage_rules_boost")
archive_override(
module_name = "rules_boost",
urls = ["https://github.com/nelhage/rules_boost/archive/refs/heads/master.tar.gz"],
strip_prefix = "rules_boost-master",
)
bazel_dep(name = "boost.program_options", version = "1.87.0")
bazel_dep(name = "score_baselibs", version = "0.2.2")
bazel_dep(name = "score_communication", version = "0.1.2")
bazel_dep(name = "trlc", version = "0.0.0")
git_override(
module_name = "trlc",
commit = "ede35c4411d41abe42b8f19e78f8989ff79ad3d8",
remote = "https://github.com/bmw-software-engineering/trlc.git",
)
Interface model setup
Create the position_interface target in the model/BUILD file:
load("@score_baselibs//score/language/safecpp:toolchain_features.bzl", "COMPILER_WARNING_FEATURES")
cc_library(
name = "position_interface",
hdrs = [
"position_interface.h",
],
features = COMPILER_WARNING_FEATURES,
deps = [
"@score_communication//score/mw/com",
"@score_baselibs//score/language/futurecpp",
],
visibility = [
"//src/publisher:__pkg__",
"//src/subscriber:__pkg__"
]
)
Configuration setup
Our configuration is a single JSON file, so there's nothing to build per se, but we need to export the configuration file so that it can later be referenced by the publisher and subscriber targets in their data attribute. In config/BUILD add:
exports_files([
"communication_config.json",
])
Publisher setup
In src/publisher/BUILD create the publisher binary target:
load("@score_baselibs//score/language/safecpp:toolchain_features.bzl", "COMPILER_WARNING_FEATURES")
cc_binary(
name = "publisher",
srcs = ["main.cpp"],
data = ["//config:communication_config.json"],
features = COMPILER_WARNING_FEATURES,
deps = [
"//model:position_interface",
"@score_communication//score/mw/com",
],
)
Subscriber setup
In src/subscriber/BUILD create the subscriber binary target:
load("@score_baselibs//score/language/safecpp:toolchain_features.bzl", "COMPILER_WARNING_FEATURES")
cc_binary(
name = "subscriber",
srcs = ["main.cpp",],
data = ["//config:communication_config.json"],
features = COMPILER_WARNING_FEATURES,
deps = [
"//model:position_interface",
"@score_communication//score/mw/com",
],
)
Building both applications
Now, to build the publisher, call:
bazel build //src/publisher
And to build the subscriber call:
bazel build //src/subscriber
Note: during my tests I was not able to successfully build the project without adding --copt=-Wno-error=deprecated-declarations to the above build commands, e.g. bazel build //src/publisher --copt=-Wno-error=deprecated-declarations.
Running
It's finally time to run both applications. Open 2 terminals. In the first one call:
bazel run //src/publisher
And in the second one call:
bazel run //src/subscriber
In the publisher's terminal you should see logs like:
Sending data (5, 10)
Sending data (6, 12)
Sending data (7, 14)
And in the subscriber's terminal you should see logs like:
Received data (5, 10)
Received 1 new samples
Received data (6, 12)
Received 1 new samples
Received data (7, 14)