Platform Guide
Key concepts
Our platform is built on Apache Kafka, an open-source distributed event streaming platform.
Transport protocol
The platform serves all data over the Kafka protocol.
More information can be found in the official Kafka Protocol Guide.
Events
An event is a single message in the Kafka stream.
In our case, all messages are represented using a proprietary Unified Data Format, defined in Protobuf for high efficiency, flexibility, and extensibility.
See also Events and Kafka Partitions.
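Because the Unified Data Format schema is proprietary and distributed separately, the following is only a sketch of how a consumed record's value bytes would be decoded: the UnifiedDataMessage class and its package are hypothetical placeholders for whatever classes are generated from the .proto files TomTom provides.

import com.google.protobuf.InvalidProtocolBufferException;
import org.apache.kafka.clients.consumer.ConsumerRecord;
// Hypothetical class generated from the proprietary Unified Data Format .proto
// schema; the real package and message names come from the schema TomTom provides.
import com.tomtom.uds.UnifiedDataMessage;

final class EventDecoder {
    // Each Kafka record value carries one Protobuf-encoded event.
    static UnifiedDataMessage decode(ConsumerRecord<byte[], byte[]> record)
            throws InvalidProtocolBufferException {
        return UnifiedDataMessage.parseFrom(record.value());
    }
}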
Topics
A topic, in Kafka terminology, is a way to organize events.
Following a publish/subscribe model, events are written by publishing them to a given topic and read by subscribing to that same topic.
See also Kafka Topics.
Consumers
Consumers are clients that subscribe to topics and receive the stream of events.
A consumer can be represented by multiple instances, so Kafka provides the concept of a consumer group. All instances sharing the same consumer group share access to the data, meaning that each event is delivered only once within the group.
TomTom will provide a set of allowed group IDs that can be used to consume the data.
See also Kafka Consumers and Consumer Groups.
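To illustrate how a consumer group is used, here is a minimal Java sketch (not an official TomTom sample; the broker address, group ID, and topic name are placeholders, and the SASL/OAuth settings described under Authentication are omitted for brevity). Running several instances with the same group.id makes them share the stream, with each event delivered to only one of them.

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class GroupedConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "INSERT_BOOTSTRAP_SERVER_HERE");
        // Instances that share this group.id split the work: each event is
        // delivered to only one instance within the group.
        props.put("group.id", "INSERT_CONSUMER_GROUP_HERE");
        props.put("key.deserializer", ByteArrayDeserializer.class.getName());
        props.put("value.deserializer", ByteArrayDeserializer.class.getName());
        // The SASL_SSL / OAUTHBEARER settings described under Authentication
        // must also be added before the consumer can actually connect.

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            // Reading events means subscribing to the topic they are published to.
            consumer.subscribe(List.of("INSERT_TOPIC_HERE"));
            // A single poll joins the group and fetches whatever is available.
            System.out.println("events received: " + consumer.poll(Duration.ofSeconds(5)).count());
        }
    }
}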
Requesting data
To establish a successful connection with the service, you must follow the steps in the Getting started section.
Authentication
The platform supports OAuth2 authentication, an authorization framework that lets applications obtain limited access to resources without exposing user credentials.
Your unique API-Key for the "Connected Service API" is used at the OAuth2 token endpoint to retrieve the OAuth2 token that secures the connection to the Kafka cluster.
Because the API-Key handles authentication, the clientId and clientSecret fields, although required in the configuration, are not actually used and can be set to any valid string.
The following section presents an example client configuration that demonstrates the platform's authentication flow.
Testing the connection
The quickest way to test the new topic and start reading content is to use kafka-console-consumer, a tool that's included in the Kafka package. See the Apache Kafka Quickstart for instructions on how to use it.
TomTom recommends client software that is compatible with Kafka major version 3.
Create a configuration file with:
1echo "group.id=INSERT_CONSUMER_GROUP_HERE2security.protocol=SASL_SSL3sasl.mechanism=OAUTHBEARER4sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required clientId=\".\" clientSecret=\".\";5sasl.login.callback.handler.class=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginCallbackHandler6sasl.oauthbearer.token.endpoint.url=https://api.tomtom.com/connected-services/oauth2/token?key=INSERT_APIKEY_HERE&apiVersion=1" > config-consumer.properties
Remember to replace the placeholders in the previous template for both the group.id and key. The latter is a secret and must be securely stored.
Finally, test the connection with:
bin/kafka-console-consumer.sh --bootstrap-server kafka-bootstrap.prod.connected-services.tomtom.com:9094 --topic INSERT_TOPIC_HERE --consumer.config config-consumer.properties
Again, remember to replace the placeholder with the actual name of the topic.
The list of available topics can be retrieved with the following command:
bin/kafka-topics.sh --bootstrap-server kafka-bootstrap.prod.connected-services.tomtom.com:9094 --list --command-config config-consumer.properties
Creating a consumer application
To learn more about building a Kafka consumer application, start from the minimal sketch below, then explore the following resources:
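The sketch assumes the config-consumer.properties file created in the Testing the connection section (it already contains the group.id and the SASL/OAuth settings) and a placeholder topic name; it only prints record metadata and is not an official TomTom sample.

import java.io.FileInputStream;
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class ConnectedServicesConsumer {
    public static void main(String[] args) throws Exception {
        // Reuse the file created in "Testing the connection": it already holds
        // group.id, SASL_SSL, OAUTHBEARER, and the token endpoint URL.
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("config-consumer.properties")) {
            props.load(in);
        }
        props.put("bootstrap.servers",
                "kafka-bootstrap.prod.connected-services.tomtom.com:9094");
        // Events are Protobuf-encoded, so consume them as raw bytes and decode
        // them with the Unified Data Format schema afterwards.
        props.put("key.deserializer", ByteArrayDeserializer.class.getName());
        props.put("value.deserializer", ByteArrayDeserializer.class.getName());

        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("INSERT_TOPIC_HERE"));
            while (true) {
                ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<byte[], byte[]> record : records) {
                    System.out.printf("partition=%d offset=%d value=%d bytes%n",
                            record.partition(), record.offset(), record.value().length);
                }
            }
        }
    }
}

From here, decoding the record values with the Unified Data Format schema and tuning offset handling to your delivery requirements are the natural next steps.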