# Send and Receive Cloud Events
This tutorial demonstrates how to connect two Functions asynchronously with cloud events. It is based on the in-cluster Eventing example.
The example provides a very simple scenario of asynchronous communication between two Functions. The first Function accepts the incoming traffic using HTTP, sanitizes the payload, and publishes the content as an in-cluster cloud event using the Kyma Eventing module. The second Function is a message receiver. It subscribes to the given event type and stores the payload.
This tutorial shows only one possible use case. There are many more ways to orchestrate your application logic into specialized Functions and benefit from decoupled, reusable components and an event-driven architecture.
## Prerequisites

## Steps
1. Export the `KUBECONFIG` variable:

   ```bash
   export KUBECONFIG={KUBECONFIG_PATH}
   ```

2. Enable the Istio service mesh for the `default` namespace (you can verify the label afterwards, as shown after these steps):

   ```bash
   kubectl label namespaces default istio-injection=enabled
   ```

3. Create the `emitter` and `receiver` folders in your project.
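To confirm that Istio sidecar injection is now enabled for the `default` namespace, you can list its labels. This is an optional check and only assumes `kubectl` access to the cluster:

```bash
# The output should include istio-injection=enabled among the labels.
kubectl get namespace default --show-labels
```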
### Create the Emitter Function
1. Go to the `emitter` folder and run the Kyma CLI `init` command to initialize the scaffold for your first Function:

   ```bash
   kyma function init nodejs
   ```

   The `init` command creates these files in your workspace folder:

   - `handler.js` with the Function's code and the simple "Hello Serverless" logic
   - `package.json` with the Function's dependencies

2. Provide your Function logic in the `handler.js` file:

   > **NOTE:** In this example, there's no real sanitization logic; the Function simply logs the payload.

   ```js
   const { SpanStatusCode } = require("@opentelemetry/api");

   module.exports = {
     main: async function (event, context) {
       let sanitisedData = sanitise(event.data)

       const eventType = "payload.sanitised";
       const eventSource = "my-app";

       const span = event.tracer.startSpan('call-to-kyma-eventing');

       // you can pass additional cloudevents attributes
       // const eventtypeversion = "v1";
       // const datacontenttype = "application/json";
       // return await event.emitCloudEvent(eventType, eventSource, sanitisedData, {eventtypeversion, datacontenttype})

       return await event.emitCloudEvent(eventType, eventSource, sanitisedData)
         .then(resp => {
           console.log(resp.status);
           span.addEvent("Event sent");
           span.setAttribute("event-type", eventType);
           span.setAttribute("event-source", eventSource);
           span.setStatus({ code: SpanStatusCode.OK });
           return "Event sent";
         }).catch(err => {
           console.error(err)
           span.setStatus({
             code: SpanStatusCode.ERROR,
             message: err.message,
           });
           return err.message;
         }).finally(() => {
           span.end();
         });
     }
   }

   let sanitise = (data) => {
     console.log(`sanitising data...`)
     console.log(data)
     return data
   }
   ```

3. Include the OpenTelemetry SDK in the Function dependencies. Add the following to `package.json`:

   ```json
   {
     "dependencies": {
       "@opentelemetry/api": "^1.0.4"
     }
   }
   ```

   `payload.sanitised` is a sample event type that the emitter Function uses when publishing events. You can choose a different one that better suits your use case. Keep in mind the constraints described on the Event Naming and Cleanup page. The receiver subscribes to this event type to consume the events.

   The `event` object provides a convenient API for emitting events. To learn more, read the Function's specification.

4. Apply your emitter Function:

   ```bash
   kyma function create nodejs emitter --source handler.js --dependencies package.json
   ```

   Your Function is now deployed in the Kyma runtime. Kyma exposes it through the APIRule. The incoming payloads are processed by your emitter Function, which then sends the sanitized content to the workload that subscribes to the selected event type. In this case, it's the receiver Function.
5. Expose the Function by creating the APIRule CR:

   ```bash
   cat <<EOF | kubectl apply -f -
   apiVersion: gateway.kyma-project.io/v2alpha1
   kind: APIRule
   metadata:
     name: incoming-http-trigger
   spec:
     hosts:
       - incoming
     service:
       name: emitter
       namespace: default
       port: 80
     gateway: kyma-system/kyma-gateway
     rules:
       - path: /*
         methods: ["GET", "POST"]
         noAuth: true
   EOF
   ```

6. Run the following command to get the domain name of your Kyma cluster:

   ```bash
   kubectl get gateway -n kyma-system kyma-gateway \
     -o jsonpath='{.spec.servers[0].hosts[0]}'
   ```

7. Export the result, without the leading `*.`, as an environment variable:

   ```bash
   export DOMAIN={DOMAIN_NAME}
   ```

8. Test the first Function. Send a payload and check whether your HTTP traffic is accepted:

   ```bash
   curl -X POST "https://incoming.${DOMAIN}" -H 'Content-Type: application/json' -d '{"foo":"bar"}'
   ```

   You should see the `Event sent` message as a response. If you don't, see the checks after these steps.
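If the call doesn't return `Event sent`, you can check whether the emitter Function and the APIRule are ready. This is a minimal troubleshooting sketch; it assumes both resources live in the `default` namespace, and the exact status columns may differ between module versions:

```bash
# Check the emitter Function's status as reported by the Serverless module.
kubectl get functions.serverless.kyma-project.io emitter -n default

# Check that the APIRule exposing the Function has been reconciled.
kubectl get apirules.gateway.kyma-project.io incoming-http-trigger -n default
```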
### Create the Receiver Function
1. Go to your `receiver` folder and run the Kyma CLI `init` command to initialize the scaffold for your second Function:

   ```bash
   kyma function init nodejs
   ```

   The `init` command creates the same files as in the `emitter` folder.

2. Provide your Function logic in the `handler.js` file. In the following example, the receiver Function simply logs the received payload:

   ```js
   module.exports = {
     main: function (event, context) {
       store(event.data)
       return 'OK'
     }
   }

   let store = (data) => {
     console.log(`storing data...`)
     console.log(data)
     return data
   }
   ```

3. Apply your receiver Function:

   ```bash
   kyma function create nodejs receiver --source handler.js --dependencies package.json
   ```

   The Function is configured and deployed in the Kyma runtime. Once the Subscription created in the next step becomes active, all events with the selected type are processed by the Function.

4. Subscribe the `receiver` Function to the event (you can verify the Subscription afterwards, as shown after these steps):

   ```bash
   cat <<EOF | kubectl apply -f -
   apiVersion: eventing.kyma-project.io/v1alpha2
   kind: Subscription
   metadata:
     name: event-receiver
     namespace: default
   spec:
     sink: 'http://receiver.default.svc.cluster.local'
     source: "my-app"
     types:
       - payload.sanitised
   EOF
   ```
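To confirm that the Subscription is active before testing the whole flow, you can inspect its status. This is an optional check; the exact readiness columns depend on the Eventing module version installed in your cluster:

```bash
# The Subscription should report a Ready state once the Eventing backend accepts it.
kubectl get subscriptions.eventing.kyma-project.io event-receiver -n default
```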
### Test the Whole Setup
Send a payload to the first Function, for example, using the POST request shown above. Because the Functions are connected by in-cluster Eventing, the payload is processed in sequence by both of your Functions. In the Functions' logs, you can see that both the sanitization logic (in the first Function) and the storing logic (in the second Function) are executed.
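As a rough sketch of how to look at those logs, you can select the Functions' Pods by label. The `serverless.kyma-project.io/function-name` label is an assumption about how the Serverless module labels Function Pods and may differ between versions; if it doesn't match, list the Pods in the `default` namespace and pick the ones backing the `emitter` and `receiver` Functions:

```bash
# Emitter logs: expect the "sanitising data..." output and the "Event sent" result.
kubectl logs -n default -l serverless.kyma-project.io/function-name=emitter --tail=20

# Receiver logs: expect the "storing data..." output with the same payload.
kubectl logs -n default -l serverless.kyma-project.io/function-name=receiver --tail=20
```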