Serverless automation with Waylay engine

The Waylay engine is a rules engine that separates information, control and decision flow using the smart agent concept: sensors, logic and actuators are separate entities within the rule engine.

Waylay lambda functions (λ) are defined as either sensors or actuators. Sensors are "typed" λ functions, which can return a state, data or both.
[Image: agent]

Any time sensors are executed, their results (both the sensor's data and the sensor's state) are fed into the rule engine's inference, which may result in the execution of actuators (other λ functions) or further sensors.
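As a hedged illustration of this idea, the sketch below models a sensor as a small function that returns both a discrete state and a data payload. The function name, states and payload shape are invented for this example, not part of the Waylay API.

```python
# Illustrative sketch of a Waylay-style "typed" sensor: it returns a
# discrete state (used by the engine for inference) together with raw
# data (made available to other nodes). All names are hypothetical.
def temperature_sensor(reading: float) -> dict:
    """Classify a temperature reading into a state and attach raw data."""
    state = "Above" if reading > 25.0 else "Below"
    return {"state": state, "data": {"temperature": reading}}

result = temperature_sensor(28.5)
print(result["state"])   # state inferred by the engine
print(result["data"])    # data available to other lambda functions
```

The engine would use the returned state to drive control flow (e.g. trigger an actuator when the state becomes "Above"), while the data part flows into the task context.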

Logic creation

Rules are created either via REST calls or using a visual programming environment with drag-and-drop functionality (see the screenshot below). Once created, rules are saved as JSON files. The visual programming environment lets the developer draw on the library of sensors and actuators, logical gates and mathematical function blocks.
[Image: designer]

Tasks

In the Waylay terminology, tasks are instantiated rules. There are two ways tasks can be instantiated:

  • one-off tasks, where sensors, actuators, logic and task settings are configured at the time the task is instantiated
  • tasks instantiated from templates, where task creation is based on the template (which describes sensors, actuators and logic)
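The two instantiation styles can be sketched as follows. The `create_task` helper and its parameters are invented for this illustration; this is not the Waylay REST API.

```python
# Hypothetical sketch of the two ways a task can be instantiated.
# A template describes sensors, actuators and logic once; task settings
# (e.g. the resource to watch) are supplied at instantiation time.
template = {
    "sensors": ["tempSensor"],
    "actuators": ["send_sms"],
    "logic": "tempSensor.Above -> send_sms",
}

def create_task(template=None, **settings):
    if template:
        rule = dict(template)                     # template-based task
    else:
        # one-off task: sensors, actuators and logic given directly
        rule = {k: settings.pop(k) for k in ("sensors", "actuators", "logic")}
    rule["settings"] = settings                   # remaining task settings
    return rule

# one-off task: everything configured at instantiation time
t1 = create_task(sensors=["tempSensor"], actuators=["send_sms"],
                 logic="tempSensor.Above -> send_sms", resource="testresource")
# task instantiated from a template
t2 = create_task(template=template, resource="testresource")
print(t1["settings"])
print(t2["sensors"])
```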

A task also defines the "master clock" of the rule, such as the polling frequency or cron expression (these settings can be inherited by sensors as well).
Before any λ function (sensor or actuator) is invoked, the engine makes a copy of the task context, providing the calling λ function, if required, with the results and data of all sensors executed up to that moment.

[Image: Waylay rule]

Let's look a little more closely at the picture above. Blue arrows mark the information flow, red arrows the control flow and green arrows the decisions.
Two sensors are shown on the left side of the picture, and on the right we find two actuators. Every sensor is composed of three parts:

  • Node settings that define the control flow (when the function is executed)
  • Sensor settings - input arguments for the function
  • π›Œ function code itself, which returns back states and data

In the picture below we see the sensor settings and the λ function code:

[Image: sensor]

Control flow

Sensor, a π›Œ function, can be invoked:

  • At a polling frequency, on a cron expression, or once (defined either at the node level or, via inheritance, by the "master clock" of the task settings).
  • When new data arrives. If the node is addressable via a resource, e.g. if the node is labeled as testresource, the function is called any time data arrives for that resource. The payload that triggered the sensor is available to the calling function (blue arrow).
  • As the result of other function calls (sensors), via state transitions of an attached sensor (depicted as the red arrow that goes from the top sensor to the other one).
  • As the outcome of multiple function executions, via inference (logical gates).
  • And of course, with all of these conditions combined if needed!
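The trigger types above can be sketched as a small dispatcher. The trigger names, event shapes and `upstream` field are invented for this sketch; they are not engine identifiers.

```python
# Hypothetical dispatcher deciding whether a node's lambda function
# should run for a given event. Trigger/event names are illustrative.
def should_invoke(node: dict, event: dict) -> bool:
    trigger = node["trigger"]
    if trigger == "tick":
        # polling frequency / cron expression ("master clock")
        return event["type"] == "tick"
    if trigger == "data":
        # new data arriving for the node's resource
        return event["type"] == "data" and event["resource"] == node["resource"]
    if trigger == "state":
        # state transition of an attached upstream sensor
        return event["type"] == "state" and event["from_node"] in node["upstream"]
    return False

node = {"trigger": "data", "resource": "testresource"}
print(should_invoke(node, {"type": "data", "resource": "testresource"}))
print(should_invoke(node, {"type": "tick"}))
```

In the real engine these conditions can also be combined on a single node; the sketch keeps them separate for clarity.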

Node settings are defined the moment the rule is configured:
[Image: node settings]

In this example, we decided to invoke the sensor only when data arrives for testresource, and we have also configured an eviction time of 10 seconds. This determines how long each sensor's information remains valid. That is also an elegant way of merging different event streams in which information is valid only for a short period of time, an important aspect to take into account when making decisions, as explained here.
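Eviction can be pictured as a time-to-live on each sensor result. The class below is a sketch under that assumption, not the engine's implementation:

```python
import time

# Illustrative sketch of eviction: a sensor result is only considered
# valid for a limited window (10 seconds in the example above).
class SensorResult:
    def __init__(self, state: str, eviction_seconds: float):
        self.state = state
        self.expires_at = time.monotonic() + eviction_seconds

    def is_valid(self) -> bool:
        # inference should only use results that have not been evicted
        return time.monotonic() < self.expires_at

r = SensorResult("Above", eviction_seconds=10)
print(r.is_valid())   # valid immediately after creation
```

When merging event streams, gates would only fire while all contributing results are still within their eviction windows.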

Information flow

Sensors can use the following as input arguments:

  • Input settings (e.g. city, database record etc.)
  • Task context (result of any other sensor)
  • Runtime data (which triggered sensor execution)
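The three input sources can be combined as in the sketch below; the helper name and dictionary layout are assumptions made for illustration.

```python
# Illustrative only: a sensor's effective input merges its static
# settings, the task context, and the runtime payload that triggered it.
def build_sensor_input(settings: dict, task_context: dict, runtime_data: dict) -> dict:
    return {
        "settings": settings,      # e.g. {"city": "Ghent"}
        "context": task_context,   # results of other sensors in the task
        "payload": runtime_data,   # the event that triggered execution
    }

args = build_sensor_input(
    settings={"city": "Ghent"},
    task_context={"tempSensor": {"state": "Above"}},
    runtime_data={"temperature": 28.5},
)
print(args["settings"]["city"])
```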

Decisions

Decisions are modelled by attaching one or more actuators to a sensor state (or state transition), or to a combination of multiple nodes/states.
For more information, please check this blog post, which shows how powerful this rule expression is compared to decision trees.
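A toy version of such a decision binding, with invented node, state and actuator names, might look like this:

```python
# Hypothetical sketch: actuators bound to (node, state) pairs, plus an
# AND gate combining the states of two nodes. Names are illustrative.
bindings = {
    ("tempSensor", "Above"): ["send_sms"],
    ("tempSensor", "Below"): [],
}

def actuators_for(node: str, state: str) -> list:
    """Which actuators fire when this node reaches this state?"""
    return bindings.get((node, state), [])

def and_gate(states: dict, node_a: str, node_b: str, expected: str) -> bool:
    # fire only when both nodes are in the expected state
    return states.get(node_a) == expected and states.get(node_b) == expected

print(actuators_for("tempSensor", "Above"))
print(and_gate({"s1": "Above", "s2": "Above"}, "s1", "s2", "Above"))
```

Because actuators hang off states and gates rather than branches of a tree, the same sensor result can drive several independent decisions at once.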

To find out more about the Waylay engine and its internals, go to our documentation page.

Discover more about the Waylay rules engine in our solution white paper: The Waylay Rules Engine: Advanced Automation for the Internet of Things.

Veselin Pizurica

Co-founder and CTO @Waylay, R&D, background in IoT/M2M, Cloud Computing, Semantic Web, Artificial Intelligence, Signal and Image Processing, Pattern Recognition, author of 12 patent applications.



