In this section we will explain the concept of an IoT Catalyst scope, and we will also give you a rule of thumb for deciding which scope to invoke in which situation.
An IoT Catalyst scope is a necessary part of any interaction with the IoT Catalyst Domain, because the choice of scope determines the type of message broker the Navigator interacts with. The Navigator offers two possible scopes: a Domain scope and a Local scope. A Domain scope uses an MQTT message broker, whereas a Local scope uses a Redis message broker. So, for example, a statement like the following tells the Navigator that you want to use a Local scope:

my_scope = navigator.local

Conversely, you may want to use a global scope that spans the entire IoT Catalyst Domain. You can do that in this way:

my_scope = navigator.domain

There is another, more implicit way to tell the Navigator you want to use a Domain scope: invoking a particular IoT Catalyst Hypervisor, as in the following example:

my_hypervisor = navigator.PopCornMachine

A hypervisor invocation implicitly sets a Domain scope for you, so you can keep invoking other hypervisors throughout your script or use the

navigator.domain

syntax later in your code.
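
For instance, you can mix the two forms in the same script, since both operate on the Domain scope (the PopCornMachine hypervisor was introduced above; the Internal_Temperature Container is used as an example later in this section):

my_hypervisor = navigator.PopCornMachine              # implicitly selects the Domain scope
my_container = navigator.domain.Internal_Temperature  # still the Domain scope, so this is fine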

The important takeaway here is that once you invoke a particular IoT Catalyst scope, you have to keep using it throughout your program, since changing scope during navigation is not possible. So the obvious follow-up is to understand when it is appropriate to use a Domain scope and when a Local scope is the better choice.

The Domain scope is the most powerful way of interacting, because it gives you full access to any Digital Twin anywhere in your IoT Catalyst Domain through Containers and Features.

For example, you can either get access to an IoT Catalyst Container directly, like this:

my_container = navigator.domain.Internal_Temperature

or go through its IoT Catalyst Hypervisor, as in the following snippet:

my_container = navigator.PopCornMachine.Internal_Temperature

It follows that, for maximum versatility, you should seriously consider using the Domain scope.

But the Domain scope also comes with a limitation: if the IoT Edge where the Navigator code is running goes offline for any reason (the connection with the IoT Catalyst Server is lost), it will be impossible to update your Features, to call Functions and Actions, and more generally to interact with the message broker.

On the other side, the Local scope is always available, but it gives you access exclusively to the Digital Twins running "locally", i.e., on the same IoT Edge node where your Navigator script is running. In this case you would continue to have access to the local Features, Functions and Actions even if the IoT Edge went offline. Thus, a Local scope is a good option for implementing closed-loop business rules that must execute in any case: "if you press this button, then switch on the light", where the assumption is that the light and the button are connected to the same IoT Edge.
Besides, you may want to use a Local scope when you value a faster interaction with the message broker; keep in mind, however, that this speed difference only becomes noticeable in rather high-frequency interactions.
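
As a minimal sketch of such a closed-loop rule, using the on_event_start callback described later in this section (the ButtonPressed Event Feature, the CeilingLight Container and its LightON Action are hypothetical names):

# Hypothetical Event Feature and Container, both running on this IoT Edge.
button_pressed = navigator.local.ButtonPressed
light = navigator.local.CeilingLight

# Switch the light on whenever the button-press event starts,
# even if the connection with the IoT Catalyst Server is lost.
button_pressed.on_event_start(light.LightON)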

Getting access to an IoT Catalyst Container through a Local scope is not remarkably different from navigating with the Domain scope:

my_container = navigator.local.Internal_Temperature

So, the takeaways of this section are:

 a) in many cases, navigating with the Domain scope is the best choice for optimal flexibility;
 b) when you don't need access to IoT Catalyst Containers running on other nodes, when you want to execute closed-loop business rules even while offline, or when you really value speed, then navigating with a Local scope could be your best option.

By the way, you have already come across references to our Pop Corn Machine in this tutorial.
You can click here to find out more about it.

When you invoke an IoT Catalyst Feature through the Navigator, for example by typing:

my_feature = navigator.domain.Internal_Temperature.Temperature

the Navigator subscribes to the relevant broker messages, so that it can start tracking and storing some information about that particular Feature ("Temperature", in this case).

The kind of information the Navigator stores depends on the nature of the Feature, but in general, for every Feature you invoke, it starts tracking the following:

 a) the "current" value of the Feature, we will talk more about it soon
 b) the timestamp of the first message the Navigator received from the message broker after the Feature has been invoked
 c) the timestamp of the last message it received from the broker
 d) the total number of messages regarding Feature's current value ("updates") received since Feature's invocation
 e) the "last known value" of the Feature

You can retrieve such information by typing, respectively:

my_feature.value
my_feature.first_update
my_feature.last_update
my_feature.updates
my_feature.last_known_value
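
For instance, a quick status report on the Feature (the attribute names are the ones listed above; the layout is just illustrative):

print(f"current value:    {my_feature.value}")
print(f"last known value: {my_feature.last_known_value}")
print(f"updates received: {my_feature.updates}")
print(f"first update:     {my_feature.first_update}")
print(f"last update:      {my_feature.last_update}")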

Later we'll learn about the additional information tracked for numeric Data Features.

Here it is worth spending some time on the concepts of "current value" and "last known value", since they are intimately tied, and on the closely related concept of a Feature's "TTL". Every time a new Feature is invoked, the Navigator listens for new values coming from the message broker (the messages arrive with a frequency determined by the Container's "polling time data" or "polling time event" setting, depending on the kind of Feature), so that you can easily assign the "current" value to a variable in this way:

current_value = my_feature.value

Every time a message with a new value of a Feature arrives at the Navigator, the latter sets the "last known value" to coincide with the "current value", so under "normal circumstances" these two values coincide.

Now let's suppose that our "Internal_Temperature" Container has a "polling time data" of 5 seconds. We know then that the numeric Features of this Container will release a new data point every 5 seconds. What if, for some reason, a new data point doesn't get produced for 10 minutes? Is it acceptable to consider a value from that long ago still relevant? Maybe not. That's why you may want to set a specific TTL (time to live) for your Feature, so that when you query the Feature's value and you get None instead of an actual numeric value, you immediately realize that something went wrong with the periodic emission of data points every 5 seconds. So let's say you deem 20 seconds a time span long enough to consider your Feature's value no longer up-to-date; you can then pass that estimate to the Navigator by typing:

my_feature.ttl = 20

After some time, you query your Feature's value:

print(my_feature.value)
None

and you get None in return: you immediately realize the Temperature's value is outdated. Still, you would like to know what the last value of the Temperature was before it stopped producing data points.
You can get that by typing:

print(my_feature.last_known_value)
40

So here you can appreciate the importance of keeping the concepts of "value" and "last_known_value" distinct: they coincide as long as data points are produced more frequently than the Feature's TTL, and they diverge as soon as updates stop arriving within it.
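
A common pattern, then, is to fall back on the last known value when the current one has expired (handle_stale_reading is a hypothetical placeholder for whatever your application should do in that case):

value = my_feature.value
if value is None:
    # The TTL has expired: the reading is stale, fall back on the last known value.
    value = my_feature.last_known_value
    handle_stale_reading(value)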

You can even set a "global" TTL, which will be applied both to the Features you have already invoked and to the ones you'll invoke later in your script:

navigator.set_ttl(20)

It is still possible to set a different TTL for a specific Feature, overriding the global TTL:

my_feature.ttl = 10

If you don't set any TTL, the Navigator will assume you find it acceptable to consider the last value it received from your Feature as the "current value", however old it is.

In addition to the information the IoT Catalyst Navigator starts tracking as soon as you invoke a Data or Event Feature, numeric Data Features naturally lend themselves to numeric statistics. And that is what the Navigator provides, so you can access those stats fairly easily:

my_feature.average
my_feature.min
my_feature.max

give you access to, respectively, the average, minimum and maximum values of the numeric Data Feature since the time you first invoked it.
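
For example, a one-line summary of the Temperature readings collected so far (the formatting is just illustrative):

print(f"Temperature so far: min={my_feature.min}, max={my_feature.max}, average={my_feature.average}")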

Now that you know how to retrieve a Feature's information, let's imagine you want to be notified every time the Temperature of the Pop Corn Machine goes beyond two arbitrary limits you set (say 0 as the lower limit and 40 as the upper). You could of course keep retrieving the value of the Feature every few seconds and do something when those limits are exceeded:

import time

while True:
    current_value = my_feature.value
    # value could be None if a TTL is set and the reading has gone stale
    if current_value is not None and (current_value < 0 or current_value > 40):
        do_something()
    time.sleep(5)

That would work, but it is a bit cumbersome, and besides you would ideally have to adjust the sleep time according to the "polling time data" setting of the "Internal_Temperature" Container, so as not to poll your Feature more often than necessary.

The Navigator has a far more convenient way to deal with such scenarios:

my_feature.on_limit_overflow(do_something, 0, 40)

and that's it! Notice that the Navigator will inject the current value into your function "do_something", so you should have already defined your function with a signature like the following one:

do_something(<current value injected>, *args, **kwargs)

(this is a generic signature; you don't need to use *args and **kwargs, you can use plain positional and named arguments).

So you can of course name the first argument as you wish, but you may want to give it a name that reminds you it holds the current value (of the Temperature Feature, in our example), such as "current_value".
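
As a sketch of what such a callback could look like (the function body is purely illustrative):

def do_something(current_value):
    # current_value is injected by the Navigator when a limit is crossed
    print(f"Temperature out of range: {current_value}")

my_feature.on_limit_overflow(do_something, 0, 40)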

You can add additional parameters to "do_something" very easily:

my_feature.on_limit_overflow(do_something, 0, 40, my_parameter, another_parameter=100)

where, after the usual first three arguments, namely the callback function, the lower limit, and the upper limit, you can pass additional arguments and/or named arguments directly to the method.

Similarly, you can decide to call the method "on_percentage_overflow" on a Feature:

my_feature.on_percentage_overflow(do_something, 20, 100)

The previous line will tell the Navigator that you want to execute "do_something" should your Feature have a percentage increase of 20% or a percentage decrease of 100% between its current value and the previous one.

Notice that for both "on_limit_overflow" and "on_percentage_overflow" you are not bound to set either of the two limit values, while you are required to pass a valid callable to the methods. Should you not pass one of the two limit values, the Navigator will simply ignore that limit.

You can also pass an IoT Catalyst Function, Action or Container control command (the latter is described in a separate section) to one of these methods. In those cases the current value of the Feature won't be injected, but you can still pass arguments and named arguments, as seen before for the generic function "do_something". For example:

my_other_container = navigator.domain.PopCornControl
my_feature.on_limit_overflow(my_other_container.RedLightON, 0, 40, "0", "0", "0")

So in this case, should the Temperature go beyond the set limits, the Red Light of the Pop Corn Machine will light up (we don't discuss the three "0" parameters passed to the method here, since they are not relevant to our discussion of the Navigator's callback mechanisms).

In case you want to attach your callback to a hypothetical string Data Feature such as:

my_string_feature = navigator.domain.PopCornStatus.MaintenanceLevel

you can simply leverage the "on_change" method:

my_string_feature.on_change(my_other_container.GreenLightON, "0", "0", "0")

This will light up the green light should the string Data Feature "MaintenanceLevel" change.

In case you want to attach a callback to an Event Feature such as:

my_event = navigator.domain.LeftDoorOpened

you can choose to use one or both of the following two methods:

my_event.on_event_start(my_other_container.RedLightON, "0", "0", "0")
my_event.on_event_end(my_other_container.GreenLightON, "0", "0", "0")

so that when the left door of the Pop Corn Machine is opened, the red light will light up, while the green one will light up when the left door is closed.

The IoT Catalyst Navigator can also tell you the status of an IoT Catalyst Container, as well as let you perform some actions on it.

You can get the current status by typing for example:

print(my_container.ctrl.status)
started

"started" would be the returned value should your container be running.

Similarly, you can call the following methods on an IoT Catalyst Container:

my_container.ctrl.start()
my_container.ctrl.stop()
my_container.ctrl.restart()
my_container.ctrl.remount()

which would perform the respective actions on the IoT Catalyst Container.
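
For instance, a small sketch that restarts the Container whenever it does not report the "started" status (whether your application should do this automatically is of course up to you):

# Restart the Container if it is not currently running.
if my_container.ctrl.status != "started":
    my_container.ctrl.restart()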