Determinism and Jitter in a Real-Time System

    Last Modified: April 18, 2016

    When programmed appropriately, a real-time system can guarantee that tasks consistently execute within a specified time constraint. Determinism describes how consistently a system executes tasks within that constraint.

    A perfectly deterministic system would experience no variation in timing for tasks. However, in the real world, even highly deterministic systems experience slight variations in timing. The variation between the expected timing and the actual timing for a task is known as jitter. Jitter occurs whether the actual timing is later or earlier than the expected timing. The following illustration shows the jitter for multiple executions of one task.
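    As a rough illustration of how jitter can be measured, the following sketch runs a periodic loop and records the deviation between each iteration's actual and expected wake-up time. The function name and parameters are hypothetical, and on a general-purpose operating system the measured jitter will almost always be late (positive) rather than early, since `time.sleep` sleeps for at least the requested interval; a real-time OS is needed for the tight bounds this article describes.

    ```python
    import time

    def measure_jitter(period_s, iterations):
        """Run a periodic loop and return the jitter (actual minus
        expected wake-up time, in seconds) for each iteration."""
        start = time.perf_counter()
        jitter = []
        for i in range(1, iterations + 1):
            expected = start + i * period_s
            # Sleep until the next expected release point.
            delay = expected - time.perf_counter()
            if delay > 0:
                time.sleep(delay)
            actual = time.perf_counter()
            # Positive values: the task woke later than expected.
            # Negative values: the task woke earlier than expected.
            jitter.append(actual - expected)
        return jitter

    samples = measure_jitter(period_s=0.01, iterations=20)
    print(f"max late: {max(samples):.6f} s, max early: {min(samples):.6f} s")
    ```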

    1. When actual timing is later than expected, the task can fail to finish executing within the timing requirements of an application.
    2. When actual timing is earlier than expected, the task can get out of sync with other tasks in the application.

    The amount of jitter a task can tolerate while still meeting its timing requirements defines the acceptable jitter for that task. In the following illustration, the mean of each normal distribution corresponds to the expected timing for the task, and the range of actual execution times shown along the x-axis is the jitter for the task. When the jitter falls within the range of acceptable execution times, the task is sufficiently deterministic.
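    The acceptance test above can be sketched as a simple comparison of measured jitter against the acceptable bound. The function name and the ±bound interpretation are assumptions for illustration; the jitter samples are fabricated example values, not measurements.

    ```python
    def is_sufficiently_deterministic(jitter_samples, acceptable_jitter_s):
        """Return True when every measured jitter value (actual minus
        expected timing, in seconds) falls within the acceptable range
        on either side of the expected timing."""
        return all(abs(j) <= acceptable_jitter_s for j in jitter_samples)

    # Example deviations from expected timing, in seconds (both late and early).
    measured = [0.0002, -0.0001, 0.0004, -0.0003]
    print(is_sufficiently_deterministic(measured, acceptable_jitter_s=0.0005))  # True
    print(is_sufficiently_deterministic(measured, acceptable_jitter_s=0.0002))  # False
    ```

    A task that passes this check meets its timing requirements despite jitter; one that fails needs either a looser requirement or a more deterministic platform.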
