Store

The Store tab controls where measurements are persisted and how they are stored after data collection and processing.

Depending on the use case, data can be ingested into a cloud data store, into a local historian running either directly on the edge device or elsewhere on the same local network, or into both.

Edge devices use store-and-forward buffering to preserve data during connectivity outages and automatically backfill when links are restored.
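The store-and-forward pattern can be sketched as follows. This is a minimal illustration of the buffering idea, not the platform's actual implementation; the class and callback names are invented for the example.

```python
import collections

class StoreAndForwardBuffer:
    """Illustrative sketch of edge store-and-forward buffering:
    measurements are buffered locally and drained to the uplink
    in order once connectivity is restored."""

    def __init__(self, send, capacity=10000):
        # `send` is a hypothetical callable that uploads one measurement
        # and raises ConnectionError while the link is down.
        self.send = send
        # Bounded buffer: when full, the oldest samples are dropped.
        self.buffer = collections.deque(maxlen=capacity)

    def record(self, measurement):
        # Buffer first, so nothing is lost if the uplink fails mid-send.
        self.buffer.append(measurement)
        self.flush()

    def flush(self):
        # Backfill: drain buffered samples in arrival order.
        while self.buffer:
            try:
                self.send(self.buffer[0])
            except ConnectionError:
                return  # link still down; keep data, retry on next flush
            self.buffer.popleft()
```

Because `flush()` only discards a sample after a successful send, an outage in the middle of a drain leaves the remaining samples intact for the next attempt.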

Cloud Ingestion

Cloud ingestion is the most common way to persist data centrally.

Why send data to Cloud?

  • Single source of truth: Consolidate data from all sites so dashboards, reports, and APIs use one consistent dataset.

  • Cross-site visibility: Compare performance across lines/plants, roll up KPIs, and benchmark assets globally.

  • Advanced analytics & ML: Leverage cloud-scale BI, feature stores, and model training not practical at the edge.

  • Long-term retention: Tier to cost-effective storage for years while keeping recent data hot.

  • Reliability: Benefit from managed durability, backups, and geo-redundancy.

  • Integration & automation: Feed ERP/MES/CMMS, trigger alerts/workflows, and expose data through secure APIs.

  • Elastic scale: Handle bursts, add devices, or change resolution without re-architecting edge systems.

  • Governance & compliance: Centralized access control, audit trails, and policy enforcement across the enterprise.


Local Historian Ingestion

The platform can persist data either directly on the edge device or on an on-premises server.

Why use a local historian?

  • Low-latency access: Instant trending, diagnostics, and KPIs without Cloud round trips.

  • Resilience offline: Keeps performing analytics during internet outages; backfills to cloud when connectivity returns.

  • Bandwidth & cost control: Reduces continuous uplink traffic by serving local analytics and batching/cloud-offloading on your terms.

  • Deterministic performance: Polling, writes, and queries stay close to the equipment.

  • Data sovereignty & compliance: Retain sensitive time-series data on-prem as required.

  • Operational decoupling: Local apps (HMI, MES, reporting) can read from the historian even as cloud services evolve or are temporarily unavailable.

  • Flexible targeting: Per-tag selection of historian instances lets you split workloads (e.g., critical signals to a fast store, bulk data to a cost-efficient store).
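Per-tag targeting from the last point above can be pictured as a simple routing table. The tag paths and store labels below are invented for illustration; they are not the platform's actual configuration schema.

```python
# Hypothetical per-tag routing: map tag paths to the historian
# instance (or cloud store) that should persist them.
TAG_ROUTES = {
    "Line1/Press/Vibration":  "fast-local-historian",  # critical signal
    "Line1/Press/CycleCount": "bulk-local-historian",  # high-volume data
    "Line1/Ambient/Temp":     "cloud",                 # long-term trending
}

def route(tag_path, default="cloud"):
    """Return the target store for a tag; unmapped tags use the default."""
    return TAG_ROUTES.get(tag_path, default)
```

Splitting workloads this way lets latency-sensitive signals go to a fast store while bulk data lands in a cheaper one, without changing how the tags themselves are collected.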


Properties (Metadata)

Use Properties to attach key-value metadata to Areas, Assets, and Tags—such as Unit of Measure, equipment identifiers, or business context. After a successful hierarchy deployment, these properties are automatically synchronized to the time-series database, so your stored measurements carry consistent context for search, filtering, and analytics.

Example properties for a Tag connected to a temperature sensor.

Where to add: For Tags, use the Store tab; for Areas/Assets, use their Info tabs. All metadata is replicated with the deployed hierarchy version, preserving context for analysis.
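As a sketch of how such metadata can be used once synchronized, consider a key-value property set for a temperature Tag and a small filter helper. The property names and values are hypothetical examples, not a fixed schema.

```python
# Hypothetical properties attached to a Tag connected to a temperature sensor.
tag_properties = {
    "UnitOfMeasure": "°C",
    "EquipmentId": "PUMP-0042",
    "Site": "Plant-North",
    "Criticality": "high",
}

def matches(properties, **filters):
    """True when every requested key-value pair is present in `properties`.
    This is how downstream search/filtering over synchronized metadata
    typically works conceptually."""
    return all(properties.get(key) == value for key, value in filters.items())
```

With consistent properties in the time-series database, a query like "all high-criticality tags at Plant-North" becomes a straightforward metadata filter instead of a naming convention.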

