Normalizing Incoming Data Automatically
- Updated 2025-04-25
- 2 minute(s) read
Create a routine to automatically convert incoming data files into a data table.
- In the SystemLink web application, navigate to .
- Click Python 3 (ipykernel).
- Use the linked example as a template to create ETL pipelines.
The example includes steps to complete the following actions.
- Extract the file contents.
- Transform the data into a data table format.
- Load the data into storage with the DataFrame Service API.
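The three ETL steps above might be sketched like this in a notebook cell. The CSV input, the column handling, and the exact JSON "frame" shape for appended rows are assumptions to verify against your server's DataFrame Service API reference.

```python
# Minimal ETL sketch, assuming a CSV input file and the DataFrame
# Service's JSON body for appended rows; verify the field names
# against your server's /nidataframe API reference.
import pandas as pd

def extract(source):
    # Extract: read the raw file contents into a DataFrame.
    return pd.read_csv(source)

def transform(df):
    # Transform: drop fully empty rows and add a unique INDEX column.
    df = df.dropna(how="all").reset_index(drop=True)
    df.insert(0, "row_id", df.index.astype("int64"))
    return df

def load_payload(df, end_of_data=True):
    # Load: shape the rows as a JSON body for
    # POST /nidataframe/v1/tables/{id}/data (values sent as strings).
    return {
        "frame": {
            "columns": list(df.columns),
            "data": [[str(v) for v in row] for row in df.itertuples(index=False)],
        },
        "endOfData": end_of_data,
    }
```

The transform step is where file-specific normalization belongs; the row_id column here is only one way to satisfy the INDEX-column requirement described below.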
Note: When you create a data table, you must do the following.
- Define the column type.
- Define the data type for each column.
- Define the total number of columns. The total number of columns cannot exceed 2500.
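As an illustration, a table definition that satisfies these rules might look like the following. The field names (name, dataType, columnType) and the FLOAT64 and STRING data types are assumptions about the DataFrame Service schema; check them against your server's API reference.

```python
# Hypothetical request body for creating a data table
# (POST /nidataframe/v1/tables); field names are assumptions.
def make_table_request(name, columns):
    # Enforce the rules from the note above.
    assert len(columns) <= 2500, "a table cannot exceed 2500 columns"
    index_columns = [c for c in columns if c["columnType"] == "INDEX"]
    assert len(index_columns) == 1, "each table needs one INDEX column"
    assert index_columns[0]["dataType"] in {"INT32", "INT64", "TIMESTAMP"}
    return {"name": name, "columns": columns}

table_request = make_table_request(
    "temperature_log",
    [
        {"name": "row_id", "dataType": "INT64", "columnType": "INDEX"},
        {"name": "temp_c", "dataType": "FLOAT64", "columnType": "NULLABLE"},
        {"name": "status", "dataType": "STRING", "columnType": "NORMAL"},
    ],
)
```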
| Column Type | Description |
|---|---|
| NORMAL | The column has no special properties. This is the default behavior. |
| INDEX | The column provides a unique value per row. Each table must provide one INDEX column with one of the following data types: INT32, INT64, or TIMESTAMP. |
| NULLABLE | The column allows rows to contain null values. You can exclude NULLABLE columns when appending rows. Appended rows will use null values for that column. |

Note: After you finish appending data to a table, free up the resources associated with the table. Use the route POST /nidataframe/v1/tables/{id}/data, where {id} is the table ID, and set endOfData to true in the JSON body of the request.
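The cleanup call described in the note can be sketched as follows. Only the route and the endOfData flag come from the documentation above; the server URL and the x-ni-api-key header are placeholder assumptions for your deployment.

```python
# Build the final append request that frees the table's resources.
# The route and endOfData flag are documented; the server URL and
# x-ni-api-key header name are placeholder assumptions.
import json

def end_of_data_request(server_url, table_id, api_key):
    return {
        "method": "POST",
        "url": f"{server_url}/nidataframe/v1/tables/{table_id}/data",
        "headers": {"x-ni-api-key": api_key, "Content-Type": "application/json"},
        "body": json.dumps({"endOfData": True}),
    }

request = end_of_data_request("https://my-server", "table-123", "my-key")
```

Sending the request itself (for example, with the requests library) is deployment-specific and omitted here.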
- Create a routine to execute your notebook, specifying File uploaded as the event. Refer to Automating Actions with Routines to learn how to create a routine.
When files are created, your notebook will execute to convert the file data into a data table.
| Download source | Steps |
|---|---|
| From a dashboard | Click the top of the data table panel in your dashboard and select Inspect. Click Download CSV. Note: Data tables downloaded from a dashboard are decimated based on the query settings for the dashboard. |
| From a test result | Select a data table in the Data Tables tab and click Download CSV. Note: Data tables downloaded from a test result are undecimated. |
Related Information
- Normalizing Data for Efficient Storage and Access
Data tables are a read-optimized, columnar data storage format designed to store tables with millions of rows of data.
- Automating Actions with Routines
Create routines to automate an action when an event occurs.
- ETL Example