Transform the JSON into a Table for easier manipulation
You need to do quite a bit of filtering and transformation to get the desired result. As you have learned, a good strategy for tackling larger problems is to solve the problem in small steps.
You decide to scope the first step so that the result is the raw JSON data transformed into a Table (a data structure provided by the RPA.Tables library) that is then easy to filter, sort, group, etc.:
You add a new Load traffic data as table keyword that returns the JSON data in a Table format.
The Load JSON from file keyword, provided by the RPA.JSON library, reads the file contents and returns them as parsed JSON.
The Create Table keyword from the RPA.Tables library converts that JSON data into a Table.
Tables can be created from lists, for example. Looking at the raw JSON content, you see that the value property points to a list of all the things you will need.
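Outside Robot Framework, the same steps can be sketched in plain Python using only the standard library. The function name and path handling here are illustrative, and RPA.Tables' Create Table does more (it builds a proper Table object with named columns), but the core logic is the same:

```python
import json

def load_traffic_data_as_table(path):
    """Parse the JSON file and return the list of records
    found under its "value" property (illustrative sketch)."""
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    # The "value" property points to the list of measurement records;
    # RPA.Tables' Create Table would turn such a list into a Table.
    return data["value"]
```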
You run the robot and view the log. Your robot has transformed the raw JSON data into a Table with lots of rows:
Created table: Table(columns=['Id', 'IndicatorCode', 'SpatialDimType', 'SpatialDim', 'TimeDimType', 'TimeDim', 'Dim1Type', 'Dim1', 'Dim2Type', 'Dim2', 'Dim3Type', 'Dim3', 'DataSourceDimType', 'DataSourceDim', 'Value', 'NumericValue', 'Low', 'High', 'Comments', 'Date', 'TimeDimensionValue', 'TimeDimensionBegin', 'TimeDimensionEnd'], rows=11640)
The table entry in the log is nice and concise but lacks the actual data.
You call the super-handy Write table to CSV keyword:
After rerunning the robot, you see a test.csv file at the root. Clicking on it opens the raw CSV file. You click on the Open Preview icon at the top-right, and a preview of the data is displayed as a table. What a nice way to manually inspect the intermediate results!
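Under the hood, writing a table of records to CSV amounts to emitting a header row followed by one row per record. A minimal stdlib sketch (the function name mirrors the keyword but is an assumption, not the library's implementation):

```python
import csv

def write_table_to_csv(rows, path):
    """Write a list of dicts to a CSV file with a header row,
    mimicking what a table-to-CSV export does."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```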
Always in search of tidy code, you notice that you have broken the Don't Repeat Yourself (DRY) principle by duplicating the path to the traffic.json file.
It's not a massive deal at the moment since you have so little code, but it still requires future maintainers to find and change the path in two places if it ever needs to change.
A little repetition tends to grow into a lot of small repetition, making your robot slower to maintain or even causing subtle bugs if someone forgets to update all instances of the duplicated value.
You decide to store the path of the traffic.json file in a variable and then refer to that variable instead. This way, you remove the duplication:
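The same refactoring, sketched in plain Python: the path is defined once as a constant, and every step refers to it. The constant name, the helper functions, and the output/ directory are all illustrative assumptions, not the course's actual code:

```python
import json
from pathlib import Path

# Define the path once; every function refers to this single constant,
# so a future change only has to happen in one place.
TRAFFIC_JSON_FILE_PATH = Path("output/traffic.json")  # illustrative path

def save_traffic_data(data):
    """Save the downloaded JSON payload to the shared path."""
    TRAFFIC_JSON_FILE_PATH.parent.mkdir(parents=True, exist_ok=True)
    TRAFFIC_JSON_FILE_PATH.write_text(json.dumps(data), encoding="utf-8")

def load_traffic_data_as_table():
    """Load the JSON from the same shared path and return its "value" list."""
    data = json.loads(TRAFFIC_JSON_FILE_PATH.read_text(encoding="utf-8"))
    return data["value"]
```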