Most web systems and APIs now send responses in the JSON data format. The format itself is easy enough to read and write, but perhaps you’d rather work with CSVs? Converting JSON to CSV using Python or other techniques can get daunting fairly quickly.
There are two ways to solve this problem: write the conversion yourself in Python, or skip the code entirely.
In this article, we’ll walk you through both ways to convert JSON to CSV: using Python code as well as the #NoCode way.
Python is extremely handy for developers, data analysts, and several other roles. It can be used for everything from automating #DataConversion tasks (like the one we’re going to talk about) to performing complex scientific calculations. Truly wonderful!
However, it does require you to code, and coding takes both experience and time, which you might not have to spare.
For instance, if you’d like to convert JSON to CSV in Python, you can use the Pandas library. It provides [.giga-code-snippet]read_json[.giga-code-snippet] and [.giga-code-snippet]to_csv[.giga-code-snippet], which together can read JSON and convert it to CSV in a few seconds.
Here’s sample code using the aforementioned functions for converting a very basic JSON structure to a CSV sheet:
[.giga-code-snippet]import pandas as pd

# Read the JSON file into a DataFrame, then serialize it to CSV text
jsonBody = pd.read_json("sample.json")
csvBody = jsonBody.to_csv()[.giga-code-snippet]
The csvBody variable contains the CSV data which we can now print out. Here’s the console output containing both the JSON and CSV bodies:
How simple is that? The only catch: the JSON in the sample.json file is a list containing a single object with no nested objects, i.e., a flat structure, which is rarely the case with production systems. Here’s the content of the file:
[.giga-code-snippet][{
"A": 1,
"B": 2,
"C": "ABC", }][.giga-code-snippet]
Now, what happens when you’d like to convert a complex JSON to a CSV? This is where you’ll find yourself debugging issues with your data, normalizing or flattening objects, and more, all of which eats up time.
Python’s Pandas does have a function to normalize semi-structured JSON into a flattened object (no nesting, which makes CSV conversion easier).
I’ve put together a test.json file containing a few more objects with nesting, thereby creating a slightly more complex JSON structure:
[.giga-code-snippet][{
"a": 1,
"b": {
"c": "OK",
"d": 0
},
"e": {
"f": {
"g": "NOT OK"
}
}
},
{
"a": 1,
"b": {
"c": "OK",
"d": 0
},
"e": {
"f": {
"g": "NOT OK"
}
}
},
{
"a": 1,
"b": {
"c": "OK",
"d": 0
},
"e": {
"f": {
"g": "NOT OK"
}
}
}][.giga-code-snippet]
Here’s sample code for using the json_normalize function:
[.giga-code-snippet]import pandas as pd
import json

# Load the raw JSON, flatten the nested objects, then convert to CSV
with open('test.json', encoding='utf-8') as f:
    data = json.loads(f.read())

jsonBody = pd.json_normalize(data)
csvBody = jsonBody.to_csv()[.giga-code-snippet]
The snippet opens the file and reads its contents with the json.loads function. It then creates a normalized (flattened) DataFrame using the json_normalize function from the Pandas library.
Here’s the output from the console after we print the jsonBody and csvBody variables:
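If you run the snippet yourself, csvBody should look roughly like this; json_normalize joins nested keys with dots by default, and to_csv again prepends the index column:
[.giga-code-snippet],a,b.c,b.d,e.f.g
0,1,OK,0,NOT OK
1,1,OK,0,NOT OK
2,1,OK,0,NOT OK[.giga-code-snippet]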
Again, these examples are fairly simple JSON objects. If you’re dealing with large objects with multiple nested fields, you might run into several problems during Python’s JSON to CSV conversion. Nothing you can’t solve, but why go through the struggle and spend hours debugging?
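To give you a flavor of that struggle: if your records contain nested lists, json_normalize won’t flatten them on its own, and you’ll need its record_path and meta arguments. Here’s a minimal sketch with hypothetical field names, just to show the extra work involved:
[.giga-code-snippet]import pandas as pd

# Hypothetical records where each object holds a nested list of "orders"
data = [
    {"id": 1, "customer": {"name": "Alice"}, "orders": [{"sku": "X1", "qty": 2}, {"sku": "X2", "qty": 1}]},
    {"id": 2, "customer": {"name": "Bob"}, "orders": [{"sku": "Y1", "qty": 5}]}
]

# record_path expands each nested order into its own row;
# meta carries the parent fields along with every expanded row
flat = pd.json_normalize(data, record_path="orders", meta=["id", ["customer", "name"]])
csvBody = flat.to_csv()[.giga-code-snippet]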
Let’s switch our focus to Gigasheet’s #NoCode parser and see how it converts JSON to CSV without a single line of code!
It’s crazy simple; upload your JSON file to Gigasheet and sit back. Go get a drink or something and that’s really it. The parser will take a minute or so as it processes your data and visualizes it in a web-based spreadsheet where you can operate on your JSON data.
I’ll give you a quick walkthrough.
Gigasheet operates on a variety of file types – CSV, JSON, TSV, PCAP, and so many more. Since we’re working on JSON conversions, let’s upload our JSON file using the Upload button:
Drag and drop your file and wait a few minutes. You can see my test.json file is ready, crunched by the parser and waiting for me to work on it. Double-click and you’ll see your JSON flattened and broken down into multiple cells (columns and rows), resembling a CSV.
Just for a quick comparison: this took me a little under 3 minutes. Converting JSON to CSV with Python took me a little over 20 minutes to figure out and to debug an error or two. And mind you, as this data scales, the problems with Python multiply, and so does the time required to get the job done.
Before I forget: this was just the first part of the goal, i.e., normalizing the JSON and de-nestifying it (if that’s even a term). For the second part, we have to convert this JSON to CSV. To do that, use the Export option under the File menu in the navigation bar:
Click Export and your data will be available as a new archive file (in the Library) to download. Once downloaded, unzip it and the resulting file will be a CSV with the same data structure you saw on the web UI.
Et voila! Using the same strategies, you can convert multiple JSONs to CSVs using Gigasheet (and Python if you still prefer coding).
That’s all for our short guide on converting JSON to CSV with Python and Gigasheet. You’ve seen both solutions in play; which one will you pick for future conversions?
Still not convinced that Gigasheet can solve all your data needs in mere minutes? Go for a test run today! You can sign up for free access to the parser, the web view, and lots of other features.
Interested in seeing other use cases of Gigasheet in data analytics and engineering? Give these articles a read: