If you've ever wanted to work with a large JSON file in Excel, a BI tool like Tableau, or another application that requires a tabular format, you know how painful it can be to convert JSON to CSV. In this blog, we'll show you how you can convert large JSON files to CSV online with Gigasheet. Gigasheet is a no-code, big data workbench that allows analysts to work efficiently with huge datasets – and you can use the free version to convert JSON to CSV.
For the uninitiated, JSON stands for JavaScript Object Notation and is a lightweight data-interchange format. It's essentially a "text format that is completely [programming] language independent but uses conventions that are familiar to programmers." If you're not a programmer, it can be a bit intimidating to work with large files of JSON data. JSON is often used to send data between computers and databases, and most modern programming languages include built-in support for generating and parsing JSON. JSON is an ideal format for larger data sets that have a hierarchical structure, but that same structure makes it difficult to analyze in tools that expect data in rows and columns.
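For example, a small, made-up record might look like this, with values nested inside other values:

{
  "id": 1001,
  "customer": {
    "name": "Ada Lovelace",
    "email": "ada@example.com"
  },
  "items": [
    { "sku": "A-100", "qty": 2 },
    { "sku": "B-200", "qty": 1 }
  ]
}

The nested customer object and the items list don't map neatly onto a single flat row, and that's exactly the gap a JSON-to-CSV conversion has to bridge.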
Good news! With Gigasheet it's as easy as 1, 2, 3:
1. Sign up for a free Gigasheet account.
2. Upload your JSON file.
3. Gigasheet automatically converts it into rows and columns.
That's it! When you're done, simply export the file as a CSV.
Here's how it works. Gigasheet accepts two possible JSON file structures:
1. The entire file is one JSON object.
2. Each line of the file is a separate JSON object (one object per line).
The second structure, with one object per line, is the one we see most often; it's common in exports from NoSQL databases such as MongoDB. In this case, each JSON object becomes a row. Gigasheet handles the varying structure by creating a column for each leaf node of the nested sub-objects and sub-lists within an object. The result is a tabular representation of loosely structured data: common fields line up in the same column across rows, while fields unique to a record simply get their own column.
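To make the flattening concrete, here's a minimal do-it-yourself sketch in Python. It is not Gigasheet's implementation, just an illustration of the same idea: read a one-object-per-line file, flatten every nested object and list down to its leaf nodes, and write one CSV column per leaf. The file names are placeholders.

import csv
import json

def flatten(obj, prefix=""):
    # Recursively flatten nested dicts and lists into dotted leaf-node keys,
    # e.g. {"team": {"id": 7}} becomes {"team.id": 7}.
    flat = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            flat.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            flat.update(flatten(value, f"{prefix}{i}."))
    else:
        flat[prefix.rstrip(".")] = obj
    return flat

# Assumes one JSON object per line (Structure 2 below); "records.json" is a placeholder.
rows = []
with open("records.json") as f:
    for line in f:
        if line.strip():
            rows.append(flatten(json.loads(line)))

# The union of all leaf-node keys becomes the CSV header; fields missing
# from a given record are simply left blank in that row.
columns = sorted({key for row in rows for key in row})
with open("records.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    writer.writerows(rows)

Run against the Structure 2 example below, this would produce columns like name, role, team.id, and last_login, with blanks wherever a record doesn't have the field.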
Example of Structure 1: This file is one giant JSON object.
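A made-up illustration of this structure, where the whole file is a single object wrapping an array of records:

{
  "users": [
    { "name": "Ada", "role": "admin" },
    { "name": "Grace", "role": "analyst" }
  ]
}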
Example of Structure 2: Each line is a JSON object.
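A made-up illustration of this structure, in the newline-delimited (NDJSON) style, where each line stands on its own and the fields can differ from record to record:

{"name": "Ada", "role": "admin", "team": {"id": 7}}
{"name": "Grace", "role": "analyst", "last_login": "2023-01-15"}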
Excel can work with JSON files to some extent, and the process varies depending on which version of Microsoft Excel you have installed. Even then, you'll be limited by the maximum number of rows Excel can support. Since the release of MS Office 2007, that limit has been 1,048,576 rows. Unfortunately, that's not a lot of data in today's world. (If you're looking to open a big CSV file, check out our other post.)
If your flattened JSON data is fewer than a million rows, and depending on your computer's performance, you may be able to import the JSON into Excel. In older versions of Excel (Office 2010 and Office 2013), you'll need the Power Query add-in; open its Advanced Editor, point it at your file, and build a query like this:
Replace C:\Users\my.json with your file name and path:
let
    Source = Json.Document(File.Contents("C:\Users\my.json")),
    #"Converted to Table" = Record.ToTable(Source)
in
    #"Converted to Table"
In newer versions of Excel (Excel 2016 and later, including Microsoft 365) you can get JSON data directly from a file. The exact commands vary slightly by version, but the flow is roughly:
1. On the Data tab, choose Get Data > From File > From JSON.
2. Browse to your JSON file and open it; the Power Query Editor launches.
3. Convert the imported records into a table (the To Table / Into Table command), then expand the columns you want.
4. Choose Close & Load to put the data into a worksheet.
After you have the JSON data open in Excel, you can save the file as a CSV.
We tested numerous online JSON-to-CSV converter websites before building Gigasheet, and they all seemed to have shortcomings. Some were overly complex and required the user to define a schema. That allows for a lot of control, but it can also be daunting if you're unfamiliar with the data, or with JSON in general. Perhaps the most pervasive issue is that most of these tools take JSON data via copy-and-paste or a small file upload. The files we work with are much larger than a clipboard can accommodate, and most of these free sites don't support converting large files of 20 MB or more.
On the surface, Gigasheet is a web-based, billion-cell spreadsheet. Behind the scenes, it's a high-performance analytics platform built specifically for analysts working with big data. Get started now with a free Gigasheet account.