How to Analyze JSON Data in the Browser Without Python
JSON is everywhere. It's the format that APIs speak, the structure that most modern data exports use, and the thing that data analysts keep getting handed with no clear instructions on what to do with it.
If you're not a developer, JSON can feel hostile. It's not a spreadsheet. It's not a table. It's nested, sometimes inconsistently structured, and looking at raw JSON in a text editor feels like trying to read a data center manual.
The standard advice is: "Just load it in Python with pandas." Which is great advice if you know Python. But if you're an analyst, researcher, freelancer, or bootcamp grad who works primarily in SQL and spreadsheets, that advice sends you down a rabbit hole that has nothing to do with the actual question you're trying to answer.
This article is about analyzing JSON data without Python — fast, in your browser, starting right now.
The JSON Problem for Non-Developers
Let's be concrete about what makes JSON hard to work with if you're not a developer.
It's not a flat table. A typical API response looks like this:
```json
{
  "articles": [
    {
      "id": 1,
      "title": "Market Update",
      "source": {
        "name": "Reuters",
        "url": "https://reuters.com"
      },
      "tags": ["finance", "markets"],
      "published_at": "2024-03-15T09:22:00Z"
    }
  ]
}
```
To analyze this in Excel, you'd need to flatten it first — manually mapping each nested field to a column. That's a data engineering task, not a data analysis task.
The data is often huge. A news API might return 10,000 articles. An e-commerce API might return 50,000 order objects. You can't meaningfully read this by hand.
Nested arrays are everywhere. The tags field above is an array. To count how many articles are tagged "finance," you'd need to unnest that array — a SQL/Python operation, not an Excel one.
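For reference, in a SQL engine with array support this count is a one-liner — a sketch in DuckDB's dialect, assuming the articles have been loaded as a table named `articles`:

```sql
-- Count articles whose tags array contains 'finance'
-- (list_contains is DuckDB's array-membership function)
SELECT COUNT(*) AS finance_articles
FROM articles
WHERE list_contains(tags, 'finance');
```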
Python is the standard answer — but it's not the only one. Yes, json.loads() and pd.json_normalize() work. But if you're not already comfortable with Python, adding that to your workflow adds hours of overhead and debugging for what should be a simple analytical question.
What You Actually Need to Analyze JSON
Strip away the technology and here's the task:
- See the structure: What fields exist? What are the data types? Are there nested objects or arrays?
- Flatten it: Turn the nested structure into a table with rows and columns
- Query it: Filter, group, aggregate, sort
- Get answers: "Top 10 sources by article count," "average sentiment by category," "how many articles mention X keyword"
- Export: CSV for further use or reporting
Notice that none of those steps requires Python. They require a tool that can read JSON and make it queryable. That tool is Harbinger Explorer.
How Harbinger Explorer Handles JSON
Harbinger Explorer uses DuckDB WASM — a full SQL analytics engine — running inside your browser. DuckDB has excellent native support for JSON: it can automatically parse nested structures, flatten arrays, infer schemas, and make everything immediately queryable.
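As a sketch of what this looks like at the SQL level (DuckDB syntax; the file name here is hypothetical):

```sql
-- DuckDB infers the schema and flattens nested objects automatically
SELECT *
FROM read_json_auto('articles.json')
LIMIT 5;
```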
What this means in practice:
- You paste a JSON URL or load a JSON file
- Harbinger parses the structure automatically
- Nested objects become flattened columns (e.g., source.name becomes source_name)
- Array fields can be unnested with a simple query
- You write SQL — or ask in plain English — and get results
No Python. No pandas. No json_normalize. No debugging.
Step-by-Step: From JSON to Insight
Let's walk through a real workflow.
Scenario
You have access to a news API that returns articles in JSON format. You want to know: which news sources published the most articles about a specific topic in the last 7 days?
Step 1: Open Harbinger Explorer
Go to harbingerexplorer.com. You're in the tool. Nothing to install.
Step 2: Load Your JSON
Either:
- Paste the API endpoint URL directly (Harbinger will fetch and parse it)
- Or upload a downloaded JSON file
Harbinger handles the fetching, decompressing, and parsing automatically.
Step 3: See the Schema
Harbinger displays the detected columns and types. You can immediately see: title (string), source_name (string), published_at (timestamp), tags (array), sentiment_score (float), etc.
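If you prefer to check the inferred schema in SQL, DuckDB's DESCRIBE statement does the same thing — a sketch, with the table name assumed:

```sql
-- List every detected column and its inferred type
DESCRIBE articles;
```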
Step 4: Ask Your Question
Option A — Plain English (AI agent):
"Which 10 news sources published the most articles in the last 7 days?"
The AI translates this to SQL and runs it. You see the results in a clean table.
Option B — SQL:
```sql
SELECT
  source_name,
  COUNT(*) AS article_count,
  AVG(sentiment_score) AS avg_sentiment
FROM articles
WHERE published_at >= CURRENT_DATE - INTERVAL 7 DAYS
GROUP BY source_name
ORDER BY article_count DESC
LIMIT 10;
```
Both approaches give you the same result in under a minute.
Step 5: Export
Download the results as CSV, copy to clipboard, or continue querying.
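Under the hood, an export like this maps to a single DuckDB statement — a sketch, with the output path and query purely illustrative:

```sql
-- Write query results straight to a CSV file with a header row
COPY (
  SELECT source_name, COUNT(*) AS article_count
  FROM articles
  GROUP BY source_name
) TO 'top_sources.csv' (HEADER);
```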
Comparing Tools for JSON Analysis
| Tool | Setup Time | Handles Nested JSON | SQL Queries | Plain English | Works in Browser |
|---|---|---|---|---|---|
| Python + pandas | 30+ min (if unfamiliar) | ✅ (with code) | ❌ | ❌ | ❌ |
| jq (command line) | 15 min | ✅ | ❌ | ❌ | ❌ |
| JSON Viewer (browser ext) | 2 min | ✅ (view only) | ❌ | ❌ | ✅ |
| Excel Power Query | 20 min | ⚠️ (limited) | ❌ | ❌ | ❌ |
| Postman | 10 min | ✅ (view only) | ❌ | ❌ | ❌ |
| Harbinger Explorer | 30 sec | ✅ | ✅ | ✅ | ✅ |
Real-World Time Savings
Let's quantify this:
Task: Analyze 5,000 news articles from a JSON API — find top sources, top categories, average article length
| Approach | Total Time |
|---|---|
| Python (experienced) | ~45 min |
| Python (learning as you go) | 3–5 hours |
| Excel Power Query + formulas | 60–90 min |
| Harbinger Explorer | ~10 min |
For a researcher or freelancer doing this kind of work regularly, the difference is not marginal — it's hours per week.
Handling Nested and Array JSON Fields
This is where most tools struggle and where Harbinger Explorer shines.
Nested objects: If your JSON has source: { name: "Reuters", country: "UK" }, Harbinger automatically creates source_name and source_country columns. You query them directly.
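Once flattened, those columns behave like any other — a sketch, with the table name assumed:

```sql
-- Group on a column that was originally nested inside `source`
SELECT source_country, COUNT(*) AS article_count
FROM articles
GROUP BY source_country
ORDER BY article_count DESC;
```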
Array fields: If each article has tags: ["finance", "markets", "economy"], you can unnest the array to analyze tag distributions:
Ask the AI: "What are the top 20 most common tags across all articles?"
Or write the SQL yourself using DuckDB's array functions. Either way, you're not writing Python.
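In DuckDB's dialect, that tag-distribution query looks roughly like this — a sketch, with the table name assumed:

```sql
-- Explode the tags array into one row per tag, then count each tag
SELECT tag, COUNT(*) AS tag_count
FROM (SELECT unnest(tags) AS tag FROM articles)
GROUP BY tag
ORDER BY tag_count DESC
LIMIT 20;
```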
Use Cases Beyond News APIs
JSON analysis comes up in more places than you might expect:
- Stripe/payment data exports: Order history, subscription data, refund patterns
- Social media API data: Tweet metadata, engagement stats, user attributes
- HubSpot/CRM exports: Contact data, deal stages, activity logs
- Government open data: Many public datasets are published as JSON APIs
- Internal tool exports: Slack exports, Notion database exports, Airtable data
In every case, the pattern is the same: you have JSON, you have questions, and you need answers fast without writing a data pipeline.
Building Your Analytical Intuition
One underrated benefit of using Harbinger Explorer's AI agent: it teaches you SQL on your own data.
When you ask a question in plain English and the AI produces a SQL query, you can read the query, understand what it's doing, and learn. Over time, you start writing queries directly instead of asking the AI — because you've seen the patterns repeated enough times.
This is a much more effective learning path than tutorials: you're solving your real questions, on your data, with immediate feedback.
Pricing and Trial
- 7-day free trial: Full access, no credit card required
- Starter: €8/month — JSON loading, SQL queries, source catalog, CSV export
- Pro: €24/month — AI agent chat, natural language queries, advanced features
If you regularly deal with JSON data and don't have a smooth workflow for it, the Starter plan pays for itself in the first hour of saved time.
Stop Fighting JSON. Start Analyzing It.
JSON doesn't have to mean Python scripts, pandas DataFrames, or hours of data wrangling. It can mean: open browser, load data, ask question, get answer.
Try Harbinger Explorer free for 7 days →
Your JSON is just data waiting to be understood. Let's make that faster.
Harbinger Explorer is a browser-based data exploration platform. It uses DuckDB WASM for in-browser SQL analytics and supports JSON, CSV, and API data sources. No installation required.