---
title: "Parquet File Viewer Online: Open & Query Parquet Without Installing Anything"
seo_title: "Parquet File Viewer Online — Free Browser Tool"
seo_description: "View, query, and export Parquet files online for free. Compare ParquetViewer, DuckDB CLI, and Harbinger Explorer side by side."
---

Parquet File Viewer Online: Open & Query Parquet Without Installing Anything
You exported a Parquet file from a Spark job. Or someone on Slack just sent you one: "Hey, can you check row counts in this?" You double-click it. Nothing happens. Of course nothing happens — Parquet is a columnar binary format, not a spreadsheet.
So begins the ritual: install a CLI tool, pip-install PyArrow, spin up a Jupyter notebook, or beg someone with DuckDB on their machine to run a quick SELECT *. All of this to look at a file.
If you've ever Googled "view parquet file online free", you already know the options are thin. Let's fix that.
TL;DR — Which Parquet Viewer Should You Use?
If you just need to peek at a file, DuckDB CLI is excellent — but requires installation. ParquetViewer is a lightweight desktop app for Windows. Harbinger Explorer runs entirely in the browser with full SQL support, natural-language queries, and export options — no install, no upload to a server.
| Feature | Harbinger Explorer | DuckDB CLI | ParquetViewer |
|---|---|---|---|
| Setup Time | 0 min (browser) | 5–10 min (install) | 5 min (Windows install) |
| Platform | Any browser | macOS, Linux, Windows | Windows only |
| SQL Support | ✅ Full (DuckDB WASM) | ✅ Full | ❌ None |
| Natural Language Queries | ✅ Ask in plain English | ❌ | ❌ |
| Schema Inspection | ✅ Auto-detected | ✅ `DESCRIBE` | ✅ Basic |
| Large File Support | ✅ In-browser (WASM) | ✅ Excellent | ⚠️ Slower on large files |
| Export Formats | CSV, Parquet, JSON | CSV, Parquet, JSON | ❌ View only |
| Data Governance / PII | ✅ Column mapping + PII detection | ❌ | ❌ |
| Pricing | Free 7-day trial, then €8/mo | Free (open source) | Free (open source) |
| Learning Curve | Low (point-and-click + NL) | Medium (SQL required) | Low (GUI) |
Last verified: April 2026
The Pain: Why Viewing Parquet Is Still Annoying in 2026
Parquet is everywhere. It's the default output of Spark, the preferred format for data lakes, and the go-to for anyone who cares about compression and columnar reads. But the tooling for looking at Parquet files hasn't kept up with the tooling for producing them.
Here's what most people do today:
Option 1: Python + PyArrow (the "quick" script that isn't quick)
```python
# Python — requires: pip install pyarrow pandas
import pyarrow.parquet as pq
import pandas as pd

table = pq.read_table("events_2026_q1.parquet")
df = table.to_pandas()
print(df.head(20))
print(f"Rows: {len(df)}, Columns: {df.columns.tolist()}")
print(df.dtypes)
```
This works — if you have Python installed, PyArrow installed, and the file on your local machine. That's three assumptions that fail regularly on corporate laptops, shared VMs, and basically any machine that isn't your dev setup.
Option 2: DuckDB CLI
```sql
-- DuckDB CLI — requires: brew install duckdb / apt install duckdb
SELECT * FROM read_parquet('events_2026_q1.parquet') LIMIT 20;
DESCRIBE SELECT * FROM read_parquet('events_2026_q1.parquet');
SELECT COUNT(*) FROM read_parquet('events_2026_q1.parquet');
```
DuckDB is genuinely excellent. Fast, powerful, handles huge files. But you need to install it first, and you need to know SQL. For a data engineer, that's fine. For the product manager who just wants to verify a field name? Not so much.
Option 3: ParquetViewer (Windows Desktop App)
ParquetViewer is a free, open-source Windows desktop application. You open a file, see the schema and data in a table view. It's simple, it works, and it does exactly one thing.
The catch: Windows only, no SQL, no export, no querying. If you need to filter rows or aggregate anything, you're back to Python or DuckDB.
Option 4: Upload to Some Random Website
There are a few "upload your Parquet file" websites floating around. Most are side projects, rarely updated, with unclear data handling policies. Uploading production data to an unknown server is... let's call it "not recommended" by your security team.
What You Actually Want
Let's be honest about the requirements:
- No installation. Open a browser, drag a file, see data.
- SQL support. Filtering, aggregating, joining — the basics.
- Schema inspection. Column names, types, nested structures.
- Export. Get the results out as CSV or JSON for the next step.
- Security. The file shouldn't leave your machine if possible.
That's not a lot to ask. But until recently, no single tool checked all five boxes.
Harbinger Explorer: Parquet in the Browser
Harbinger Explorer runs DuckDB WASM directly in your browser. Your Parquet file never leaves your machine — it's processed locally via WebAssembly. Here's what the workflow looks like:
Step 1: Open Harbinger Explorer — go to harbingerexplorer.com and sign in (or start a free 7-day trial).
Step 2: Upload your Parquet file — drag it into the source catalog or use the file upload button. The schema is auto-detected instantly: column names, data types, nested structs, everything.
Step 3: Explore with SQL or natural language — you can write SQL directly:
```sql
-- DuckDB SQL (runs in-browser via WASM)
SELECT event_type, COUNT(*) AS event_count, AVG(duration_ms) AS avg_duration
FROM uploaded_file
WHERE event_date >= '2026-01-01'
GROUP BY event_type
ORDER BY event_count DESC;
```
Or skip SQL entirely and ask in plain English: "Show me the top 10 event types by count in Q1 2026" — the AI generates the query for you.
Step 4: Inspect columns — the column mapping view shows data types, sample values, and flags potential PII (email addresses, phone numbers, IP addresses). Useful when you're exploring a file someone else created.
Step 5: Export — download results as CSV, Parquet, or JSON. One click.
Total time: about 2 minutes from "I received a Parquet file" to "I have answers." Compare that to the 15–30 minutes of installing tools, writing scripts, and fighting dependency errors.
When to Choose What
Tools aren't universally better or worse — they fit different contexts:
Choose DuckDB CLI when:
- You're a data engineer comfortable with SQL and terminal
- You're processing multiple large files in batch
- You need maximum performance on multi-GB files
- You want a free, open-source tool you can script around
- You already have it installed
Choose ParquetViewer when:
- You're on Windows and just need a quick visual peek
- You don't need SQL or filtering
- You want the simplest possible GUI
- File sizes are moderate (under 500 MB)
Choose Harbinger Explorer when:
- You need to view Parquet files on any machine without installing software
- You want SQL and natural-language querying
- You need PII detection or data governance features
- You want to export results in multiple formats
- You're sharing findings with non-technical colleagues (they can use NL queries)
- Security matters — file stays in your browser, no server upload
Honest Trade-Offs
Harbinger Explorer isn't the right tool for everything:
- No direct database connectors — you can't query Snowflake or BigQuery directly (yet). It's for files and APIs, not live database connections.
- No real-time streaming — this is for batch exploration, not event streams.
- No team collaboration — it's a single-user tool today. No shared workspaces or commenting.
- No scheduled refreshes on Starter plan — if you need automated recurring analysis, you'll need the Pro tier.
- Performance ceiling — DuckDB WASM in the browser is fast, but DuckDB native on a beefy machine will always be faster for multi-GB files. For most Parquet inspection tasks (files under 1 GB), you won't notice.
- Cost — DuckDB CLI and ParquetViewer are free forever. Harbinger Explorer has a 7-day free trial, then starts at €8/month.
If you're a power user who lives in the terminal and already has DuckDB installed, Harbinger Explorer probably isn't replacing your workflow. It shines for the other 80% of cases: quick inspections, ad-hoc questions, sharing with colleagues, and working on machines where you can't install software.
Beyond Viewing: What Else Can You Do With Parquet in HE?
Once a Parquet file is in your source catalog, it becomes a queryable data source you can combine with other data:
- Join Parquet with API data — crawl a REST API with the setup wizard, then join the API response with your Parquet file using SQL. Example: join your event log with a user API to enrich events with user metadata.
- Compare multiple Parquet files — upload two exports from different dates and run diff queries to spot changes.
- Profile the data — use natural-language queries like "Show me null percentages for every column" or "What's the cardinality of each string column?" for quick data profiling.
- Export cleaned subsets — filter, transform, and export a cleaned version as CSV for stakeholders who need Excel-friendly formats.
FAQ: Parquet File Viewer Questions
Can I view Parquet files without Python? Yes. DuckDB CLI, ParquetViewer (Windows), and Harbinger Explorer all work without Python. HE requires no installation at all.
Is there a free online Parquet viewer? Harbinger Explorer offers a 7-day free trial with full functionality. DuckDB CLI is permanently free but requires installation.
Can I convert Parquet to CSV online? Yes — in Harbinger Explorer, upload the Parquet file, optionally filter or transform with SQL, and export as CSV. No code required.
Are Parquet files safe to upload to online tools? With Harbinger Explorer, the file is processed locally in your browser via DuckDB WASM — it doesn't get uploaded to a server. Always check the privacy policy of other online tools.
What's the maximum Parquet file size I can view in a browser? Depends on your machine's RAM. DuckDB WASM typically handles files up to 1–2 GB comfortably in modern browsers. For larger files, DuckDB CLI on native hardware is the better choice.
Try It — 5 Minutes, No Install
If you've got a Parquet file sitting on your desktop right now, open harbingerexplorer.com, start the free 7-day trial, and drag it in. You'll have schema, row counts, and your first query results before you'd finish typing `pip install pyarrow`.
→ Try Harbinger Explorer free for 7 days
Continue Reading:
- CSV Data Analysis Without Excel — Same browser-based approach for CSV files
- Natural Language SQL Query Tool — Deep dive into NL-to-SQL capabilities
- JSON Data Analysis in Browser — Working with JSON data the same way