Documentation

Everything you need to succeed

Comprehensive guides, API references, and examples to help you implement data governance at scale.

Getting Started

Introduction to DataMetric

DataMetric is an enterprise-grade data governance platform that combines comprehensive metadata management with interactive analytics capabilities.

DataMetric helps organizations understand, govern, and utilize their data assets effectively. Our platform provides:

  • Automated metadata extraction from your data sources
  • End-to-end data lineage tracking and visualization
  • Fine-grained access control and data policies
  • Interactive analytics and BI capabilities

System Requirements

Before you begin, ensure you have the following:

Development Environment

  • Node.js 18+ or Python 3.9+
  • 4GB RAM minimum
  • 10GB available disk space
  • macOS, Linux, or Windows

Production Deployment

  • Kubernetes 1.24+ or Docker
  • PostgreSQL 14+
  • Redis 6+ (for caching)
  • Object storage (S3, GCS, or Azure Blob)

Installation

Install the CLI

Install the DataMetric CLI using your preferred package manager:

# Install DataMetric CLI
npm install -g @datametric/cli

# Initialize your project
datametric init my-project

# Connect to your data source
datametric source add postgres --host localhost --port 5432

Python SDK Installation

Use pip to install the Python SDK:

pip install datametric-client

Node.js SDK Installation

Use npm or yarn to install the TypeScript/JavaScript SDK:

npm install @datametric/sdk

Configuration

Basic Configuration

Configure your DataMetric instance

Environment Variables

DATAMETRIC_API_ENDPOINT=https://api.datametric.cloud
DATAMETRIC_API_KEY=your-api-key-here
DATAMETRIC_ENVIRONMENT=production
DATAMETRIC_DOMAIN=data.yourcompany.com

Tip: Use environment variables for sensitive configuration. Never commit API keys to version control.
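
As a minimal sketch, these variables can be read once at startup instead of being hard-coded. Note that `load_config` below is an illustrative helper, not part of the DataMetric SDK:

```python
import os

# Illustrative helper (not part of the DataMetric SDK): read settings
# from the environment so secrets never live in source control.
def load_config():
    api_key = os.environ.get("DATAMETRIC_API_KEY")
    if not api_key:
        raise RuntimeError("DATAMETRIC_API_KEY is not set")
    return {
        "endpoint": os.environ.get(
            "DATAMETRIC_API_ENDPOINT", "https://api.datametric.cloud"
        ),
        "api_key": api_key,
        "environment": os.environ.get("DATAMETRIC_ENVIRONMENT", "production"),
    }

# Demo only: seed a placeholder key so the example runs end to end.
os.environ.setdefault("DATAMETRIC_API_KEY", "example-key")
config = load_config()
```

Failing fast when the key is missing surfaces misconfiguration at startup rather than on the first API call.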

Connecting Data Sources

Connect your databases, data warehouses, and BI tools

PostgreSQL

datametric source add postgres \
  --host localhost \
  --port 5432 \
  --database mydb \
  --username user \
  --schema public
Snowflake

datametric source add snowflake \
  --account xy12345 \
  --warehouse compute_wh \
  --database analytics

API Reference

REST API

Programmatic access to all DataMetric features

Authentication

All API requests require an API key in the Authorization header:

Authorization: Bearer your-api-key-here
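
For example, the header can be attached like this (a minimal sketch using only the Python standard library; the request is constructed but not sent):

```python
import urllib.request

# Placeholder key, matching the header example above.
API_KEY = "your-api-key-here"

# Build an authenticated request to the datasets endpoint; calling
# urllib.request.urlopen(req) would actually send it.
req = urllib.request.Request(
    "https://api.datametric.cloud/api/v1/datasets",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
```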

Common Endpoints

GET  /api/v1/datasets      List all datasets
POST /api/v1/datasets      Create a dataset
GET  /api/v1/lineage/:id   Get lineage graph
POST /api/v1/policies      Create a data policy
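
As a sketch, creating a dataset maps to a POST with a JSON body. The field names echo the SDK examples (name, source, description), but the exact request schema is an assumption here, and the request is only constructed, not sent:

```python
import json
import urllib.request

API_KEY = "your-api-key-here"  # placeholder

# Hypothetical payload; the exact schema is an assumption based on
# the SDK examples, not a documented contract.
payload = {
    "name": "customers",
    "source": "postgres",
    "description": "Customer master data",
}

req = urllib.request.Request(
    "https://api.datametric.cloud/api/v1/datasets",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
```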

Code Examples

SDK examples in popular languages

Python SDK
from datametric import DataMetricClient

# Initialize the client
client = DataMetricClient(
    api_key="your-api-key",
    endpoint="https://api.datametric.cloud"
)

# Register a dataset
dataset = client.datasets.create(
    name="customers",
    source="postgres",
    description="Customer master data"
)

# Track lineage
client.lineage.track(
    upstream="raw.customers",
    downstream="cleaned.customers"
)

TypeScript SDK
import { DataMetric } from '@datametric/sdk';

const client = new DataMetric({
  apiKey: process.env.DATAMETRIC_API_KEY,
  environment: 'production'
});

// Query metadata
const datasets = await client.datasets.list({
  filter: { domain: 'finance' },
  include: ['lineage', 'owners']
});

// Create a data policy
await client.policies.create({
  name: 'PII Access Control',
  rules: [{
    field: 'contains_pii',
    action: 'restrict',
    roles: ['analyst']
  }]
});

Security & Isolation

Enterprise-Grade Isolation

How DataMetric protects your data through physical separation

Database-per-Tenant

Your data resides in a dedicated PostgreSQL database. We do not use shared tables with Row-Level Security for customer data.

Ephemeral Compute

Notebooks and jobs run in isolated, short-lived containers that are destroyed immediately after execution.

Audit Logging

Every action—from login to query execution—is logged in a tamper-proof audit trail available via API.
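
Tamper evidence in audit trails is commonly achieved by hash chaining, where each entry commits to the previous entry's hash, so editing any entry invalidates everything after it. The sketch below illustrates the general technique; it is not DataMetric's actual implementation:

```python
import hashlib
import json

# Illustration of hash chaining (not DataMetric's actual mechanism):
# each entry's hash covers the previous hash plus the event payload,
# so modifying any earlier event breaks every later hash.
def chain_events(events):
    prev_hash = "0" * 64
    chained = []
    for event in events:
        record = prev_hash + json.dumps(event, sort_keys=True)
        prev_hash = hashlib.sha256(record.encode()).hexdigest()
        chained.append({"event": event, "hash": prev_hash})
    return chained

trail = chain_events([
    {"action": "login", "user": "alice"},
    {"action": "query", "user": "alice", "sql": "SELECT 1"},
])
```

A verifier can re-run the chain over the stored events and compare hashes; any mismatch pinpoints where tampering began.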

Need more help?

Our team of data governance experts is here to assist you.