Documentation

Everything you need to succeed

Comprehensive guides, API references, and examples to help you implement data governance at scale.

Getting Started

Introduction to DataMetric

DataMetric is an enterprise-grade data governance platform that combines comprehensive metadata management with interactive analytics capabilities.

DataMetric helps organizations understand, govern, and use their data assets effectively. Our platform provides:

  • Automated metadata extraction from your data sources
  • End-to-end data lineage tracking and visualization
  • Fine-grained access control and data policies
  • Interactive analytics and BI capabilities

System Requirements

Before you begin, ensure you have the following:

Development Environment

  • Node.js 18+ or Python 3.9+
  • 4GB RAM minimum
  • 10GB available disk space
  • macOS, Linux, or Windows

Production Deployment

  • Kubernetes 1.24+ or Docker
  • PostgreSQL 14+
  • Redis 6+ (for caching)
  • Object storage (S3, GCS, or Azure Blob)

Installation

Install the CLI

Install the DataMetric CLI using your preferred package manager:

# Install DataMetric CLI
npm install -g @datametric/cli

# Initialize your project
datametric init my-project

# Connect to your data source
datametric source add postgres --host localhost --port 5432

Python SDK Installation

Use pip to install the Python SDK:

pip install datametric-client
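
To confirm the installation, you can check the installed distribution with the standard library's importlib.metadata; this is an illustrative sketch, where the distribution name matches the pip package above and the import path follows the Python SDK example later on this page.

# Quick post-install check (illustrative): verify the distribution is present
# and that the client class used in the SDK examples below can be imported.
from importlib.metadata import version

from datametric import DataMetricClient  # import path used in the SDK examples below

print("datametric-client", version("datametric-client"))
print("client class available:", DataMetricClient.__name__)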

Node.js SDK Installation

Use npm or yarn to install the TypeScript/JavaScript SDK:

npm install @datametric/sdk

Configuration

Basic Configuration

Configure your DataMetric instance

Environment Variables

# API Configuration
DATAMETRIC_API_ENDPOINT=https://api.datametric.io
DATAMETRIC_API_KEY=your-api-key-here

# Environment
DATAMETRIC_ENVIRONMENT=production

# Optional: Custom domain
DATAMETRIC_DOMAIN=data.yourcompany.com

Tip
Use environment variables for sensitive configuration. Never commit API keys to version control.
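
As a sketch of how these variables might be consumed in application code, the snippet below reads them with os.environ and passes them to the client constructor shown in the Python SDK example further down; whether the SDK also picks them up automatically is not covered here.

import os

from datametric import DataMetricClient

# Build the client from environment variables instead of hardcoded secrets.
client = DataMetricClient(
    api_key=os.environ["DATAMETRIC_API_KEY"],
    endpoint=os.environ.get("DATAMETRIC_API_ENDPOINT", "https://api.datametric.io"),
)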

Connecting Data Sources

Connect your databases, data warehouses, and BI tools

PostgreSQL

datametric source add postgres \
  --host localhost \
  --port 5432 \
  --database mydb \
  --username user \
  --schema public

Snowflake

datametric source add snowflake \
  --account xy12345 \
  --warehouse compute_wh \
  --database analytics
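
If you need to register many sources, the same CLI commands can be scripted. The sketch below wraps the PostgreSQL command from above with Python's subprocess module; the flags are exactly the ones documented here, while the wrapper function itself is only illustrative.

import subprocess

def add_postgres_source(host: str, port: int, database: str, username: str,
                        schema: str = "public") -> None:
    # Invokes the same `datametric source add postgres` command shown above.
    subprocess.run(
        [
            "datametric", "source", "add", "postgres",
            "--host", host,
            "--port", str(port),
            "--database", database,
            "--username", username,
            "--schema", schema,
        ],
        check=True,  # raise CalledProcessError if the CLI exits non-zero
    )

add_postgres_source("localhost", 5432, "mydb", "user")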

API Reference

REST API

Programmatic access to all DataMetric features

Authentication

All API requests require an API key in the Authorization header:

Authorization: Bearer your-api-key-here

Common Endpoints

GET   /api/v1/datasets       List all datasets
POST  /api/v1/datasets       Create a dataset
GET   /api/v1/lineage/:id    Get lineage graph
POST  /api/v1/policies       Create data policy
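
Putting the authentication header and the endpoints together, the sketch below lists datasets and creates one using the third-party requests library; the request body mirrors the fields used in the Python SDK example below and is illustrative rather than a complete schema.

import os

import requests

BASE_URL = "https://api.datametric.io"
HEADERS = {"Authorization": f"Bearer {os.environ['DATAMETRIC_API_KEY']}"}

# List all datasets (GET /api/v1/datasets).
datasets = requests.get(f"{BASE_URL}/api/v1/datasets", headers=HEADERS)
datasets.raise_for_status()
print(datasets.json())

# Create a dataset (POST /api/v1/datasets). The body fields follow the
# Python SDK example below; consult the full API schema for all options.
created = requests.post(
    f"{BASE_URL}/api/v1/datasets",
    headers=HEADERS,
    json={
        "name": "customers",
        "source": "postgres",
        "description": "Customer master data",
    },
)
created.raise_for_status()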

Code Examples

SDK examples in popular languages

Python SDK
from datametric import DataMetricClient

# Initialize the client
client = DataMetricClient(
    api_key="your-api-key",
    endpoint="https://api.datametric.io"
)

# Register a dataset
dataset = client.datasets.create(
    name="customers",
    source="postgres",
    description="Customer master data"
)

# Track lineage
client.lineage.track(
    upstream="raw.customers",
    downstream="cleaned.customers"
)

TypeScript SDK
import { DataMetric } from '@datametric/sdk';

const client = new DataMetric({
  apiKey: process.env.DATAMETRIC_API_KEY,
  environment: 'production'
});

// Query metadata
const datasets = await client.datasets.list({
  filter: { domain: 'finance' },
  include: ['lineage', 'owners']
});

// Create a data policy
await client.policies.create({
  name: 'PII Access Control',
  rules: [{
    field: 'contains_pii',
    action: 'restrict',
    roles: ['analyst']
  }]
});

Security & Compliance

Security Best Practices

Keep your data governance platform secure

Use API Keys

Rotate API keys regularly and never expose them in client-side code.

Enable SSO

Configure SAML or OIDC for single sign-on with your identity provider.

Audit Logging

Enable comprehensive audit logging for compliance and security monitoring.

Network Policies

Restrict access to your DataMetric instance using IP allowlists and VPN requirements.

Need more help?

Our team of data governance experts is here to assist you.