
Custom DataSource

Build Your Own Bitcoin Data Provider

Create custom data sources to integrate any Bitcoin data provider with LaserEyes. Perfect for specialized use cases and custom infrastructure.

Flexible: any data source
Composable: mix & match
Type-Safe: full TypeScript
Configurable: full control

Features

Custom Implementation

Build data sources tailored to your specific needs, from custom APIs to specialized blockchain indexers.

Modular Design

Mix and match different data sources, with automatic fallback support and priority handling.

Performance Control

Optimize performance with custom caching strategies, request batching, and error handling.

Extensible Interface

Implement only the methods you need, with TypeScript ensuring type safety throughout.

Implementation Guide

Custom Data Source

import { DataSource, DataSourceConfig } from '@omnisat/lasereyes-core'

interface CustomConfig extends DataSourceConfig {
  apiKey?: string
  baseUrl: string
}

export class CustomDataSource implements DataSource {
  private config: CustomConfig

  constructor(config: CustomConfig) {
    this.config = config
  }

  private headers(): Record<string, string> {
    // Example auth header; use whatever scheme your API expects
    return this.config.apiKey ? { 'x-api-key': this.config.apiKey } : {}
  }

  async getBalance(address: string): Promise<string> {
    const response = await fetch(`${this.config.baseUrl}/balance/${address}`, {
      headers: this.headers()
    })
    if (!response.ok) {
      throw new Error(`Balance request failed: ${response.status}`)
    }
    const data = await response.json()
    return data.balance
  }

  async getTransaction(txid: string): Promise<any> {
    const response = await fetch(`${this.config.baseUrl}/tx/${txid}`, {
      headers: this.headers()
    })
    if (!response.ok) {
      throw new Error(`Transaction request failed: ${response.status}`)
    }
    return response.json()
  }

  // Implement other methods as needed...
}

Configuration

Setup

import { LaserEyesClient, createConfig } from '@omnisat/lasereyes-core'
import { CustomDataSource } from './custom-datasource'

const config = createConfig({
  dataSources: {
    custom: new CustomDataSource({
      baseUrl: 'https://your-api.com',
      apiKey: 'your-api-key',
      priority: 1
    }),
    mempool: {  // Fallback source
      priority: 2
    }
  }
})

const client = new LaserEyesClient(config)

Advanced Examples

Caching Implementation

interface CachedConfig extends CustomConfig {
  cacheTTL?: number // Time to live in ms
}

export class CachedDataSource extends CustomDataSource {
  private cache: Map<string, { data: any; timestamp: number }>
  private ttl: number

  constructor(config: CachedConfig) {
    super(config)
    this.cache = new Map()
    this.ttl = config.cacheTTL ?? 60000 // 1 minute default
  }

  private getCached<T>(key: string): T | null {
    const cached = this.cache.get(key)
    if (!cached) return null

    if (Date.now() - cached.timestamp > this.ttl) {
      this.cache.delete(key)
      return null
    }

    return cached.data as T
  }

  private setCache(key: string, data: any): void {
    this.cache.set(key, { data, timestamp: Date.now() })
  }

  async getBalance(address: string): Promise<string> {
    const cacheKey = `balance:${address}`
    const cached = this.getCached<string>(cacheKey)
    if (cached !== null) return cached

    const balance = await super.getBalance(address)
    this.setCache(cacheKey, balance)
    return balance
  }
}

Batch Processing

interface BatchConfig extends CustomConfig {
  batchSize?: number
  batchDelay?: number
}

export class BatchDataSource extends CustomDataSource {
  private batchSize: number
  private batchDelay: number
  private pending: Map<string, {
    promise: Promise<string>
    resolve: (balance: string) => void
    reject: (error: unknown) => void
  }>
  private timer: ReturnType<typeof setTimeout> | null = null

  constructor(private batchConfig: BatchConfig) {
    super(batchConfig)
    this.batchSize = batchConfig.batchSize ?? 10
    this.batchDelay = batchConfig.batchDelay ?? 100
    this.pending = new Map()
  }

  private async flush(): Promise<void> {
    this.timer = null
    // Take up to batchSize queued addresses out of the pending map
    const entries = Array.from(this.pending.entries()).slice(0, this.batchSize)
    for (const [address] of entries) this.pending.delete(address)

    try {
      const response = await fetch(`${this.batchConfig.baseUrl}/batch`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ addresses: entries.map(([address]) => address) })
      })
      const results: Record<string, string> = await response.json()
      for (const [address, handlers] of entries) handlers.resolve(results[address])
    } catch (error) {
      for (const [, handlers] of entries) handlers.reject(error)
    }

    // Schedule another flush if requests queued while this batch ran
    if (this.pending.size > 0 && !this.timer) {
      this.timer = setTimeout(() => this.flush(), this.batchDelay)
    }
  }

  async getBalance(address: string): Promise<string> {
    // Deduplicate concurrent requests for the same address
    const existing = this.pending.get(address)
    if (existing) return existing.promise

    let resolve!: (balance: string) => void
    let reject!: (error: unknown) => void
    const promise = new Promise<string>((res, rej) => { resolve = res; reject = rej })
    this.pending.set(address, { promise, resolve, reject })

    if (!this.timer) {
      this.timer = setTimeout(() => this.flush(), this.batchDelay)
    }
    return promise
  }
}

Best Practices

Error Handling

Implement proper error handling and normalize errors to match the LaserEyes error format.
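One way to do this is to map upstream HTTP failures onto a small set of stable error codes before they leave your data source. The `DataSourceError` shape and the code names below are illustrative assumptions, not the actual LaserEyes error type:

```typescript
// Hypothetical normalized error shape; adapt it to the error
// format your LaserEyes version actually expects.
interface DataSourceError {
  code: string    // machine-readable, e.g. 'NOT_FOUND', 'RATE_LIMITED'
  message: string // human-readable description
  source: string  // which data source produced the error
}

function normalizeHttpError(source: string, status: number, body: string): DataSourceError {
  switch (status) {
    case 404: return { code: 'NOT_FOUND', message: body, source }
    case 429: return { code: 'RATE_LIMITED', message: body, source }
    default:  return { code: 'UPSTREAM_ERROR', message: `HTTP ${status}: ${body}`, source }
  }
}
```

Throwing these normalized errors instead of raw `fetch` failures lets callers branch on `code` without knowing which provider was behind the request.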

Rate Limiting

Add rate limiting and request queuing to prevent API abuse and handle throttling gracefully.
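A token bucket is a simple way to sketch this: requests spend tokens, tokens refill at a fixed rate, and a request that finds the bucket empty is deferred or rejected. The class below is an illustrative helper, not part of the LaserEyes API:

```typescript
// Minimal token-bucket rate limiter sketch.
class TokenBucket {
  private tokens: number
  private lastRefill: number

  constructor(
    private capacity: number,       // max burst size
    private refillPerSecond: number // sustained request rate
  ) {
    this.tokens = capacity
    this.lastRefill = Date.now()
  }

  // Returns true if a request may proceed now; false if it should wait.
  tryAcquire(now: number = Date.now()): boolean {
    const elapsedSeconds = (now - this.lastRefill) / 1000
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSeconds * this.refillPerSecond)
    this.lastRefill = now
    if (this.tokens >= 1) {
      this.tokens -= 1
      return true
    }
    return false
  }
}
```

A data source can check `tryAcquire()` before each `fetch` and either queue the request for retry after a short delay or surface a rate-limit error to the caller.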

Data Normalization

Normalize response data to match the expected format of other data sources for consistency.
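For example, providers commonly report balances as a satoshi string, a BTC float, or a nested object. A sketch of folding those into the string-of-satoshis shape that `getBalance` returns above (the field names checked here are illustrative, not any specific provider's schema):

```typescript
// Normalize assorted provider balance formats to a satoshi string.
function normalizeBalance(raw: unknown): string {
  if (typeof raw === 'string') return raw // already a satoshi string
  if (typeof raw === 'number') {
    return Math.round(raw * 1e8).toString() // BTC float -> satoshis
  }
  if (raw && typeof raw === 'object') {
    const record = raw as Record<string, unknown>
    if (typeof record.confirmed === 'number') {
      return record.confirmed.toString() // object with a satoshi field
    }
  }
  throw new Error('Unrecognized balance format')
}
```

Doing this conversion inside each data source keeps the rest of your application, and any fallback logic, working against one consistent shape.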

Performance

Implement caching and batch processing where appropriate to optimize performance.