Senior Data Engineer

Biconomy
Biconomy empowers Web3 developers. As a Senior Data Engineer, you'll build and scale the infrastructure powering HyperSignals, a high-throughput analytics engine that extracts insights from on-chain and CEX trading flows.

Overview

Department: Data Science & Analytics
Job type: Full time
Compensation: Salary not specified
Location: Global

Requirements

  • 4+ years of experience as a Data Engineer in high-throughput environments such as trading, crypto, or fintech
  • Expert-level Python (pandas, pyarrow, asyncio) and SQL skills, with strong fundamentals in algorithms and distributed systems
  • Proven experience with streaming frameworks (Flink, Spark Structured Streaming, Kafka Streams) and message buses (Kafka, Kinesis, Pulsar)
  • In-depth understanding of blockchain data structures (blocks, receipts, logs), indexers (The Graph, Substreams), and node/RPC infrastructure
  • Familiarity with CEX market APIs (REST & WebSocket) and mechanics of perpetual futures (funding, mark price, open interest, liquidations)
  • Proficient in cloud-native development (AWS or GCP), including IaC (Terraform/CDK), CI/CD, and container orchestration (EKS/GKE)
  • A strong track record of building and owning production systems end-to-end, with clear documentation and operational rigor
  • Passionate about perpetual futures and market microstructure—you don’t need to be a trader, but curiosity is key

Responsibilities

  • Design, build, and maintain streaming and batch ETL pipelines for on-chain sources across EVM, Solana, Sui, Starknet, and more
  • Develop NLP and sentiment pipelines for off-chain sources (Binance, Bybit, social platforms) to extract actionable market signals
  • Normalize and unify disparate market data schemas (order books, trades, liquidations, funding rates) into a single analytics model for perpetuals
  • Implement low-latency ingestion systems using Kafka, Kinesis, PubSub, WebSockets, or Firehose, with exactly-once guarantees (see the sketch after this list)
  • Build and optimize lakehouse/warehouse layers (Iceberg, Delta, Snowflake, BigQuery) with Z-ordering, partitioning, and materialized views
  • Enforce data quality and observability using dbt tests, Great Expectations, and OpenTelemetry
  • Collaborate with quants and backend engineers to deliver data via GraphQL/REST APIs and feature stores
  • Continuously optimize performance, cost, and scalability across AWS/GCP infrastructure
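
To give a feel for the ingestion and normalization work described in this list, here is a purely illustrative sketch, not Biconomy's actual stack: it subscribes to what is assumed to be Binance's USDⓈ-M futures aggTrade WebSocket stream, maps each message into a unified trade record, and publishes it to a hypothetical Kafka topic using an idempotent producer. The endpoint, topic name, and schema fields are assumptions made for illustration only.

    # Illustrative sketch only: streams aggregated trades from an assumed Binance
    # futures WebSocket endpoint, normalizes them into a unified trade record, and
    # publishes to Kafka. Endpoint, topic, and schema are hypothetical examples.
    import asyncio
    import json
    from dataclasses import dataclass, asdict

    import websockets                      # pip install websockets
    from confluent_kafka import Producer   # pip install confluent-kafka

    WS_URL = "wss://fstream.binance.com/ws/btcusdt@aggTrade"  # assumed public stream
    TOPIC = "perps.trades.normalized"                          # hypothetical topic

    @dataclass
    class Trade:
        """Unified trade record shared across CEX and on-chain sources."""
        exchange: str
        symbol: str
        price: float
        size: float
        side: str        # taker side: "buy" or "sell"
        ts_ms: int       # exchange event time, epoch milliseconds

    def normalize(raw: dict) -> Trade:
        # In Binance's aggTrade payload, "m" is True when the buyer is the maker,
        # i.e. the taker sold.
        return Trade(
            exchange="binance-futures",
            symbol=raw["s"],
            price=float(raw["p"]),
            size=float(raw["q"]),
            side="sell" if raw["m"] else "buy",
            ts_ms=int(raw["T"]),
        )

    async def run() -> None:
        # enable.idempotence covers the produce side; true end-to-end exactly-once
        # would also require transactional processing downstream.
        producer = Producer({"bootstrap.servers": "localhost:9092",
                             "enable.idempotence": True})
        async with websockets.connect(WS_URL) as ws:
            async for message in ws:
                trade = normalize(json.loads(message))
                producer.produce(TOPIC,
                                 key=trade.symbol.encode(),
                                 value=json.dumps(asdict(trade)).encode())
                producer.poll(0)  # serve delivery callbacks without blocking
        producer.flush()

    if __name__ == "__main__":
        asyncio.run(run())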

Benefits

  • Flexible Working Hours – Enjoy autonomy over your schedule
  • Generous Vacation Policy – 25 days of vacation per year plus public holidays
  • Competitive Salary – With regular performance reviews
  • Token Allocation – Be rewarded with tokens as part of our compensation package
  • Growth Opportunities – Be part of an exciting new project with significant career growth potential
  • Innovative Work Culture – Join a team at the cutting edge of Web3, AI, and DeFi, and help shape the future of the digital economy
  • Fun and Engaging Team Activities – Game nights, virtual celebrations, and work retreats to keep things exciting