Software Engineer, Data Infrastructure

Alchemy
As an engineer focused on backend systems at Alchemy, you'll work with some of the most sophisticated, high-throughput distributed systems in the blockchain world, architecting and building new systems and improving existing ones for a platform that supports millions of users globally.

Overview

Department

Engineering

Job type

Full time

Compensation

$200,000 per year

Location

New York, United States, North America

About Alchemy

Our mission is to bring web3 to a billion people, by providing builders with the tools they need to build exceptional onchain products. Alchemy is the only complete developer platform that offers the powerful APIs, SDKs, and tools necessary to build and scale onchain apps and rollups.


Its infrastructure powers 70% of the top web3 teams, 90%+ of web2 companies building in web3, and 100+ million end users. Customers include top web3 brands such as Polymarket, OpenSea, Circle, and WorldCoin, as well as major global brands like Shopify and Adobe.


The team draws from decades of deep expertise in massively scalable infrastructure, AI, and blockchain from leadership roles at leading companies and universities like Google, Microsoft, Facebook, Stanford, and MIT.


The company is backed by the world's leading VCs and institutions, including: Lightspeed, Silver Lake, a16z, Coatue, Pantera, Addition, Stanford University, Coinbase, and Charles Schwab, among others.

What You'll Do

  • Maintain Alchemy’s batch pipelines that power our production serving systems
  • Set up frameworks and tools to help team members create and debug pipelines by themselves
  • Track data quality and latency, and set up monitors and alerts to ensure smooth operation
  • Build production DAG workflows for batch data processing and storage
  • Aggregate logs from multiple regions and multiple clouds
  • Design and implement our next generation data warehouse that aggregates internal and third-party data sources

What We're Looking For

  • Bachelor’s degree or foreign equivalent in Computer Science, Computer Engineering, Information Systems, or a closely related field
  • 4+ years of relevant industry experience in data engineering or data infrastructure
  • Knowledge of multi-stage orchestration frameworks and tools such as Airflow, Jenkins, and Pachyderm

Benefits

  • Competitive compensation, including base salary as well as equity
  • Comprehensive medical, dental, and vision coverage
  • Other benefits such as 401k and unlimited flexible time off