Xsolla is a global commerce company with robust tools and services to help developers solve the inherent challenges of the video game industry. From indie to AAA, companies partner with Xsolla to help them fund, distribute, market, and monetize their games. Grounded in the belief in the future of video games, Xsolla is resolute in the mission to bring opportunities together and continually make new resources available to creators. Headquartered and incorporated in Los Angeles, California, Xsolla operates as the merchant of record and has helped more than 1,500 game developers reach more players and grow their businesses around the world. With more paths to profits and ways to win, developers have everything they need to enjoy the game.
Key Responsibilities
1. Development
Build and optimize data pipelines, data dictionaries, and ETL workflows in Snowflake using Snowpark, Streams/Tasks, and Snowpipe.
Develop scalable data models supporting user 360 views, churn prediction, and recommendation engine inputs.
Support integration across data sources: MySQL, BigQuery, Redis, Kafka, GCP Storage, and API Gateway.
Implement CI/CD for data pipelines using Git, dbt, and automated testing.
Define data quality checks and auditing pipelines for ingestion and transformation layers.
2. Performance & Scalability
Tune warehouse performance and cost efficiency via query optimization, caching, and cluster sizing.
Establish data partitioning, clustering, and materialized views for fast query execution.
Build dashboards and monitors for pipeline health, job success, and data latency metrics (e.g., via Looker, Tableau, or Snowsight).
3. Governance & Best Practices
Establish and enforce naming conventions, data lineage, and metadata standards across schemas.
Contribute to the company’s evolving data mesh and streaming architecture vision.
Qualifications
0-3 years of experience in Data Engineering, with exposure to database ecosystems.
SQL and Python skills, with proven experience building ETL/ELT pipelines at scale.
Understanding of Snowflake performance tuning, query optimization, and warehouse orchestration.
Understanding of data modeling (Kimball, Data Vault, or hybrid).
Familiarity with API-based data integration and microservice architectures.
Preferred
Excellent cross-functional communication — can translate between engineering and business.
Hands-on problem solver who balances velocity with reliability.
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.