We are seeking a Senior Data Integration Engineer to develop and maintain reliable financial reporting pipelines. This role combines data engineering with analytics to provide actionable insights that enhance decision-making and optimize financial operations. You will collaborate with finance, insights, and revenue teams while leveraging advanced tools to support a global ecosystem.
Responsibilities
Collaborate with data engineers, analysts, and integration specialists to design and implement end-to-end financial reporting pipelines
Partner with finance teams to build analytics systems that capture revenue, tax, and payment data for internationally recognized game titles and partner products
Work closely with e-commerce engineers and application developers to deliver scalable solutions for analyzing transaction volumes across Epic Online Services
Focus on integrating data into Epic’s global revenue reporting platform from platforms such as Microsoft, PlayStation, Nintendo, and PC storefronts
Design and maintain dashboards and reports using Tableau to visualize tax, transactional, and e-commerce data
Collaborate with central data engineering teams to capture, store, and automate the delivery of financial event data using Databricks, Snowflake, and AWS
Requirements
Bachelor’s or Master’s degree in Finance, Computer Science, Mathematics, or a related field
At least 3 years of experience in data engineering and analytics
Proficiency in managing data pipelines using Apache Airflow and Databricks
Strong knowledge of API protocols, including REST and GraphQL
Advanced skills in analytical and development languages such as SQL, Python, PySpark, or C#
Experience working with cloud platforms such as AWS, Azure, or GCP
Skilled in version control and CI/CD processes using tools like GitHub
Proven ability to work with diverse technologies and take ownership of dashboards, event pipelines, and warehousing solutions
Fluent English communication skills, both written and spoken, at a B2+ level
Nice to have
Experience with AWS Lambda for serverless computing
Knowledge of predictive analytics, including regression analysis, classification, or machine learning techniques
Proven ability to debug and manage CI/CD pipelines effectively
Experience optimizing large-scale data using technologies such as Databricks, Snowflake, Hadoop, EMR, or Kafka
We offer
Career plan and real growth opportunities
Unlimited access to LinkedIn learning solutions
Constant training, mentoring, online corporate courses, eLearning and more
English classes with a certified teacher
Support for employee initiatives (Algorithms Club, Toastmasters, Agile Club, and more)
Enjoyable working environment (gaming room, napping area, amenities, events, sports teams, and more)
Flexible work schedule and dress code
Collaborate in a multicultural environment and share best practices from around the globe
Hired directly by EPAM & 100% under payroll
Law benefits (IMSS, INFONAVIT, 25% vacation bonus)
Life and major medical expenses insurance with dental & visual coverage (for the employee and direct family members)
13% employee savings fund, capped at the legal limit
Grocery coupons
30-day December bonus
Employee Stock Purchase Plan
12 vacation days
Official Mexican holidays, plus 5 extra holidays (Maundy Thursday and Friday, November 2nd, December 24th & 31st)
Monthly non-taxable allowance for electricity and internet bills
EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
By applying to our role, you are agreeing that your personal data may be used as set out in EPAM's Privacy Notice and Policy.