What the Candidate Will Do
Provide technical leadership in architecting, implementing, testing, releasing, and monitoring data models that power intuitive analytics and business insights
Build strong, trust-based relationships across the organization through meaningful collaboration
Actively identify and solve engineering and business problems with minimal guidance
Serve as a role model for sound judgment and responsibility in project planning and execution
See the big picture and identify strategically important problems, inefficiencies, or opportunities for high-impact improvements
Drive alignment and deliver high-leverage solutions at the group and organizational level
Raise the bar on sustainable engineering by improving best practices and producing high-quality code, documentation, tests, and monitoring
Mentor and advise both engineers and leaders, managing differing opinions maturely to help teams commit and move forward
Basic Qualifications
Bachelor's or Master's degree in Computer Science or a related field
Passion for driving continual improvement in engineering best practices (coding, testing, monitoring)
Strong technical leadership abilities and experience collaborating with stakeholders for high-impact outcomes
Extensive experience designing and managing data pipelines, dimensional data models, and data warehouses for business analytics
Experience with distributed data systems for logging, storage, ETL, data quality, and monitoring
10+ years of hands-on experience using SQL to build and deploy production-quality ETL pipelines
10+ years of experience with Hadoop, Hive, and MPP databases such as Vertica, AWS Redshift, or Teradata
Experience developing tools and scripts that make data faster and easier to consume
Proven success as a Staff Engineer or above at a leading technology company
Excellent written and verbal communication skills, including technical documentation
Passion for team development and for mentoring fellow engineers
Preferred Qualifications
Master's or PhD degree in Computer Science or a related field
Experience in Fintech or Payments, particularly on projects with heavy regulatory and compliance requirements
Hands-on experience with big data technologies such as Hadoop and Hive
Deep understanding of fault-tolerant systems and multi-datacenter/cloud architectures
Extensive experience with real-time data ingestion and stream processing
Proficiency in multiple programming languages (e.g., Go, Java, Python, Scala) and data storage technologies (e.g., MySQL, Cassandra, Redis)