Requisition ID: 68560
About Whirlpool Corporation
Whirlpool Corporation (NYSE: WHR) is a leading home appliance company, in constant pursuit of improving life at home. As the only major U.S.-based manufacturer of kitchen and laundry appliances, the company is driving meaningful innovation to meet the evolving needs of consumers through its iconic brand portfolio, including Whirlpool, KitchenAid, JennAir, Maytag, Amana, Brastemp, Consul, and InSinkErator. In 2024, the company reported approximately $17 billion in annual sales - close to 90% of which were in the Americas - 44,000 employees, and 40 manufacturing and technology research centers. Additional information about the company can be found at WhirlpoolCorp.com.
The team you will be a part of
The Data Science team is responsible for modeling complex business problems, discovering business insights, and identifying opportunities through the use of statistical, algorithmic, mining, and visualization techniques. In addition to advanced analytics skills, the person in this role is also proficient at integrating and preparing large, varied datasets, architecting specialized database and computing environments, and communicating results.
This role in summary
We are seeking a skilled Data Engineer to design, build, and maintain the data infrastructure that powers our analytics, products, and decision-making processes. You will be responsible for developing scalable data pipelines, optimizing data workflows, and ensuring high-quality, accessible data across the organization. This role requires strong technical expertise, problem-solving skills, and the ability to collaborate with cross-functional teams including data scientists, analysts, and software engineers.
This role is also responsible for setting up Whirlpool Data Assets and Data Products. The individual will provide leadership through mentoring and partnership with other architects and data stewards within the organization. This role is part of the Global Value Streams roadmap, focusing on designing and implementing next-generation advanced and flexible tech products to enable business growth.
Your responsibilities will include
Understand business priorities and success measures to design and implement appropriate data solutions.
Possess strong hands-on experience in Data Engineering and demonstrate expertise in modern data architectures across the organization.
Create robust and automated pipelines in GCP to ingest and process structured and unstructured data from source systems into analytical platforms, utilizing batch and streaming mechanisms leveraging cloud-native toolsets and DBT.
Collaborate with data architects, ETL developers, engineers, BI developers/data scientists, and information designers to identify and define required data structures, formats, pipelines, metadata, and workload orchestration capabilities.
Provide thought leadership and new perspectives on how to leverage GCP cloud services and capabilities.
Maintain technical skills and knowledge of market trends and competitive insights, collaborating and sharing with the technical community.
Design, develop, and maintain scalable ETL/ELT pipelines to support data ingestion, transformation, and integration from various sources.
Build and manage data warehouses, data lakes, and streaming platforms.
Ensure data quality, integrity, and governance through validation, monitoring, and automation.
Optimize data workflows for performance, scalability, and cost-efficiency.
Collaborate with stakeholders to understand business requirements and translate them into data solutions.
Implement best practices for data security, compliance, and privacy.
Troubleshoot data issues and support downstream analytics and reporting needs.
Stay current with emerging data technologies and recommend improvements to existing infrastructure.
Minimum requirements
Experience in product and/or data architecture and data engineering.
Bachelor’s degree or higher in STEM fields such as Computer Science, Information Management, Big Data & Analytics, or equivalent work experience.
Hands-on experience in designing and implementing creative data solutions using GCP (BigQuery, Dataflow, Dataproc, Pub/Sub, Spark/PySpark, Python, SQL, etc.).
Strong background in architecting solutions for data extraction, transformation, and loading from a variety of structured, unstructured, and semi-structured sources using SQL, NoSQL, and data pipelines for real-time, streaming, batch, and on-demand workloads.
Solid understanding of Agile SDLC implementation in public cloud ecosystems, including environment management, test automation, CI/CD, and resource optimization.
Experience in analytics/data management strategy, architectural blueprinting, business case development, and effort estimation for GCP-based analytics solutions.
Knowledge of GCP services (BigQuery, Dataflow, Cloud Functions, Cloud Run, GCS) is a plus.
Experience with software configuration management tools such as JIRA, Git, Jenkins, Bitbucket, and Confluence.
Familiarity with Data Analytics tools such as Tableau and Looker.
Proven experience as a Data Engineer or in a similar role.
Proficiency in SQL and at least one programming language (Python, Java, Scala, etc.).
Hands-on experience with cloud platforms (AWS, GCP, or Azure) and data services (Redshift, BigQuery, Snowflake, Databricks, etc.).
Strong knowledge of data modeling, data warehousing concepts, and distributed systems.
Experience with workflow orchestration tools (Airflow, Prefect, Luigi, etc.).
Familiarity with streaming technologies (Kafka, Kinesis, Spark Streaming, Flink, etc.).
Understanding of CI/CD pipelines, DevOps practices, and version control systems (Git).
Excellent problem-solving, communication, and teamwork skills.
Preferred skills and experiences
Background in CPG/Retail or eCommerce.
GCP Data Engineering or Cloud Architect Certification.
Experience with full stack development.
Experience with machine learning pipelines or supporting data science workflows.
Knowledge of data governance frameworks (GDPR, HIPAA, etc.).
Familiarity with containerization and orchestration (Docker, Kubernetes).
What we offer
Flexible schedule.
"No dress code".
Gympass.
Transportation voucher, shuttle buses, or free parking at the company.
Meals on site.
Benefits such as payroll-deducted or social loans, health plan, dental plan, life insurance, and a private pension plan compatible with the market.
Employee Support Program, with 24-hour assistance from legal, social, and financial advisors, social workers, and psychologists.
Daycare assistance or an on-site nursery.
On-site services: beauty and aesthetics salon, bank branch, laundry, cafeteria, restaurant, and lactation room.
Two weeks of remote work from anywhere.
After 5 years with the company, eligible employees can take four weeks of paid leave.
Discount on products through Compra Certa.
Discount on insurance (pet, auto, home, bike, travel, and more).
Extended maternity/paternity leave.
Learn more at https://www.whirlpoolcareers.com/trabalhe-conosco-no-brasil/
Connect with us and learn more about Whirlpool Corporation
See what it's like to work at Whirlpool by visiting Whirlpool Careers. Additional information about the company can be found on Facebook, Twitter, LinkedIn, Instagram and YouTube.
Whirlpool Corporation is committed to equal employment opportunity and prohibits any discrimination on the basis of race or ethnicity, religion, sex, pregnancy, gender expression or identity, sexual orientation, age, physical or mental disability, veteran status, or any other category protected by applicable law.