Principal Software Development Engineer - AI Data Platform

Oracle • In Person

Posted 2 days, 12 hours ago

Job Description

Oracle’s Forward Deployed Engineer (FDE) team is hiring a Principal Software Development Engineer - AI Data Platform to help global customers unlock the full potential of their data. You will provide expert architectural guidance focused on designing, optimizing, and scaling modern AI/ML-centric data platforms. As a key member of Oracle’s Analytics and AI Service Excellence organization, you will work closely with enterprise customers, product management, and engineering to ensure seamless adoption of Oracle AI Data Platform, Gen AI services, and associated cloud technologies.

Key Responsibilities:

Design, implement, and maintain scalable software components and services that support AI/ML workloads.

Build APIs, SDKs, and automation frameworks to streamline the adoption of Oracle AI Data Platform and Gen AI services (an illustrative sketch follows this list).

Optimize performance, scalability, and reliability of distributed data/AI systems.

Collaborate with cross-functional teams (engineering, product, and field) to solve complex technical challenges.

Participate in code reviews, testing, and CI/CD to ensure high-quality deliverables.

Document technical designs and contribute to knowledge-sharing (e.g., blogs, internal docs, demos).

Continuously explore new tools, frameworks, and best practices in AI, cloud, and data engineering.
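To give a concrete flavor of the API and automation work described above, here is a minimal sketch of a Python REST endpoint. The route name, payload shape, and truncation-based "summary" are illustrative assumptions only and do not represent any Oracle AI Data Platform or Gen AI API.

```python
# Minimal sketch of a service endpoint of the kind this role might build.
# The endpoint, payload, and logic are hypothetical, not an Oracle API.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/v1/summarize", methods=["POST"])
def summarize():
    """Accept a JSON document and return a placeholder summary."""
    payload = request.get_json(force=True)
    text = payload.get("text", "")
    # A real service would call a Gen AI model here; we simply truncate.
    summary = text[:200]
    return jsonify({"summary": summary, "length": len(text)})

if __name__ == "__main__":
    app.run(port=8080)
```

A request such as `curl -X POST localhost:8080/v1/summarize -H "Content-Type: application/json" -d '{"text": "..."}'` would return the placeholder summary.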

Primary Skills:

Experience with LLMs and agentic frameworks (e.g., MCP, LangChain, CrewAI, Semantic Kernel).

Knowledge of retrieval-augmented generation (RAG) pipelines and vector databases (e.g., Oracle Database 23ai, FAISS, Pinecone, Weaviate); a minimal retrieval sketch follows this list.

Familiarity with OCI Gen AI Services and model lifecycle workflows.

Solid Python and REST API skills.

Exposure to building autonomous agents and orchestration pipelines.

Experience working with cloud platforms such as Oracle Cloud Infrastructure (OCI), including OCI Big Data Service (BDS) and Big Data Appliance (BDA).

Proficiency in big data technologies such as Hadoop, Spark, Kafka, and NoSQL databases.

Ability to design and implement complex data architectures that are scalable, secure, and efficient.

Experience managing and optimizing large-scale databases.

Solid understanding of networking concepts, including the ability to design and optimize data transmission networks, configure network settings, and diagnose and resolve network-related issues.

Strong troubleshooting and problem-solving skills.

Excellent communication and collaboration skills.

Commitment to continuous learning and staying current with the latest big data technologies and trends.
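As an illustration of the RAG and vector-database skills listed above, the sketch below indexes a few placeholder vectors with FAISS and runs a nearest-neighbor query. The dimensionality, document ids, and random vectors are assumptions for illustration; a production pipeline would use a real embedding model and a managed vector store such as Oracle Database 23ai, Pinecone, or Weaviate.

```python
# Minimal retrieval sketch, assuming FAISS as the vector index and random
# vectors standing in for real embeddings.
import numpy as np
import faiss

dim = 128                                   # embedding dimensionality (assumed)
docs = ["doc-a", "doc-b", "doc-c"]          # placeholder document ids
vectors = np.random.rand(len(docs), dim).astype("float32")

index = faiss.IndexFlatL2(dim)              # exact L2 index; no training needed
index.add(vectors)                          # store document vectors

query = np.random.rand(1, dim).astype("float32")
distances, ids = index.search(query, 2)     # top-2 nearest documents
print([docs[i] for i in ids[0]])
```

In a full RAG pipeline, the retrieved documents would then be passed as context to an LLM prompt rather than simply printed.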

Desirable Skills:

Experience working with multiple cloud platforms.

Certification in Big Data or a related field.
