About Us
Perficient is the global AI-first consultancy. Our team of strategists, designers, technologists, and engineers partners with the world's most innovative enterprises and admired brands to deliver real business results through the transformative power of AI. As part of our AI-First strategy, we empower every employee to build AI fluency and actively engage with AI tools to drive innovation and efficiency. We break boundaries, obsess over outcomes, and shape the future for our clients. Join a company where bold ideas and brilliant minds converge to redefine what's possible - while building a career filled with growth, balance, and purpose.
Role Area / Responsibilities
DevOps Engineer
Automate deployments and manage CI/CD pipelines for Neo4j updates and configuration changes
Data Engineer
Design and implement data ingestion pipelines from Azure Databricks services. Transform and model data for graph structures
Graph Security Strategy
Use the structure and metadata of the graph (labels, relationship types, and property keys) to enforce fine-grained access control and ensure data integrity
Performance & Monitoring Lead
Monitor platform performance and Neo4j query efficiency. Tune platform configurations and optimize Cypher queries
Graph Data Scientist
Use Neo4j Graph Data Science (GDS) for advanced analytics. Apply algorithms like PageRank, community detection, and similarity search
Visualization Specialist
Implement Neo4j Bloom to visualize and explore graphs, enabling validation of graph representations with business stakeholders.
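The Graph Data Scientist role above applies algorithms such as PageRank. As a minimal illustration of what Neo4j GDS computes in-database at scale, here is a pure-Python PageRank sketch; the toy edge list and damping factor are illustrative assumptions, not data from this role:

```python
# Minimal pure-Python PageRank sketch. Neo4j GDS runs this kind of
# computation in-database at scale; this toy version only illustrates
# the iteration. The edge list and damping factor are illustrative.

def pagerank(edges, damping=0.85, iterations=20):
    """Iterative PageRank over a directed edge list [(src, dst), ...]."""
    nodes = {n for edge in edges for n in edge}
    out_degree = {n: 0 for n in nodes}
    for src, _ in edges:
        out_degree[src] += 1
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        # Every node keeps a baseline share, plus damped inbound rank.
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for src, dst in edges:
            new_rank[dst] += damping * rank[src] / out_degree[src]
        rank = new_rank
    return rank

edges = [("A", "B"), ("B", "C"), ("C", "A"), ("A", "C")]
scores = pagerank(edges)
```

With no dangling nodes, the scores stay normalized to 1, and "C" outranks "B" because it receives links from both "A" and "B".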
Skill/Tool
Core Technical Skills
Python: Scripting ETL logic and interacting with Neo4j via drivers such as neo4j or py2neo
SQL: Querying and transforming data from relational sources
Apache Spark: Distributed data processing in Azure Databricks
Cypher: Neo4j's query language for graph modeling and manipulation
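Python scripting and Cypher come together in ETL work. A minimal sketch of shaping a relational row into a parameterized Cypher MERGE; the Customer/Product labels and row shape are illustrative assumptions, and the official neo4j driver would execute the query via session.run:

```python
# Hedged sketch: mapping a relational row onto a parameterized Cypher
# MERGE. Labels and properties are illustrative assumptions. With the
# official `neo4j` driver, the query would run as:
#     session.run(CYPHER, **to_params(row))

CYPHER = """
MERGE (c:Customer {id: $customer_id})
MERGE (p:Product  {sku: $sku})
MERGE (c)-[r:PURCHASED]->(p)
SET   r.quantity = $quantity
"""

def to_params(row):
    """Map a relational row (dict of strings) to Cypher parameters."""
    return {
        "customer_id": row["customer_id"],
        "sku": row["sku"],
        "quantity": int(row["quantity"]),  # cast numerics explicitly
    }

params = to_params({"customer_id": "c1", "sku": "s42", "quantity": "3"})
```

Parameterized queries let the driver handle escaping and let Neo4j cache the query plan across rows.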
Cloud & Platform Skills
Azure Databricks: Managing clusters, notebooks, Delta Lake, and job orchestration
Azure Data Lake Storage (ADLS): Ingesting and managing structured/unstructured data
Azure Data Factory (ADF): Orchestrating pipelines across services
Graph Data Engineering
Graph Modeling Concepts: Designing and organizing nodes, relationships, and properties
Neo4j Integration: Using drivers or REST APIs for ingestion
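For driver-based ingestion, batching rows through an UNWIND query keeps each transaction small. A hedged sketch; the Device label, row contents, bolt URL, and credentials are placeholders, not details from this posting:

```python
# Hedged sketch of batched Neo4j ingestion. Neo4j ingests lists of
# rows efficiently via UNWIND; the label, rows, and connection
# details below are illustrative placeholders.

UNWIND_QUERY = """
UNWIND $rows AS row
MERGE (n:Device {id: row.id})
SET   n.name = row.name
"""

def chunk(rows, size):
    """Split rows into fixed-size batches to bound transaction size."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

rows = [{"id": i, "name": f"device-{i}"} for i in range(5)]
batches = chunk(rows, 2)

# With the official driver (assumes a reachable Neo4j instance):
# from neo4j import GraphDatabase
# driver = GraphDatabase.driver("bolt://localhost:7687",
#                               auth=("neo4j", "password"))
# with driver.session() as session:
#     for batch in batches:
#         session.run(UNWIND_QUERY, rows=batch)
```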
Data Engineering Practices
ETL/ELT Pipeline Design: Building scalable, fault-tolerant pipelines
Data Governance & Security: Managing access and sensitive data across cloud services
Monitoring & Optimization: Tuning Spark jobs and Neo4j queries
Bonus Skills
Version Control & DevOps: Git, GitHub Actions, Azure DevOps for CI/CD
Visualization Tools: Power BI, Neo4j Bloom for graph exploration
Certifications: Azure DP-203, Databricks Data Engineer Professional