
Job Information
UnitedHealth Group Data Engineering Consultant - Azure Databricks, SQL, Python in Bangalore, India
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
Data Pipeline Development:
Design, develop, and maintain robust data pipelines and ETL processes using Azure Data Factory, Databricks, and other Azure services
Implement scalable and efficient data transformation solutions to handle large volumes of data
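A pipeline responsibility like the above can be pictured as a minimal extract-transform-load step. The sketch below is plain Python with hypothetical stage functions and record shapes; it stands in for Azure Data Factory activities or Databricks jobs, which it does not use.

```python
# Minimal extract-transform-load sketch in plain Python.
# Stage names and record fields are illustrative, not an Azure API.

def extract(rows):
    """Simulate reading raw records from a source system."""
    return list(rows)

def transform(records):
    """Drop records missing a member id and normalize names."""
    return [
        {**r, "name": r["name"].strip().title()}
        for r in records
        if r.get("member_id") is not None
    ]

def load(records, sink):
    """Append cleaned records to a target store (a list here)."""
    sink.extend(records)
    return len(records)

raw = [
    {"member_id": 1, "name": "  ada lovelace "},
    {"member_id": None, "name": "unknown"},
    {"member_id": 2, "name": "GRACE HOPPER"},
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

In a real pipeline each stage would be a separately orchestrated, monitored activity; keeping the stages as pure functions makes the transformation logic easy to unit-test before deployment.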
Data Integration:
Integrate data from various sources such as on-premises databases, cloud storage solutions (e.g., Azure Blob Storage), and third-party services into a unified data platform
Ensure seamless data ingestion and synchronization across diverse systems
Data Modeling & Architecture:
Design and optimize database schemas, data models, and storage structures to support analytical requirements
Develop strategies for effective data partitioning, indexing, and clustering to enhance query performance
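The partitioning strategy mentioned above can be sketched as hash-based bucketing: records with the same key always land in the same partition, which keeps related data together for query pruning. This is a plain-Python illustration with a hypothetical key and bucket count; production systems would use table partitioning in Azure SQL/Synapse or Spark's `partitionBy` instead.

```python
# Hash-based partitioning sketch: stable key-to-bucket assignment.
import hashlib

def bucket_for(key, num_buckets):
    """Map a partition key to a stable bucket via a cryptographic hash."""
    digest = hashlib.sha256(str(key).encode("utf-8")).hexdigest()
    return int(digest, 16) % num_buckets

def partition(records, key_field, num_buckets):
    """Group records into buckets so rows sharing a key stay together."""
    buckets = {i: [] for i in range(num_buckets)}
    for rec in records:
        buckets[bucket_for(rec[key_field], num_buckets)].append(rec)
    return buckets

rows = [{"claim_id": i, "state": s}
        for i, s in enumerate(["KA", "MH", "KA", "TN"])]
parts = partition(rows, "state", 4)
```

Using a deterministic hash (rather than Python's per-process `hash()`) means the same key maps to the same bucket across runs and machines, which matters when partitions map to physical storage paths.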
Collaboration & Stakeholder Management:
Work closely with cross-functional teams including data scientists, analysts, product managers, and business stakeholders to understand their needs and deliver high-quality data solutions
Provide technical guidance and mentorship to junior engineers within the team
Performance Optimization:
Monitor the performance of ETL processes and Databricks jobs; troubleshoot issues related to slow queries or bottlenecks
Optimize code for efficiency in both processing time and resource utilization
Security & Compliance:
Implement robust security measures for data access control using Azure Active Directory (AAD), role-based access control (RBAC), encryption at rest/in transit
Ensure compliance with industry standards such as GDPR or HIPAA where applicable
Automation & Scripting:
Develop Python scripts for automation tasks, including system monitoring, logging enhancements, and error-handling workflows
Create automated testing frameworks for validating ETL logic before deployment
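A pre-deployment validation check of the kind described above can be as simple as a function that inspects an ETL output batch and reports problems. The column names and rules below are hypothetical examples, not a specific framework.

```python
# Sketch of an automated validation gate for ETL output.
# Field names and rules are illustrative.

def validate_batch(records, required_fields, key_field):
    """Return a list of human-readable problems; an empty list means pass."""
    problems = []
    seen_keys = set()
    for i, rec in enumerate(records):
        for field in required_fields:
            if rec.get(field) in (None, ""):
                problems.append(f"row {i}: missing {field}")
        key = rec.get(key_field)
        if key in seen_keys:
            problems.append(f"row {i}: duplicate {key_field} {key}")
        seen_keys.add(key)
    return problems

good = [{"claim_id": 1, "amount": 10.0}, {"claim_id": 2, "amount": 5.5}]
bad = [{"claim_id": 1, "amount": None}, {"claim_id": 1, "amount": 2.0}]
ok = validate_batch(good, ["claim_id", "amount"], "claim_id")
errs = validate_batch(bad, ["claim_id", "amount"], "claim_id")
```

In practice such checks would run in a CI step or as a pipeline activity, failing the deployment when the returned problem list is non-empty.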
Monitoring & Maintenance:
Establish monitoring mechanisms using Azure Monitor or similar tools to proactively identify potential issues in the data pipelines
Perform regular maintenance activities such as software updates/upgrades while minimizing disruptions
Documentation & Best Practices:
Maintain comprehensive documentation of architectural decisions made during projects, ensuring that work can be reproduced by other team members
Establish coding standards and best practices within the team, ensuring consistency across projects and tasks
Innovation & Continuous Improvement:
Stay current with emerging trends and technologies in the big-data ecosystem; evaluate new tools and frameworks that can improve our existing architecture and processes
Foster a culture of continuous improvement by identifying opportunities for process enhancements through periodic reviews and retrospectives
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications:
Bachelor's Degree in Computer Science, Information Technology, Engineering, or a related field
8+ years of experience in data engineering, with a strong focus on building and managing large-scale data pipelines and ETL processes
Extensive experience working with Azure Cloud Services including but not limited to Azure Data Factory, Azure SQL Database, Azure Synapse Analytics, Azure Blob Storage
Hands-on experience with version control systems, particularly Git
Demonstrated experience working in Agile/Scrum environments, actively participating in sprint planning and review sessions and contributing to the team's continuous improvement initiatives
In-depth knowledge of SQL for querying and manipulating data across various database platforms
Knowledge of regulatory requirements (e.g., GDPR/HIPAA) impacting data handling/storage practices
Understanding of best practices related to data security, access-control mechanisms (RBAC), and encryption techniques within cloud environments
Familiarity with other languages and frameworks, such as Python and PySpark, is beneficial
Proven expertise in Databricks, including Spark-based data processing, Delta Lake management, and performance optimization
Solid proficiency in Python scripting for developing automation scripts and ETL tasks
Proven analytical skills for troubleshooting complex issues in data ingestion and processing pipelines
Proven ability to design efficient algorithms for large-scale data transformations
Proven excellent communication skills for collaborating effectively with cross-functional teams, including product managers and business stakeholders, and translating their requirements into technical specifications
Proven track record of mentoring junior engineers, providing technical guidance and support that fosters their professional growth
Proven ability to work independently, managing multiple priorities and projects simultaneously and meeting deadlines without compromising the quality standards set by the organization
Preferred Qualifications:
A Master's degree
Relevant certifications such as Microsoft Certified: Azure Data Engineer Associate or Databricks Certified Professional Data Engineer
Experience with modern data warehousing concepts and tools
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.