Think Beyond The Label Jobs


Job Information

TEKsystems Big Data Software Engineer 3 in Jacksonville, Florida

Description:

We're on a mission to bring the focus back to what truly matters – patient care. As the leading healthcare engagement platform, we're the heartbeat of an industry that impacts millions. Connecting over 2 million providers to health plans and processing over 13 billion transactions annually, our influence is continually expanding.

Join our energetic, dynamic, and forward-thinking team where your ideas are celebrated, innovation is encouraged, and every contribution counts. We're transforming the healthcare landscape, solving communication challenges, and creating connections that empower the nation's premier healthcare ecosystem.

Reporting to the Application Development Manager, the Big Data Software Engineer III will work on a dedicated team of engineers developing, enhancing, and maintaining Availity’s high-transaction Provider Data Management platform.

Sponsorship, in any form, is not available for this position.

Location: Remote US

Why work on this team:

• This team supports a high-transaction platform that directly impacts the patient experience

• This team works to continually improve processes and enhance platform capabilities

What you will be doing:

• Participating in daily stand-ups at 9:30am ET

• Developing a scalable and resilient cloud data platform and scalable data pipelines

• Ensuring industry best practices around data pipelines, metadata management, data quality, data governance, and data privacy

• Building a highly scalable AWS Infrastructure (from scratch or through 3rd party products) to enable Big Data Processing in the platform

• Identifying optimizations in cloud resource usage to minimize costs while maintaining system reliability, including leveraging reserved instances and spot instances effectively

• Identifying performance-sensitive considerations within development best practices, and troubleshooting across the data platform using tools (e.g., Splunk, New Relic, CloudWatch) for performance measurement and monitoring

• Participating in coding best practices, guidelines and principles that help engineers write clean, efficient, and maintainable code

• Participating in code reviews to catch issues, improve code quality, and provide constructive feedback to teammates

• Working on ETL transformations, which includes gathering raw data and files from the client, transforming them into Availity’s format, and sending them down the ETL pipeline for further processing

• Working on a team following Agile Scrum principles

• Incorporating development best practices

• Ensuring your code is efficient, optimized, and performant

• Collaborating on programming or development standards

• Managing technical debt and applying security principles

• Bringing innovative ideas and products to the organization

• Performing unit testing and complex debugging to ensure quality

• Learning new things & sharing your knowledge with others
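The ETL responsibility described above (gathering raw client files, converting them into an internal format, and passing them down the pipeline) could be sketched roughly as below. This is a hedged illustration only: the field names, record shape, and normalization rules are invented for the example and are not Availity's actual schema.

```python
# Hypothetical sketch of an ETL transform step: normalize raw client
# records into an internal format before they continue down the pipeline.
# All field names here are illustrative assumptions.

def transform_record(raw: dict) -> dict:
    """Map one raw client record into the (hypothetical) internal format."""
    return {
        "provider_id": raw["prov_id"].strip(),
        "npi": raw["npi"].zfill(10),  # pad identifier to 10 digits
        "name": raw["last_name"].title() + ", " + raw["first_name"].title(),
        "state": raw["state"].upper(),
    }

def run_etl(raw_records: list[dict]) -> list[dict]:
    """Extract -> Transform; the result would then be handed off for Load."""
    return [transform_record(r) for r in raw_records]

raw = [{"prov_id": " 42 ", "npi": "12345",
        "last_name": "doe", "first_name": "jane", "state": "fl"}]
print(run_etl(raw))
```

In a production pipeline of the kind the posting describes, the same transform logic would typically run inside a distributed framework such as Spark rather than over in-memory lists.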

Requirements:

• Bachelor’s degree, preferably in Computer Science, Engineering, or another quantitative field

• 4+ years of related experience in designing and implementing enterprise applications using big data

• 2+ years of experience working with Java EE

• 2+ years of experience working with large-scale data and developing SQL queries

• 2+ years of hands-on experience with big data processing on AWS, such as Apache Spark with Scala, AWS EMR, Airflow, and Redshift

• 3+ years of experience with RESTful APIs and web services

• Excellent communication skills, including discussing technical concepts, conducting pair-programming sessions, and explaining development concepts

• In-depth understanding of the Spark framework, scripting languages (e.g., Python, Bash, Node.js), and programming languages (e.g., SQL, Java, Scala) to design, build, and maintain complex data processing and ETL (Extract, Transform, Load) tasks and AWS automation.

• A firm understanding of unit testing.

• Possess in-depth knowledge of AWS services and data engineering tools to diagnose and solve complex issues efficiently, specifically AWS EMR for big data processing.

• In-depth understanding of Git or another distributed version control system.

• Excellent communication. Essential to performing at maximum efficiency within the team.

• Collaborative attitude. This role is part of a larger, more dynamic team that nurtures collaboration.

• Strong technical, process, and problem-solving proficiency.

• Must have experience with SQL and relational database systems (e.g., Oracle, SQL Server).

• Experience in the healthcare industry or another highly regulated field is a plus
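As a minimal illustration of the unit-testing expectation above, the sketch below uses Python's built-in unittest module; `normalize_state` is an invented helper added purely for the example, not part of any real codebase mentioned in this posting.

```python
import unittest

def normalize_state(code: str) -> str:
    """Hypothetical helper: normalize a two-letter US state code."""
    code = code.strip().upper()
    if len(code) != 2 or not code.isalpha():
        raise ValueError(f"invalid state code: {code!r}")
    return code

class NormalizeStateTest(unittest.TestCase):
    # Happy path: whitespace is stripped and the code is uppercased.
    def test_lowercase_input_is_uppercased(self):
        self.assertEqual(normalize_state(" fl "), "FL")

    # Error path: anything other than two letters is rejected.
    def test_invalid_code_raises(self):
        with self.assertRaises(ValueError):
            normalize_state("Florida")

# Run the tests without exiting the interpreter.
result = unittest.TextTestRunner().run(
    unittest.TestLoader().loadTestsFromTestCase(NormalizeStateTest)
)
```

Covering both the expected behavior and the failure mode, as here, is the kind of "firm understanding of unit testing" the requirements list asks for.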

Skills:

Python, PySpark, SQL, Scala, Spark, AWS, Amazon Web Services

Top Skills Details:

Python, PySpark, SQL, Scala, Spark, AWS

Additional Skills & Qualifications:

Python and PySpark are nice to have

Experience Level:

Intermediate Level

About TEKsystems:

We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.

The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
