Description
Position Overview
We are seeking a talented Data Engineer with strong experience in Python, AWS, and Databricks to design and build scalable data pipelines and modern data platforms. The ideal candidate will help develop and maintain data infrastructure that supports analytics, machine learning, and business intelligence initiatives. This role requires hands-on experience working with large datasets, cloud-native architectures, and distributed data processing frameworks.
Key Responsibilities
Design, build, and maintain scalable data pipelines and ETL/ELT workflows using Python and cloud technologies.
Develop and optimize data solutions using AWS services and Databricks.
Build and manage data lakes and data warehouses for structured and unstructured data.
Implement data transformation and processing pipelines using Apache Spark within Databricks.
Integrate data from multiple sources including APIs, databases, and streaming systems.
Ensure data quality, governance, security, and compliance across the data platform.
Monitor pipeline performance and troubleshoot data pipeline failures or latency issues.
Collaborate with data analysts, data scientists, and business stakeholders to deliver reliable datasets.
Optimize storage and compute costs within the AWS ecosystem.
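To give a concrete flavor of the pipeline work described above, here is a minimal, purely illustrative Python sketch of the extract-transform-load pattern. All record fields, sample values, and function names are hypothetical (not from this posting); a production pipeline in this role would typically use Spark on Databricks and AWS services such as S3, Glue, and Redshift rather than in-memory lists.

```python
# Illustrative ETL sketch: extract raw records, apply a basic
# data-quality gate while transforming, and "load" the result.
# All data and names are hypothetical stand-ins.

def extract():
    # Stand-in for reading from an API, database, or S3 object.
    return [
        {"order_id": 1, "amount": "19.99", "region": "us-east"},
        {"order_id": 2, "amount": "5.00", "region": "us-west"},
        {"order_id": 3, "amount": "bad", "region": "us-east"},
    ]

def transform(rows):
    # Cast types, drop malformed records (a simple data-quality check),
    # and aggregate revenue per region.
    totals = {}
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip rows that fail validation
        totals[row["region"]] = totals.get(row["region"], 0.0) + amount
    return totals

def load(totals):
    # Stand-in for writing to a warehouse table (e.g., Redshift).
    return sorted(totals.items())

result = load(transform(extract()))
print(result)  # [('us-east', 19.99), ('us-west', 5.0)]
```

In practice the same shape appears at scale: `extract` becomes a Spark read from S3, `transform` a set of DataFrame operations, and `load` a write to a warehouse or lakehouse table.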
Requirements
Required Qualifications
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related field.
3+ years of experience in data engineering or data platform development.
Strong programming experience with Python for data processing and automation.
Hands-on experience with AWS cloud services such as:
Amazon S3
AWS Glue
AWS Lambda
Amazon Redshift
Amazon EMR
Experience working with Databricks and Apache Spark for large-scale data processing.
Strong knowledge of SQL and relational databases.
Experience designing and maintaining ETL/ELT pipelines.
Preferred Qualifications
Experience with data orchestration tools such as Airflow or AWS Step Functions.
Familiarity with streaming data technologies (Kafka, Kinesis, or Spark Streaming).
Experience with CI/CD pipelines and DevOps practices.
Knowledge of data modeling, data warehousing, and lakehouse architectures.
Experience working in Agile development environments.
Technology Doesn't Change the World, People Do.®
Robert Half is the world's first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary, and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.
Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity, whenever you choose, even on the go. Download the Robert Half app (https://www.roberthalf.com/us/en/mobile-app) and get 1-tap apply, notifications of AI-matched jobs, and much more.
All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit roberthalf.gobenefits.net for more information.
© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking "Apply Now," you're agreeing to Robert Half's Terms of Use (https://www.roberthalf.com/us/en/terms) and Privacy Notice (https://www.roberthalf.com/us/en/privacy).