Data Engineers

Join the Utah Data Coordinating Center (DCC) as a Data Engineer, where your work will directly enable innovative clinical research at the University of Utah and across national partners. You'll lead the design of scalable data systems, define and enforce architecture standards, and work alongside software developers, data analysts, and research teams to ensure our platforms evolve with the needs of scientific discovery. This is a growth-focused role ideal for someone who thrives in a collaborative, mission-driven environment. The Utah DCC supports large-scale health data infrastructure that underpins national emergency response, clinical registries, and federal research initiatives.

Establish project teams and provide overall direction for technical projects from initiation through delivery. Manage project requirements, estimation, and budgets. Formulate project scope and delivery strategies and establish milestones and schedules. Maintain and report project status and monitor the progress of all team members. Gather required data from end users to evaluate objectives, goals, and scope and to create technical specifications. Serve as a liaison between technical and non-technical departments to ensure that all targets and requirements are met. Keep leadership informed of key issues that may impact project completion, budget, or other results.

As a Data Engineer, your responsibilities will include:

1. Design, develop, and maintain database architecture following industry best practices
Design and implement scalable, secure, and high-performing database solutions aligned with industry standards and architectural best practices. This includes data modeling (conceptual, logical, and physical), schema design, indexing strategies, performance tuning, backup and recovery planning, and ensuring data integrity and consistency.
Establish governance standards, naming conventions, version control processes, and documentation to support maintainability, reliability, and long-term scalability across environments.

2. Build, optimize, and maintain scalable data pipelines
Design, develop, and orchestrate reliable, high-performance data pipelines from initial data ingestion through final delivery. This includes data pipeline development, orchestration, transformation logic, and supporting data models optimized for analytics and operational workloads.

3. Develop and optimize data processing and automation code
Design, implement, and maintain robust code for data extraction, transformation, integration, and analysis using appropriate languages and frameworks. Optimize performance, ensure data accuracy, and uphold high standards for code quality, reliability, and maintainability in alignment with software and data engineering best practices.

4. Drive continuous improvement and innovation in cloud data technologies (AWS-focused)
Stay current with emerging data engineering technologies, industry trends, and evolving AWS services to continuously enhance platform capabilities and architectural standards. Evaluate and adopt appropriate AWS services (e.g., S3, Glue, Lambda, Redshift, RDS, EMR, Step Functions, Lake Formation) to improve scalability, performance, cost efficiency, and reliability. Balance innovation with operational excellence by maintaining and optimizing existing services, enforcing best practices, and ensuring stable, secure, and high-performing production environments.

5. Collaborate with business partners to develop scalable data solutions
Partner with internal teams and external stakeholders to design and deliver innovative data solutions that support evolving business needs. This includes developing and exposing data through APIs, building and maintaining multidimensional cubes and semantic models, enabling secure data sharing, and creating reusable data services.
Translate business requirements into scalable technical solutions that align with enterprise architecture standards, governance policies, and performance expectations.

6. Implement and maintain CI/CD and version control best practices
Design, implement, and support robust CI/CD pipelines to automate build, test, deployment, and release processes for data pipelines, database objects, and cloud infrastructure. Enforce effective version control practices using Git-based workflows, including branching strategies, pull requests, code reviews, and release management. Promote automated testing, infrastructure as code (IaC), and deployment standards to ensure consistency, traceability, reliability, and rapid, low-risk delivery across environments.

7. Develop and support data pipelines for business intelligence and analytics
Design, build, and maintain reliable, scalable data pipelines that deliver curated, analytics-ready datasets to support business intelligence and reporting needs. Implement transformation logic, data validation checks, and orchestration workflows to ensure accuracy, consistency, and timely data availability. Proactively monitor pipeline performance, troubleshoot data issues, and optimize data flows to support dashboards, KPI tracking, ad hoc analysis, and enterprise reporting requirements.

8. Support and implement data security and compliance requirements
Partner with operations and security teams to implement and maintain data security controls, access policies, encryption standards, and compliance requirements to safeguard sensitive and regulated data.

9. Monitor, troubleshoot, and enhance pipeline performance
Continuously monitor data workflows, resolve data processing issues, identify bottlenecks, and enhance performance across ETL/ELT processes, pipelines, and data integrations.

10. Gather requirements and document data workflows
Collaborate with business stakeholders to collect requirements for data pipelines, integrations, and reporting needs.
Document data processes, transformation logic, workflow designs, and operational procedures for cross-team visibility and long-term maintainability.

11. Operate effectively both independently and within cross-functional teams
Demonstrate the ability to manage priorities, drive initiatives, and deliver high-quality solutions independently while also contributing collaboratively within cross-functional teams. Engage proactively with engineering, BI, security, operations, and business stakeholders to align on requirements, resolve issues, and deliver integrated data solutions. Communicate clearly, share knowledge, and support team objectives to ensure successful project outcomes and continuous improvement.

The Utah DCC offers a career ladder for Data Engineers and provides growth and professional development opportunities.

To learn more about the Utah DCC, visit http://uofuhealth.org/UtahDCC
Learn more about the great benefits of working for the University of Utah: benefits.utah.edu