In this age of disruption, organizations need to navigate the future with confidence by tapping into the power of data analytics, robotics, and cognitive technologies such as Artificial Intelligence (AI). Our Strategy & Analytics portfolio helps clients leverage rigorous analytical capabilities and a pragmatic mindset to solve the most complex of problems. By joining our team, you will play a key role in helping our clients uncover hidden relationships in vast troves of data and in transforming the Government and Public Services marketplace.
Work you'll do
As a Data Engineer, you will interpret business needs, select appropriate technologies, and implement data governance for shared and/or master data sets. You will work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal data solutions.
Responsibilities include:
Develop and design data pipelines to support an end-to-end solution.
Develop and maintain artifacts related to ETL processes, e.g., schemas, data dictionaries, and transforms.
Manage production data across multiple datasets, ensuring fault tolerance and redundancy.
Design and develop robust and functional dataflows to support raw data and expected data.
Collaborate with the rest of the data engineering team to design and launch new features. This includes coordinating and documenting dataflows, capabilities, etc.
Design and develop databases to support multiple user groups with various levels of access to raw and processed data.
The team
Deloitte's Government and Public Services (GPS) practice - our people, ideas, technology, and outcomes - is designed for impact. Serving federal, state, and local government clients as well as public higher education institutions, our team of more than 15,000 professionals brings a fresh perspective to help clients anticipate disruption, reimagine the possible, and fulfill their mission promise.
The GPS AI & Data Engineering offering is responsible for developing advanced analytics products and applying data visualization and statistical programming tools to enterprise data in order to advance and enable key mission outcomes for our clients. Our team supports all phases of analytic work product development, from identifying key business questions through data collection and ETL, to performing analyses with a wide range of statistical, machine learning, and applied mathematical techniques, to delivering insights to decision-makers. Our practitioners give special attention to the interplay between data and the business processes that produce it, and to the decision-makers who consume insights.
Qualifications
Required:
Bachelor's degree required
Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future
Active TS/SCI security clearance required
3+ years of experience working with software platforms and services, such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar.
3+ years of experience with datastores such as MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, Redis, and graph databases such as Neo4j, Memgraph, or others.
Ability to work on-site in the Tampa, FL area
Preferred:
Familiar with Linux/Unix server environments.
Familiar with common data structures needed to support machine learning packages such as scikit-learn, NLTK, spaCy, and others.
Familiar with, or eager to become familiar with, data structures needed to support Generative AI pipelines, such as vector databases, NER, and RAG.
Information for applicants with a need for accommodation: https://www2.deloitte.com/us/en/pages/careers/articles/join-deloitte-assistance-for-disabled-applicants.html
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability or protected veteran status, or any other legally protected basis, in accordance with applicable law.