Summary

The United States Postal Service, Office of Inspector General (OIG) is seeking a highly qualified and versatile individual to fill our Data Engineer position in the Research and Insights Solution Center, located in Arlington, Virginia. Bring your skills and voice to our team!

Responsibilities

In-Office Requirement: The U.S. Postal Service, Office of Inspector General values collaboration, teamwork, and effective communication to foster our dynamic working and learning environment. To facilitate this, the position adheres to our agency's telework policy and requires an in-office presence of at least two days per pay period. Remote work is not available for this role.

The Research and Insights Solution Center (RISC) is the chief data and research component of the OIG, composed of data scientists, data analysts, programmatic subject matter experts, geographic information system professionals, data engineers, research specialists, economists, and public policy analysts. Our analytics group offers the opportunity to drive value for the organization by designing and developing analytical solutions for auditors, investigators, and researchers.

The RISC analytics group is currently seeking an experienced Data Engineer and Full Stack Developer who will provide expert-level advice on data engineering and build data pipelines using sound DevSecOps practices. In this role, you will be responsible for designing, developing, and maintaining data pipelines, web applications, and machine learning workflows.

The USPS OIG uses a Pay Banding system, which is equivalent to the Federal GS scale. Grade and salary determinations will be made based upon a candidate's education and professional experience. This position is being advertised at the Journey Band level, equivalent to GS-9 through GS-11. The salary range for this position is $68,405.00 - $107,590.00; these figures include locality pay. Promotion potential to a GS-13 equivalent is at management's discretion.
Please note that the duties and responsibilities associated with this position may vary based upon the agency's needs at the time of hire. The following description of major duties and responsibilities is only intended to give applicants a general overview of the expectations.

- Work with cross-functional teams to deploy scalable data solutions on cloud platforms such as Azure, ensuring alignment with organizational goals.
- Design and implement data warehousing solutions that support analytics and data science initiatives, using advanced knowledge of database structures, data models, and performance optimization techniques.
- Develop and manage automated data pipelines to maintain data integrity, deploy machine learning models, and facilitate collaboration with data scientists and analysts.
- Use programming languages and tools such as Python, Databricks, and Azure Data Lake to manipulate structured and unstructured data, creating efficient and scalable data pipelines.
- Integrate diverse data sources, including flat files, relational databases, SaaS applications, and web services, using techniques such as JDBC/ODBC connections, REST APIs, and web scraping.
- Implement monitoring solutions; troubleshoot and resolve performance and production issues within data pipelines, leveraging findings to propose process improvements.
- Assess new data engineering tools and technologies, providing management with recommendations for enhancing data operations.
- Apply agile methodologies using tools like Azure DevOps and Git to streamline development processes.
- Design and develop responsive web applications using modern front-end frameworks (e.g., React, Angular, Vue.js) as part of full-stack development initiatives.
- Oversee the software development lifecycle, ensuring automated testing and quality assurance for data and analytics products.
- Create and integrate RESTful APIs with back-end services to enhance system interoperability and data accessibility.
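As a purely illustrative sketch of the API-integration and pipeline work described in these duties (the payload shape, field names, and schema below are hypothetical, not drawn from any OIG system):

```python
# Illustrative sketch only: normalize one page of records from a
# hypothetical REST API before loading them into a pipeline. The field
# names and payload shape are invented for this example.
import json
from datetime import datetime, timezone


def normalize(record: dict) -> dict:
    """Coerce a raw API record into a consistent target schema."""
    return {
        "id": int(record["id"]),
        "amount": float(record.get("amount", 0.0)),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }


def ingest(payload: str) -> list[dict]:
    """Parse one page of a REST response and normalize its rows.

    In a real pipeline the JSON would come from an authenticated HTTP
    call, and the rows would be written to cloud storage (e.g., a
    Databricks Delta table) rather than returned in memory.
    """
    raw = json.loads(payload)
    return [normalize(r) for r in raw["results"]]


# Sample payload standing in for a live API response.
sample = '{"results": [{"id": "1", "amount": "12.50"}, {"id": "2"}]}'
rows = ingest(sample)
print([r["id"] for r in rows])  # [1, 2]
```

A production version of this step would also add retry logic, schema validation, and monitoring hooks of the kind the duties above call for.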
- Develop automated solutions to improve operational efficiency and data governance.
- Provide strategic recommendations regarding data architecture and integration to meet evolving needs.
- Work closely with data owners to establish and enforce data quality and documentation standards.
- Collaborate with data analysts, data scientists, investigators, auditors, and researchers to address and fulfill the data requirements of the organization.

Requirements

Conditions of Employment

Qualifications

MINIMUM QUALIFICATIONS

You must meet ALL of the minimum qualifications listed below.

- Bachelor's degree from an accredited college or university
- Must have specialized experience building and maintaining data pipelines in cloud-based tools such as Azure Databricks, Azure Data Lake, or similar platforms
- Must have specialized experience with Python or SQL
- Must have at least 1 year of specialized experience with containerization and building CI/CD pipelines using tools such as Docker, Kubernetes, Azure DevOps, or Jenkins
- Must have at least 1 year of specialized experience integrating REST and SOAP APIs to create and access data or to trigger procedures or commands

Applicants must take a timed data engineer assessment test to demonstrate their knowledge of data engineering principles and methods. Please click on the assessment link below to take the data engineer assessment test.
https://app.alooba.com/take-assessment/16vipfk

DESIRABLE QUALIFICATIONS

- Knowledge of DevOps and Agile methodologies
- Advanced degree from an accredited college or university
- Professional certification(s) in data engineering, such as Azure Data Engineer Associate or an AWS/Google Cloud equivalent
- Knowledge of integrating business intelligence tools (e.g., Power BI) with cloud-hosted data repositories (e.g., Azure Data Lake)
- Knowledge of Microsoft Power Platform tools
- Proficiency in HTML, CSS, and JavaScript
- Understanding of database systems, including SQL and NoSQL
- Familiarity with deploying applications and understanding hosting environments

EVALUATION FACTORS

You must have the experience, knowledge, and skills listed in EACH of the evaluation factors. Failure to demonstrate that you meet all of the evaluation factor requirements listed below will result in a score of zero (0) and an ineligible status, and you will not be referred for further consideration. Include your major accomplishments relevant to the position requirements in your resume.

- Skilled in exploratory data analysis, data quality management, and onboarding and maintaining large datasets and data pipelines using cloud-based tools like Databricks and open-source programming languages. Demonstrated mastery of languages such as Python and SQL.
- Skilled in integrating REST and SOAP APIs to create and access data or to trigger procedures or commands.
- Skilled in containerization and building CI/CD pipelines using tools such as Docker, Kubernetes, Azure DevOps, and Jenkins.
- Strong ability to deliver presentations or briefings of technical information, both orally and in writing, to clearly communicate analytical products, arguments, conclusions, and recommendations to executives, peers, and other stakeholders.
- Ability to set goals and priorities and to complete high-quality work in a timely, efficient, and professional manner, applying project management practices such as agile and DevOps.
You will no longer be considered for this position if you receive a zero (0) rating on any evaluation factor. Failure to demonstrate that you meet all evaluation factor requirements will result in a score of zero (0); upon receipt of a zero score, you will be deemed "not minimally qualified" and will not be referred for further consideration.

Candidates will be evaluated on the skills they possess that are directly related to the duties of the position and on the experience, education, and training that indicate their ability to acquire the particular knowledge and skills needed to perform those duties. Only those candidates who meet all qualification and eligibility requirements and who submit the required information by 11:59 PM EST on 11/07/2024 will be considered.

Education

Education must be accredited by an institution recognized by the U.S. Department of Education. Applicants can verify accreditation here: www.ed.gov/admins/finaid/accred.

Special Instructions for Candidates with Foreign Education: Education completed outside the United States must be deemed equivalent to that gained in U.S. education programs. You must submit all necessary documents to a private U.S. credential evaluation service to interpret the equivalency of your education against courses given at U.S. accredited colleges and universities. For further information, visit: http://www2.ed.gov/about/offices/list/ous/international/usnei/us/edlite-visitus-forrecog.html.

Additional Information

Pay is only part of the compensation you will earn working for the USPS OIG. We offer a broad array of benefits programs:

- Health, Dental, Vision, Life, and Long-Term Care insurance with Flexible Spending options. For more information about these programs, visit www.opm.gov/insure.
- Retirement and Thrift Savings. For more information about these programs, see www.opm.gov/retire and http://www.tsp.gov/.
- Flexible Work Schedules.
USPS OIG offers a range of family-friendly flexibilities, including flexible work schedules, telework, and employee assistance programs.

Leave and Holidays. In addition to 11 paid holidays each year, you will earn 13 days of paid sick leave and 13 to 26 paid vacation days each year, depending on your years of service.

For further information, please refer to our website: https://www.uspsoig.gov/frequently-asked-questions

Fair Labor Standards Act (FLSA) Status: Exempt (nonexempt employees are entitled to overtime pay; exempt employees are not).

This agency provides reasonable accommodations to applicants with disabilities. If you require an accommodation for any part of the application and/or hiring process, please send an email to jobs@uspsoig.gov. The decision on granting an accommodation request will be made on a case-by-case basis.