
Data Engineer

IBM

Software Engineering, Data Science
Posted on Jan 29, 2024
Introduction
At IBM, work is more than a job – it’s a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you’ve never thought possible. Are you ready to lead in this new era of technology and solve some of the world’s most challenging problems? If so, let’s talk.

Your Role and Responsibilities

Octo, an IBM company, is an industry-leading, award-winning provider of technical solutions for the federal government. At Octo, we specialize in providing agile software engineering, user experience design, cloud services, and digital strategy services that address the government’s most pressing missions. Octo delivers intelligent solutions and rapid results, yielding lower costs and measurable outcomes.

Our team is what makes Octo great. At Octo, you’ll work beside some of the smartest and most accomplished staff you’ll find in your career. Octo offers fantastic benefits and an amazing workplace culture where you will feel valued while you perform mission-critical work for our government. Voted one of the region’s best places to work multiple times, Octo is an employer of choice!

You…

As a Data Engineer, you will be joining the team that is deploying and delivering a cloud-based, multi-domain Common Data Fabric (CDF), which provides data sharing services to the entire DoD Intelligence Community (IC). The CDF connects all IC data providers and consumers. It uses fully automated policy-based access controls to create a machine-to-machine data brokerage service, which is enabling the transition away from legacy point-to-point solutions across the IC enterprise.

Us…

We were founded as a fresh alternative in the Government Consulting Community and are dedicated to the belief that results are a product of analytical thinking and agile design principles, and that solutions are built in collaboration with, not for, our customers. This mantra drives us to succeed and act as true partners in advancing our clients’ missions.

Program Mission…

The CDF program is an evolution in the way DoD programs, services, and combat support agencies access data, providing data consumers (e.g., systems and app developers) with a “one-stop shop” for obtaining ISR data. The CDF significantly increases the DI2E’s ability to meet the ISR needs of joint and combined task force commanders by providing enterprise data at scale. The CDF serves as the scalable, modular, open architecture that enables interoperability for the collection, processing, exploitation, dissemination, and archiving of all forms and formats of intelligence data. Through the CDF, programs can easily share data and access new sources using their existing architecture. The CDF is a network- and end-user-agnostic capability that enables enterprise intelligence data sharing from sensor tasking to product dissemination.

Responsibilities…

You will be primarily responsible for defining and executing a process to document and prioritize data mapping and transformation requirements for the DoD/IC enterprise, and for developing detailed data mapping and transformation requirements packages that CDF Common Transformation Service (CTS) engineers will implement within the CTS system. In this role, you will:

  • Design, develop, and maintain high-performance Java-based applications.
  • Write well-designed, efficient, and testable code.
  • Collaborate with cross-functional teams to define, design, and add new features.
  • Debug and resolve application issues to ensure optimal performance.
  • Participate in code reviews and knowledge-sharing sessions.
  • Develop, optimize, and maintain data ingest flows using Apache NiFi, Java, and Python.
  • Develop within cloud platform components such as Apache Kafka, NiFi, and HDFS/HBase.
  • Communicate with data owners to set up CDF streaming and batching components (including configuration parameters) and verify they are working.
  • Document SOPs related to streaming configuration, batch configuration, or API management, depending on role requirements.
  • Document the details of each data ingest activity so they can be understood by the rest of the team.

Years of Experience: Mid-Level (5-8 years)

Education: Bachelor’s degree in systems engineering, computer engineering, or a related technical field (preferred)

Location: Chantilly, VA

Clearance: Active TS/SCI w/ ability to obtain CI Poly


Required Technical and Professional Expertise

  • A minimum of 5 years of experience with programming and software development, including analysis, design, development, implementation, testing, maintenance, quality assurance, troubleshooting, and/or upgrading of software systems.
  • Proven experience as a Java and Python developer.
  • DoD 8570 IAT Level II certification (e.g., Security+), or the ability to obtain the certification within 90 days.
  • Demonstrable Linux command line knowledge.
  • Working knowledge of web services environments, languages, and formats such as RESTful APIs, SOAP, HTML, JavaScript, XML, and JSON.
  • Understanding of foundational ETL concepts.
  • Experience implementing data integrations within the IC/DoD enterprise.
  • Experience with data flow, management, and storage solutions (e.g., Kafka, NiFi, and AWS S3).


Preferred Technical and Professional Expertise

  • Knowledge of software development best practices and design patterns.
  • Java (Core and Enterprise Edition) and Python.
  • Version control systems (Git).
  • Build and artifact management tools (Maven, Nexus).
  • Experience using, managing, and/or testing API Gateway tools and REST APIs.
  • 5+ years of experience in Java and Python development.
  • Advanced organizational skills with the ability to handle multiple assignments.
  • Strong written and oral communication skills.