Top Secret/SCI
Unspecified
CI Polygraph
IT - Database
Columbia, MD (On-Site/Office)
Overview
BigBear.ai is seeking a Data Engineer to provide data engineering and integration support.
This position will be based out of Annapolis Junction, MD and will report on-site daily (no remote flexibility).
As a Data Engineer/Integrator on the program, you will collaborate closely with a team of developers to fulfill data integration requirements. This involves writing and maintaining code on an Extract, Transform, Load (ETL) platform to transform data into the formats defined by the IC ITE initiatives. You will interface with external teams and systems, using protocols such as HTTP and SFTP to collect data efficiently.
Additionally, you will enhance the ETL platform with features that shorten timelines for future data integration efforts. Beyond coding tasks, you'll develop and maintain software to ensure seamless integration into a fully functional system, and you'll collaborate with external teams to validate data ingest processes. Finally, you'll provide comprehensive documentation covering system architecture, development, and any enhancements made along the way.
What you will do
- Collaborate with a team of developers to meet data integration requirements by writing and maintaining ETL code.
- Transform data into suitable formats as defined by IC ITE initiatives, ensuring alignment with project goals.
- Interface with external teams and systems using protocols such as HTTP and SFTP to efficiently collect and process data.
- Enhance the ETL platform by developing features to streamline and accelerate future data integration efforts.
- Develop and maintain software, validate data ingest processes, and create comprehensive documentation of system architecture, development, and enhancements.
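The extract-transform-load workflow described above can be sketched in a few lines of Python. Everything here is illustrative: the inline CSV stands in for a feed that would really arrive over HTTP or SFTP, and the field names, target schema, and list-based "sink" are hypothetical.

```python
import csv
import io
from datetime import datetime

# Simulated extract source: in practice, records would be collected over
# HTTP or SFTP from an external system. Field names are hypothetical.
RAW_FEED = """id,event_date,payload
1,03/15/2024,alpha
2,04/01/2024,beta
"""

def extract(feed: str) -> list:
    """Parse the raw feed into a list of dictionaries."""
    return list(csv.DictReader(io.StringIO(feed)))

def transform(record: dict) -> dict:
    """Normalize a record to a target schema: integer IDs, ISO-8601 dates."""
    return {
        "id": int(record["id"]),
        "event_date": datetime.strptime(record["event_date"], "%m/%d/%Y")
                              .date().isoformat(),
        "payload": record["payload"],
    }

def load(records, sink: list) -> list:
    """Append transformed records to a destination (a list stands in for a database)."""
    sink.extend(records)
    return sink

sink: list = []
load((transform(r) for r in extract(RAW_FEED)), sink)
print(sink[0]["event_date"])  # → 2024-03-15
```

The same extract/transform/load separation scales to real feeds: only `extract` changes when the collection protocol does, which is what keeps future integrations fast.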
What you need to have
- Clearance: Must possess and maintain an active TS/SCI with a CI Poly clearance
- 5+ years of experience and a Bachelor's degree, or 3+ years of experience and a Master's degree, in Computer Science, Data Science, or a related field
- Linux/Unix experience
- Object-oriented programming language experience
- Possess strong verbal and written communication skills
- Possess strong analytical and problem-solving abilities in the face of ambiguity
- Self-starter with a willingness to learn new skills
What we'd like you to have
- Expertise in data ingestion, data transformation, and data modeling
- Experience with Ruby, Java, or Python
- 1+ year of experience working with RESTful web services
- Experience with code development, deployment, and versioning
- Experience working in cloud architectures with AWS EC2, RDS, S3, VPC, and Elasticsearch
- Experience with XML data formats, including processing and translation with XSLT and XPath
- Experience leading a team
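The XML/XPath skill listed above boils down to querying structured documents. A minimal sketch, assuming a hypothetical sample document: Python's standard-library `xml.etree.ElementTree` supports a limited XPath subset, which stands in here for the fuller XPath/XSLT tooling (such as lxml) a production pipeline might use.

```python
import xml.etree.ElementTree as ET

# Hypothetical catalog document, for illustration only.
DOC = """<catalog>
  <dataset status="active"><name>flights</name></dataset>
  <dataset status="retired"><name>legacy</name></dataset>
</catalog>"""

root = ET.fromstring(DOC)

# XPath-style query: names of all datasets whose status attribute is "active".
active = [el.findtext("name") for el in root.findall("./dataset[@status='active']")]
print(active)  # → ['flights']
```

For full XPath 1.0 and XSLT transforms, lxml exposes the same `findall` interface plus `xpath()` and `XSLT` classes, so code written against this subset ports over cleanly.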
About BigBear.ai
BigBear.ai is a leading provider of AI-powered decision intelligence solutions for national security, supply chain management, and digital identity. Customers and partners rely on BigBear.ai's predictive analytics capabilities in highly complex, distributed, mission-based operating environments. Headquartered in Columbia, Maryland, BigBear.ai is a public company traded on the NYSE under the symbol BBAI. For more information, visit https://bigbear.ai/ and follow BigBear.ai on LinkedIn: @BigBear.ai and X: @BigBearai.
group id: 10424449