Top Secret
IT - Database
Washington, DC (On-Site/Office)
Overview
BigBear.ai is seeking to hire a Data Engineer (Cloud) to work with one of our clients in Washington, DC. As a Data Engineer, you will interpret business needs, select appropriate technologies, and implement data governance for shared and/or master sets of data. You will work with key business stakeholders, IT experts, and subject-matter experts to plan and deliver optimal data solutions. You will create, maintain, and optimize data pipelines as workloads move from development to production, ensuring seamless data flow for each use case. You will perform technical and non-technical analyses of project issues and help ensure technical implementations meet quality assurance metrics. You will analyze data and systems architecture, create designs, and implement information systems solutions.
What you will do
- Define and communicate a clear product vision for our client's software products, aligning with user needs and business objectives
- Create and manage product roadmaps that reflect both innovation and growth strategies
- Partner with a government product owner and a product team of 7-8 FTEs
- Develop and design data pipelines to support an end-to-end solution
- Develop and maintain artifacts (e.g., schemas, data dictionaries, and transforms related to ETL processes)
- Integrate data pipelines with AWS cloud services to extract meaningful insights
- Manage production data within multiple datasets, ensuring fault tolerance and redundancy
- Design and develop robust and functional dataflows to support raw data and expected data
- Provide Tier 3 technical support for deployed applications and dataflows
- Collaborate with the rest of the data engineering team to design and launch new features
- Coordinate and document dataflows, capabilities, etc.
- Occasionally support off-hours deployments (e.g., evenings or weekends) as needed
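The pipeline work described above can be sketched in miniature. The following is an illustrative example only, assuming a hypothetical JSON feed and target table (none of these names come from an actual client system); it shows the extract-transform-load shape of moving raw data into a relational store with fault tolerance in mind (idempotent upserts).

```python
import json
import sqlite3

# Hypothetical raw feed; field names are invented for illustration.
RAW = '[{"id": 1, "name": " Alice ", "ts": "2024-01-02"}, {"id": 2, "name": "Bob", "ts": "2024-01-03"}]'

def extract(payload: str) -> list:
    """Parse raw JSON records from an upstream source."""
    return json.loads(payload)

def transform(records: list) -> list:
    """Trim whitespace and project the columns the target schema expects."""
    return [(r["id"], r["name"].strip(), r["ts"]) for r in records]

def load(rows: list, conn: sqlite3.Connection) -> int:
    """Idempotently upsert rows, so a re-run after a failure is safe."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS person (id INTEGER PRIMARY KEY, name TEXT, ts TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO person VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM person").fetchone()[0]

conn = sqlite3.connect(":memory:")
count = load(transform(extract(RAW)), conn)
```

In production the same three stages would typically be orchestrated as tasks in a tool such as Airflow or NiFi rather than called inline.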
What you need to have
- Clearance: Must possess and maintain a TS/SCI clearance
- Bachelor's degree or equivalent practical experience
- Understanding of cloud architectures and enabling tools and technologies, such as AWS Cloud (GovCloud/C2S)
- Familiarity with Amazon Web Services (AWS) managed services
- Working knowledge of software platforms and services such as Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar
- Proficiency with JavaScript, Elasticsearch, JSON, SQL, and XML
- Working knowledge of datastores such as MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, and Redis
- Familiarity with Linux/Unix server environments
- Experience with Agile development methodology
- Publishing and/or presenting design reports
- Coordinating with other team members to reach project milestones and deadlines
- Working knowledge of collaboration tools such as Jira and Confluence
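The JSON, SQL, and XML skills listed above come together in format-bridging tasks. A minimal sketch, with an invented XML feed layout: parse XML records, emit newline-delimited JSON documents (the shape an Elasticsearch bulk index expects), and persist the same records to a SQL table.

```python
import json
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML feed; element names are invented for illustration.
FEED = """<events>
  <event id="e1"><type>login</type><user>alice</user></event>
  <event id="e2"><type>logout</type><user>bob</user></event>
</events>"""

def xml_to_docs(xml_text: str) -> list:
    """Flatten XML <event> elements into plain dict documents."""
    root = ET.fromstring(xml_text)
    return [
        {"id": e.get("id"), "type": e.findtext("type"), "user": e.findtext("user")}
        for e in root.findall("event")
    ]

docs = xml_to_docs(FEED)
# Newline-delimited JSON, one document per line.
ndjson = "\n".join(json.dumps(d) for d in docs)

# The same documents loaded relationally via named SQL placeholders.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE event (id TEXT PRIMARY KEY, type TEXT, user TEXT)")
conn.executemany("INSERT INTO event VALUES (:id, :type, :user)", docs)
conn.commit()
```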
What we'd like you to have
- Master's degree or equivalent experience in a related field
- Familiarity and experience with the Intelligence Community (IC) and the intel cycle
- Familiarity and experience with the Department of Homeland Security (DHS)
- Direct experience with DHS and Intelligence Community (IC) components' data architectures and environments (IC-GovCloud experience preferred)
- Experience with cloud message APIs and usage of push notifications
- Keen interest in learning and using the latest software tools, methods, and technologies to solve real world problem sets vital to national security
- Working knowledge of public keys and digital certificates
- Experience with DevOps environments
- Expertise in various COTS, GOTS, and open source tools which support development of data integration and visualization applications
- Specialization in object-oriented programming languages, scripting, and databases
About BigBear.ai
BigBear.ai is a leading provider of AI-powered decision intelligence solutions for national security, supply chain management, and digital identity. Customers and partners rely on BigBear.ai's predictive analytics capabilities in highly complex, distributed, mission-based operating environments. Headquartered in Columbia, Maryland, BigBear.ai is a public company traded on the NYSE under the symbol BBAI. For more information, visit https://bigbear.ai/ and follow BigBear.ai on LinkedIn: @BigBear.ai and X: @BigBearai.