Today
Public Trust
Unspecified
Unspecified
IT - Software
Remote/Hybrid (Off-Site/Hybrid)
Zachary Piper Solutions is seeking a Senior Developer for a fully remote position.
We are seeking a problem solver who enjoys tackling technical tasks and problems with little oversight and connecting the dots to think at the broader, solution level. A strong desire and aptitude for learning is a must - our solution is rapidly evolving along with new requirements, technologies, and the threat landscape. Successful candidates will have demonstrated experience writing well-structured Python code using best practices to deliver enterprise-level applications. Candidates should have a solid understanding of core data pipeline processing capabilities, including different design patterns, stream vs. bulk processing, ETL functionality, a range of data stores, and APIs. The ideal candidate can develop and deliver reliable, performant, and scalable solutions on time.
Required Skills
• 10+ years of experience developing backend, enterprise-level Python applications.
• 10+ years of experience building highly performant, reliable, and scalable systems integrations, including ETL functionality, integrating functions between applications, and working with APIs.
• 5+ years of experience working with a variety of storage solutions, including relational databases, non-relational (NoSQL) databases, object-relational databases, distributed data stores, and caches or in-memory data stores. Specific technologies include Elasticsearch, Splunk, PostgreSQL, AWS S3, Redis, and Memcached.
• 5+ years of experience developing microservice-based applications using components such as containers, Docker, Kubernetes, AWS ECR/ECS/EKS/Fargate, and container registries.
• 5+ years of experience working with Git repositories (e.g., GitHub, GitLab, Bitbucket) and an advanced understanding of branching strategies, pull requests, commits, and other key concepts.
• Demonstrated experience writing unit tests to ensure reliable code delivery.
• Expertise in creating data pipelines for batch and streaming data, such as log data or tool/sensor data.
• Experience using message queuing systems as part of a data pipeline solution; RabbitMQ or Kafka preferred.
• Strong analytical and problem-solving skills, with attention to detail.
• Excellent written and oral communication skills. Must be comfortable presenting information both internally and to customers.
Compensation:
- $130,000 - $150,000
- Full benefits (Medical, Dental, Vision, and sick leave if required by law)
This job opens for applications on 3/28/2025. Applications for this job will be accepted for at least 30 days from the posting date.
#LI-NR1
group id: 10430981