Yesterday
Unspecified
$160,000
IT - Data Science
Boston, MA (On-Site/Office)
Responsibilities:
· Develop, optimize, and maintain data ingest flows using Apache Kafka, Apache NiFi, and MySQL/PostgreSQL
· Develop components on the AWS cloud platform using services such as Redshift, SageMaker, API Gateway, QuickSight, and Athena
· Communicate with data owners to establish and verify configuration parameters
· Document SOPs for streaming configuration, batch configuration, or API management, depending on role requirements
· Document details of each data ingest activity to ensure they can be understood by the rest of the team
· Develop and maintain best practices in data engineering and data analytics while following Agile DevSecOps methodology
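The ingest-flow responsibility above (Kafka in, PostgreSQL out) can be sketched as follows. This is a minimal illustration only: the topic name, connection settings, and record fields (`sensor_id`, `value`, `ts`) are hypothetical, since the listing does not describe an actual schema.

```python
import json
from datetime import datetime, timezone


def transform(raw: bytes) -> dict:
    """Validate and normalize one raw JSON event before loading.

    Field names are hypothetical -- the listing specifies no schema.
    """
    event = json.loads(raw)
    return {
        "sensor_id": str(event["sensor_id"]),
        "value": float(event["value"]),
        # Default a missing timestamp to ingest time (UTC).
        "ts": event.get("ts") or datetime.now(timezone.utc).isoformat(),
    }


def run_ingest():
    # Wiring sketch only: requires a reachable Kafka broker and a
    # PostgreSQL instance, plus the kafka-python and psycopg2 packages.
    from kafka import KafkaConsumer
    import psycopg2

    consumer = KafkaConsumer("events", bootstrap_servers="localhost:9092")
    conn = psycopg2.connect(dbname="ingest", user="etl")
    with conn, conn.cursor() as cur:
        for msg in consumer:
            row = transform(msg.value)
            cur.execute(
                "INSERT INTO readings (sensor_id, value, ts)"
                " VALUES (%s, %s, %s)",
                (row["sensor_id"], row["value"], row["ts"]),
            )
```

Keeping the per-record `transform` step pure (no broker or database access) makes it unit-testable independently of the streaming wiring.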
Desired Skills:
· Strong analytical skills, including statistical analysis, data visualization, and machine learning techniques
· Strong understanding of programming languages like Python, R, and Java
· Expertise in building modern data pipelines and ETL (extract, transform, load) processes using tools such as Apache Kafka and Apache NiFi
· Proficient in programming languages like Java, Scala, or Python
· Experience or expertise using, managing, and/or testing API Gateway tools and REST APIs
· Experience in traditional database and data warehouse products such as Oracle, MySQL, etc.
· Experience in modern data management technologies such as data lakes, data fabric, and data mesh
· Experience creating DevSecOps pipelines using CI/CD/CT tools and GitLab
· Excellent written and oral communication skills, including strong technical documentation skills
· Strong interpersonal skills and ability to work collaboratively in a dynamic team environment
· Proven track record in a demanding, customer-service-oriented environment
· Ability to communicate clearly with all levels within an organization
· Excellent analytical skills, organizational abilities, and problem-solving skills
· Experience in instituting data observability solutions using tools such as Grafana, Splunk, AWS CloudWatch, Kibana, etc.
· Experience in container technologies such as Docker, Kubernetes, and Amazon EKS
Qualifications:
· Ability to obtain an Active Secret clearance or higher
· Bachelor's degree in Computer Science, Engineering, or another technical discipline required, OR a minimum of 8 years of equivalent work experience
· 8+ years of IT data/system administration experience
· AWS Cloud certifications are a plus
group id: 10106647