What You'll Be Doing
- Deploy, automate, and maintain AWS/GCP data architectures that solve clients' business problems and meet their technical requirements
- Collaborate and contribute to technical and strategic discussions and decision-making on architecture, infrastructure, tools, and technology
- Help ensure customer success by addressing data-related technical issues and supporting customers' data infrastructure needs
- Work with management and sales teams to develop and support strategies that build long-term, trusted client relationships
Skills You'll Need
- Experience building and optimizing data pipelines, architectures, and data sets
- Experience with data pipeline and workflow management tools such as Fivetran, Kafka, and Airflow
- Experience with relational (SQL) and NoSQL databases, such as Postgres and MongoDB
- Experience with managed cloud data services such as BigQuery, Dataflow, Glue, EMR, RDS, and Redshift
- Knowledge of programming languages and open-source solutions
- 2+ years of experience as a DevOps/Data Platform Engineer
- Cloud certification for AWS or GCP (or willingness to earn one within 3 months of joining us)
- Fluent in English
- Proactive, self-driven, analytical and well-organized with excellent time management skills
- Bachelor’s degree in a technical field or equivalent practical experience
- Knowledge of configuration management tools such as Ansible, Chef, and Puppet
- Knowledge of CI/CD systems such as Jenkins, Spinnaker, GitLab, Bitbucket, and Argo CD
- Knowledge of container technologies such as Docker, Kubernetes, and Rancher
- Knowledge of Infrastructure-as-Code tools such as Terraform and CloudFormation
- Industry experience, or experience working in a specialized area of technology