We are looking for a GCP (Google Cloud Platform) Data Engineer for our new project.
Requirements:
- Extensive experience with GCP technologies across various components such as BigQuery, Cloud SQL, Pub/Sub, Cloud Storage, Cloud Spanner, and Cloud Firestore.
- Proven track record of building and maintaining data pipelines on GCP using App Engine, Cloud Functions, Cloud Run, Dataflow, etc.
- Understanding of DevOps practices, including CI/CD tools on GCP such as Cloud Build and IaC tools such as Terraform.
- Thorough understanding of Google Cloud IAM and of authentication and authorization best practices for appropriate access management.
- Deep understanding of the cost characteristics of different GCP services and how to use them effectively and economically.
- Strong experience with data curation and dimensional modelling for data consumers.
- Knowledge of ETL and ELT techniques and best practices.
- Ability to design, develop, and deploy highly scalable and reliable data pipelines.
- Familiarity with programming languages such as Python, Java, and SQL.
- Excellent communication and collaboration skills for highly effective interaction with cross-functional teams.
- Experience with data modelling and schema design.
- Excellent problem-solving and analytical skills.
Benefits:
- English courses
- Health insurance
- Sport compensation
- Psychologist compensation
- Discount program
- Corporate events and team building
- Corporate library
- Remote work options
- Flexible working hours
- Activities in the office: foosball, table tennis, PlayStation, music room
- LinkedIn Learning online courses