About You:
· You’re a proponent of emergent architecture: move fast with what we know now, evolve as we go, and don’t build it before we need it.
· You can build scalable data ingestion, ETL/ELT, and data warehousing solutions.
· You contribute to the design and architecture of applications, services, and data.
Requirements:
· 5+ years of experience implementing large-scale data projects in a cloud environment
· 5+ years of experience with cloud databases such as Snowflake, Azure SQL DW, AWS Redshift, Google BigQuery, or similar
· 5+ years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational database, and data lake solutions
· Knowledge of continuous integration/continuous deployment (CI/CD) concepts and technologies (Git, Azure DevOps, GitLab, CircleCI, etc.)
· Experience with data pipeline and workflow management tools such as Apache Airflow
· Experience with the Linux operating system is an advantage
· Strong programming skills in Python, Scala, or a similar language
Bonus Points:
· Experience with BI tools such as Power BI, Tableau, or similar
· Familiarity with identity and access management (IAM) and cloud security
· Hands-on experience with technologies such as Hadoop, Spark, Hive, and Pig
· Experience with NoSQL or graph databases (Cosmos DB, MongoDB, Neo4j)
· Experience with streaming technologies such as Kafka
· Knowledge of Docker and container orchestration platforms (Kubernetes, OpenShift, AKS, GKE, etc.)
· Experience with ML platforms such as Azure ML Studio or BigQuery ML
Schedule:
· Ability to work in shifts aligned with the Eastern United States time zone (EST)
Education:
· A reputable college degree, preferably in Computer Science
Experience:
· A minimum of 3-5 years (Required)
Work Location:
· Fully Remote