What you will do
- You'll design and develop scalable, reliable, secure, and fault-tolerant data systems, and continuously improve them to deliver next-generation systems
- You’ll work 100% in the public cloud (AWS, GCP, Azure)
- You'll write readable, concise, reusable, and extensible code that avoids reinventing solutions to problems that have already been solved
- You'll migrate massive datasets for high-growth startups, from one cloud to another or from on-prem to the cloud
- You'll take part in discovery sessions with clients and build state-of-the-art data architectures
- You’ll research new technologies and tools that enable building the next generation systems
- You’ll use the knowledge you’ve learned from client projects to develop and improve our internal products.
What you will need
- At least 5 years of experience as a Data Engineer or Software Engineer
- Strong coding skills in one of the following: Python, Java or Scala
- Experience with Spark or Beam
- Experience with SQL databases as well as NoSQL data stores
- Experience in SQL, with an in-depth understanding of query optimization and a solid grasp of DDL
- Excellent understanding of data modeling for the target data warehouse
- Hands-on experience with at least one cloud platform and a thorough understanding of cloud concepts
- Experience with an orchestration tool such as Airflow, Kubeflow, Oozie, Luigi, Azkaban, or Step Functions
- Any big data or data engineering certification on any of the clouds
- A knack for understanding data and interpreting it for end business users
- Ability to drive initiatives and work independently
- Strong communication skills: able to make your point to clients concisely
- Experience building large-scale, high throughput, 24x7 data systems
- Exposure to machine learning algorithms with implementation in practice
- Experience with legacy big data systems (e.g., Hadoop)
About the Team
CloudCover's data team is united around two core principles: technical excellence and shipping code. Every team member, from juniors to managers, has very strong technical skills. We require everyone to earn public cloud certifications, and we reward them for it.
We offer a flexible work environment focused on results: what ultimately matters to us is the quality of the code we produce and its ability to get things done.
We’re also very committed to having each of us share technical ideas - as a new team member, you're coming to learn, but also to share your knowledge and passion for building highly scalable data systems.
We believe in giving back to the community. Public contributions are one way to do it: open-source software, helping on Stack Overflow, writing strong blog posts. We encourage this so strongly that it's one of the metrics we consider for advancement.
Our mission is to unlock the human potential for creative, strategic, and sustainable impact by eliminating cloud and IT operations that are repetitive and devoid of value. To accomplish this mission, we need to include humans from all walks of life. So, we actively seek to promote and celebrate diversity and inclusiveness in our embrace of human talent.
We welcome you as you are without regard to your gender, race, caste, color, age, sexual orientation, country of origin, religion, physical or neurological challenges, veteran status, or marital status.
Interested in this role?
CloudCover might be the right place for you. Click the apply button below and send in your resume.
Applying for Senior Data Engineer
We are always looking for talented folks that push us to be bolder, dream bigger, and perform better.