Staff Infrastructure Engineer
Workato
About Workato
Workato is the only integration and automation platform that is as simple as it is powerful — and because it’s built to power the largest enterprises, it is quite powerful.
At the same time, it's a low-code/no-code platform, which empowers any user (dev/non-dev) to painlessly automate workflows across any apps and databases.
We're proud to be named a leader by both Forrester and Gartner and trusted by 7,000+ of the world's top brands, including Box, Grab, and Slack. But what is most exciting is that this is only the beginning.
Why join us?
Ultimately, Workato believes in fostering a flexible, trust-oriented culture that empowers everyone to take full ownership of their roles. We are driven by innovation and looking for team players who want to actively build our company.
But, we also believe in balancing productivity with self-care. That’s why we offer all of our employees a vibrant and dynamic work environment along with a multitude of benefits they can enjoy inside and outside of their work lives.
If this sounds right up your alley, please submit an application. We look forward to getting to know you!
Also, feel free to check out why:
Business Insider named us an “enterprise startup to bet your career on”
Forbes’ Cloud 100 recognized us as one of the top 100 private cloud companies in the world
Deloitte's Technology Fast 500 ranked us as the 17th fastest-growing tech company in the Bay Area, and 96th in North America
Quartz ranked us the #1 best company for remote workers
Responsibilities
As a Staff Infrastructure Engineer you will be responsible for deploying, scaling, and maintaining the services at the core of the Workato Data Platform, such as analytical data stores, real-time ingestion services, data lakes, and orchestration tools. You will work closely with Data Engineers and Developers as part of a small, flexible team, and you will have a direct impact on the modernization and maturation of the platform, including infrastructure architecture decisions.
The Workato Data Platform is built on industry-leading technologies such as the Snowflake data warehouse, ClickHouse, Airflow, Apache Kafka, AWS services (RDS, Lambda, Fargate, etc.), Prometheus, VictoriaMetrics, and the Workato SaaS solution itself for collecting data from third-party business support services. We are currently upgrading the platform to meet the requirements of a rapidly growing business, such as:
Increase confidence in analytical data.
Build the clearest possible view of the user journey.
Minimize the delay between when data is emitted and when it becomes available for analytics.
Support a fast-growing volume of data.
We plan to achieve this by adopting additional leading-edge technologies such as dbt, Trino, Kafka Streams, Kafka Connect, and the DataHub unified metadata management platform. You will work with technologies that are highly relevant in the industry and take on challenging tasks.
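For a concrete flavor of the orchestration side of this work, below is a minimal sketch of an Airflow DAG, assuming Airflow 2.x; the DAG name, task names, and task logic are hypothetical placeholders, not taken from Workato's platform.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**context):
    # Placeholder: pull a batch of events from an upstream source.
    print(f"extracting events for {context['ds']}")


def load_to_warehouse(**context):
    # Placeholder: load the extracted batch into the analytical store.
    print(f"loading events for {context['ds']}")


with DAG(
    dag_id="events_ingestion",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load  # run extract first, then load
```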
Requirements
Qualifications / Experience / Technical Skills
8+ years of verifiable work experience deploying and supporting data-intensive services.
Production experience building deployments of services commonly used as part of a data platform stack, such as Kafka, Debezium, Airflow, Trino/Presto, Spark, Flink, ClickHouse, S3, Firehose, Kinesis, Snowflake, BigQuery, and Redshift.
Experience monitoring, logging, and analyzing service health, and the ability to troubleshoot common bottlenecks of data-intensive applications (see the instrumentation sketch after this list).
Experience managing complex infrastructure (such as Kubernetes clusters, VPC networking, and security policies) using Infrastructure as Code tools (e.g. Terraform or CloudFormation).
Experience creating application deployments for Kubernetes-based services using tools such as Kustomize or Helm.
Experience with AWS cloud computing (EC2, RDS, EKS, EMR, Route 53, VPCs, subnets, route tables).
Basic knowledge of one or more high-level programming languages, such as Python, Go, or Java.
Experience with solution cost optimization and capacity planning.
Good understanding of Data Privacy and Security (GDPR, CCPA).
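To illustrate the service-health point above, here is a minimal sketch using the prometheus_client Python library; the worker, metric names, and simulated workload are hypothetical, shown only as an example of the kind of instrumentation involved.

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Hypothetical metrics for an ingestion worker.
EVENTS_PROCESSED = Counter("events_processed_total", "Total events processed")
PROCESSING_TIME = Histogram("event_processing_seconds", "Per-event processing latency")


def process_event():
    with PROCESSING_TIME.time():  # record the latency of each event
        time.sleep(random.uniform(0.01, 0.05))  # stand-in for real work
    EVENTS_PROCESSED.inc()


if __name__ == "__main__":
    start_http_server(8000)  # exposes /metrics for Prometheus to scrape
    while True:
        process_event()
```

With metrics like these exported, bottlenecks show up as rising latency histograms or stalled counters, which is the kind of signal this role would use when troubleshooting.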
Soft Skills / Personal Characteristics
Good communication and collaboration skills.
Exposure to, or interest in working with, data pipeline technologies.
Readiness to work remotely with teams distributed across the world and across time zones.
Spoken English sufficient to pass technical interviews and, later, to work with colleagues day to day.