Our Ideal Candidate

  • Demonstrated track record of effectively managing clients for a highly technical B2B software product or solution.
  • Experience in a fast-paced startup/scale-up environment.
  • Experience in working with remote deployment teams.

What you will be doing:

  • Respond to support requests from customers about the Tookitaki Platform
  • Triage problems related to the Tookitaki Platform and the customer enterprise
  • Work with Tookitaki subject matter experts, data scientists, and engineers to resolve customer issues
  • Assist support management to develop support models that enable fanatical support
  • Design support strategies and services to enable customers to gain optimal value from the platform
  • Learn the Tookitaki Platform and stay current as new products, features, and techniques are created
  • Document customer problems and resolutions
  • Monitor production job schedules where required and attend to failures
  • Meet client SLAs and communicate via Freshdesk
  • Provide workarounds for reported defects
  • Develop a career in solution architecture

What you won’t be doing:

  • You will not commit roadmap deliverables to clients
  • You will not commit to or accept change requests from clients
  • You will not simply hand off customer communications to other teams

Requirements:

  • Understanding of shell scripting, Python, Scala, Linux, and Elasticsearch
  • Understanding of big data and hands-on experience with Apache and Cloudera distributions
  • Experience setting up MySQL and a solid understanding of SQL
  • Hands-on experience deploying and configuring products at previous companies
  • Working knowledge of monitoring and scheduling tools such as Datadog and Airflow
  • Installation and configuration of Apache, Tomcat, JBoss
  • 3-5 years of experience with the big data application stack, including HDFS, YARN, Spark, Hive, and HBase
  • 3+ years of experience with cloud technologies (AWS/GCP/Azure)
  • 3+ years of experience in enterprise big data cluster installation and product installation
  • 2-3 years of ETL experience in a big data Hadoop environment (Hive/HBase knowledge)
  • 3+ years of experience in shell scripting or Python
  • 4+ years of experience in Hadoop big data projects, including developing, tuning, and debugging Python/shell scripts that load data into Hive, MariaDB, and MySQL
  • Knowledge of Python and hands-on coding experience is a plus.
  • Understanding of and experience with PySpark
  • Experience setting up enterprise security solutions, including Active Directory, firewalls, SSL certificates, Kerberos KDC servers, etc.
  • Experience with automation tools such as Terraform and CI/CD pipelines (e.g., Jenkins), including test reporting and coverage
  • Experience defining and automating build, versioning, and release processes for complex enterprise products

Compensation & Job Perks

  • Opportunity to work for the fastest-growing AML (Anti-Money Laundering) compliance company, serving some of the world's fastest-growing and largest economies
  • Flexible working hours and remote working