Tookitaki is looking for a passionate candidate with experience and a proven track record of effectively managing clients for a highly technical B2B software product and/or solution.
Respond to support requests from customers about the Tookitaki Platform
Triage problems related to the Tookitaki Platform and the customer enterprise.
Work with Tookitaki subject matter experts, data scientists, and engineers to resolve customer issues.
Assist support management in developing support models that enable fanatical support.
Design support strategies and services to enable customers to gain optimal value from the platform.
Learn the Tookitaki Platform and keep up to date with its new products, features, and techniques.
Document customer problems and resolutions; monitor production job schedules where required and attend to failures.
Meet client SLAs, communicate via Freshdesk, and provide workaround solutions for reported defects.
Develop a career in solution architecture.
What you won’t be doing
You will not commit roadmap deliverables to clients
You will not commit to or accept change requests from clients
You will not pass customer communications to other teams
Understanding of shell scripting, Python, Scala, Linux, and Elasticsearch.
Understanding of big data and hands-on experience with Apache and Cloudera distributions.
Experience with MySQL setup and a solid understanding of SQL.
Hands-on experience deploying and configuring products in previous roles.
Working knowledge of monitoring and scheduling tools such as Datadog and Airflow.
Installation and configuration of Apache, Tomcat, and JBoss.
3-5 years of experience with the big data application stack, including HDFS, YARN, Spark, Hive, and HBase.
3+ years of experience with cloud technologies (AWS/GCP/Azure).
3+ years of experience in big data enterprise cluster installation and product installation.
2-3 years of ETL experience in a big data Hadoop environment (Hive/HBase knowledge).
3+ years of experience in shell scripting or Python.
4+ years of experience in Hadoop big data projects.
Experience developing, tuning, and debugging Python/shell scripts that load data into Hive, MariaDB, and MySQL.
Knowledge of Python and hands-on coding experience is a plus.
Understanding of and experience with PySpark.
Experience setting up enterprise security solutions, including active directories, firewalls, SSL certificates, Kerberos KDC servers, etc.
Experience working with automation tools such as Terraform, Jenkins, and CI/CD pipelines, including test reports and coverage.
Experience defining and automating the build, versioning, and release processes for complex enterprise products.
Compensation & Job perks
Opportunity to work for the fastest-growing AML (Anti-Money Laundering) compliance company, serving the fastest-growing and biggest economies.