We are a team that designs, develops, maintains, and improves software for various venture projects, i.e., projects that are adjacent to our core businesses and are bootstrapped quickly with a lean team. You will be actively involved in the design of the components behind scalable applications, from the frontend UI to the backend infrastructure.
Responsibilities:
- Manage the daily operations of Hadoop, Spark, Flink, ClickHouse, and other big data platforms to ensure system stability and data security;
- Operate and administer Linux systems using common commands, with scripting in Shell, Python, or similar languages;
- Maintain and operate big data components on cloud platforms (AWS/GCP/Azure), with hands-on experience in container technologies such as Kubernetes and Docker;
- Build and maintain monitoring systems to track platform performance, optimize configurations and resource allocation, and troubleshoot performance issues (tools such as Datadog, Prometheus, Grafana, Zabbix, etc.);
- Deploy, upgrade, and scale core big data components including Flink, Kafka, Spark, Impala, ClickHouse, and Kudu;
- Write and maintain technical documentation for operations, and support platform solution design and upgrades;
- Explore and adopt new technologies to optimize operation processes and provide technical guidance to help client teams improve their capabilities.
- Preferred: experience developing n8n workflows
Qualifications:
- Familiarity with CDH/CDP/HDP or similar Hadoop distributions, with proven experience in big data platform operations preferred;
- Proficiency in Linux administration and scripting (Shell, Python, or equivalent);
- Experience with major cloud platforms (AWS/GCP/Azure) and containerized environments (Kubernetes, Docker);
- Strong knowledge of big data components and performance tuning, able to resolve stability and scalability issues independently;
- Strong technical writing and communication skills to support documentation and solution delivery;
- Self-motivated with a passion for new technologies, able to apply innovative approaches to optimize operations and drive efficiency.
Apply here 👉 Senior DataOps Engineer - MainApp infrastructure
