The Digital Path Forward

Big Data with Kafka Analyst

Roles and Responsibilities

  • Develop and deploy big data pipelines in a cloud environment using Snowflake cloud DW; handle ETL design, development, and migration of existing on-prem ETL routines to cloud services
  • Interact with senior leaders, understand their business goals, and contribute to the delivery of workstreams
  • Design and optimize model code for faster execution
  • Be willing to travel for consulting engagements
  • Very good communication skills; able to interact with customers directly
  • Good troubleshooting and analytical skills
  • Conceptualize a new product, feature, or component, and test new technologies or features through PoCs (proofs of concept)
  • Contribute to practice growth initiatives: interviews, training, mentoring, operational activities, etc.
  • Create documentation for operational tasks, procedures, and automated processes
  • Fix faults and restore operability, or carry out suitable measures (e.g., creating a workaround)

Required Technical and Professional Expertise

  • 5+ years of industry experience in the big data / Hadoop field
  • Proven track record in big data architecture, solutioning, and consulting
  • Technology stack: Spark, Kafka, Cassandra, HBase, Hadoop, HDFS, MapReduce
  • Programming skills: experience with Python, Scala, and Java
  • Experience with CI/CD, JIRA, and any automation testing is preferable
  • Strong thought leadership capabilities, e.g., blogs, open-source contributions, whitepapers, research paper publications, forum participation
  • Proven track record in competency development, innovation, and value addition
  • Prior experience working within a CoE (Center of Excellence) is preferable
