Kajal | Nov 16, 2022 |
Citi Hiring B.Tech/BE/MCA Graduates
Citi is hiring an experienced Data Analytics Analyst at their Pune location.
The complete details of this job are as follows:
The Ideal Candidate should be able to:
Transform complex analytical models into scalable, production-ready solutions
Provide support and enhancements for an advanced anomaly detection machine learning platform
Continuously integrate and ship code into our cloud production environments
Develop cloud based applications from the ground up using a modern technology stack
Work directly with Product Owners and customers to deliver data products in a collaborative and agile environment
Develop sustainable, data-driven solutions with current and next-generation data technologies to drive our business and technology strategies
Build data APIs and data delivery services to support critical operational and analytical applications
Contribute to the design of robust systems with an eye on the long-term maintenance and support of the application
Leverage reusable code modules to solve problems across the team and organization
Handle multiple functions and roles for projects and Agile teams
Define, execute, and continuously improve our internal software architecture processes
Be a technology thought leader and strategist.
The Ideal Candidate should also have:
Hands-on experience with Amazon Web Services (AWS), Google Compute Engine, or another public cloud service
Experience working with streaming using Spark, Flink, Kafka, or NoSQL
Experience working with dimensional data models and the pipelines that feed them
Intermediate-level experience/knowledge in at least one scripting language (Python, Perl, JavaScript)
Hands-on design experience with data pipelines, joining structured and unstructured data
Familiarity with SAS programming is a plus
BE/B.Tech/MCA degree or equivalent degree.
At least 2 years of experience designing and developing data pipelines for data ingestion or transformation using Java, Scala, or Python.
At least 2 years of experience with the following Big Data areas: file formats (Parquet, Avro, ORC), resource management, distributed processing, and RDBMS.
At least 2 years of experience developing applications with monitoring, build tools, version control, unit testing, TDD, and change management to support DevOps.
At least 2 years of experience with SQL and shell scripting.
Experience designing, building, and deploying production-level data pipelines using tools from the Hadoop stack (HDFS, Hive, Spark, HBase, Kafka, NiFi, Oozie, Apache Beam, Apache Airflow, etc.).
Experience with Spark programming (PySpark, Scala, or Java).
Experience troubleshooting JVM-related issues.
Experience and strategies to deal with mutable data in Hadoop.
Experience with StreamSets.
Familiarity with machine learning implementation using PySpark.
Experience in data visualization tools like Cognos, Arcadia, Tableau.
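To give a flavor of the pipeline skills the listing asks for, here is a minimal, hypothetical sketch of an ingest-and-transform batch step in plain Python (standard library only; the sample data, field names, and function names are illustrative assumptions, not part of the job description — in production this logic would typically run on Spark against HDFS or Kafka sources):

```python
import csv
import io

# Hypothetical raw feed; a real pipeline would read this from HDFS, Kafka, etc.
RAW_CSV = """account_id,amount,currency
A001,120.50,USD
A002,-15.00,USD
A001,300.00,USD
"""

def ingest(text):
    """Ingestion step: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(records):
    """Transformation step: cast amounts to float and total them per account."""
    totals = {}
    for row in records:
        totals[row["account_id"]] = totals.get(row["account_id"], 0.0) + float(row["amount"])
    return totals

if __name__ == "__main__":
    print(transform(ingest(RAW_CSV)))  # {'A001': 420.5, 'A002': -15.0}
```

The same ingest/transform split maps directly onto the Spark DataFrame reads and aggregations mentioned in the requirements above.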
To Apply for this Job, Visit Official Website
Disclaimer: The recruitment information provided above is for informational purposes only and has been taken from the official site of the organisation. We do not provide any recruitment guarantee; recruitment is carried out per the official process of the company or organization that posted the vacancy. We do not charge any fee for providing this job information. Neither the author nor Studycafe and its affiliates accept any liability for any loss or damage of any kind arising out of any information in this article, nor for any actions taken in reliance thereon.