Job Experience: 5-8 years
Responsibilities for the job —
Technical (Primary) Skills:
- Excellent command of SQL and Azure Data Factory (ADF); this is a must.
- Strong experience building data pipelines in Azure Data Factory, including calling REST APIs from ADF.
- ETL and API-based (REST, SOAP, OAuth-secured, etc.) data integration into ADLS Gen2/Azure Synapse Analytics, and data migration.
- Experience in ingestion of batch and streaming data with complex transformations.
- Experience handling semi-structured data in various formats (Parquet, JSON) and manipulating complex data types.
- Azure Synapse/data warehouse experience: using Synapse or a data warehouse to present data securely and to build and manage data models.
- Azure Databricks experience, using Python to transform and manipulate data.
- Azure Data Lake/Delta Lake and Blob Storage.
- Knowledge of data modeling, design, manipulation, optimization, and best practices.
- Experience working with very large volumes of log data and building analytical insights based on user requirements.
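To illustrate the kind of semi-structured data manipulation this role calls for, here is a minimal Python sketch that flattens a nested JSON record into dotted column names, as one might do before landing data in ADLS Gen2 or a warehouse table. The sample record and field names are hypothetical, chosen only for illustration.

```python
import json

def flatten(record, parent_key="", sep="."):
    """Recursively flatten a nested JSON object into dotted column names."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested objects, carrying the prefix down.
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# Hypothetical semi-structured record, e.g. one line of a JSON log file.
raw = json.loads('{"id": 1, "user": {"name": "a", "geo": {"lat": 10.5}}}')
print(flatten(raw))
# {'id': 1, 'user.name': 'a', 'user.geo.lat': 10.5}
```

In a PySpark/Databricks pipeline the same idea would typically be expressed with struct columns and functions such as `explode`, but the flattening logic is the same.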
Technical (Secondary) Skills:
- Knowledge of DevOps tools and experience building CI/CD pipelines on Azure.
- Ability to write programs that pull data from external applications and services using REST APIs with different authentication methods.
- Knowledge of NoSQL database types, such as document and graph databases, is a plus.
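As a sketch of what "pulling data via REST APIs with different authentication methods" can look like, the standard-library snippet below builds an authenticated GET request supporting Bearer-token and Basic auth. The URL and credentials are placeholders, not a real service.

```python
import base64
import urllib.request

def build_request(url, auth_method="bearer", token=None, user=None, password=None):
    """Build an authenticated GET request; supports Bearer-token and Basic auth."""
    req = urllib.request.Request(url)
    if auth_method == "bearer":
        req.add_header("Authorization", f"Bearer {token}")
    elif auth_method == "basic":
        # Basic auth: base64-encode "user:password" per RFC 7617.
        creds = base64.b64encode(f"{user}:{password}".encode()).decode()
        req.add_header("Authorization", f"Basic {creds}")
    else:
        raise ValueError(f"Unsupported auth method: {auth_method}")
    return req

# Usage (no network call happens until urllib.request.urlopen(req)):
req = build_request("https://api.example.com/v1/items", token="abc123")
print(req.get_header("Authorization"))
# Bearer abc123
```

In production such calls would more likely run inside an ADF Web/Copy activity or a Databricks notebook, with secrets pulled from a key vault rather than hard-coded.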
Behavioral:
- Good communication skills.
- Ability to work in a cooperative team environment.
Eligibility Criteria for the Job
Education: –
B.E./B.Tech in any specialization, BCA, M.Tech in any specialization, or MCA.
Primary Skill: –
1. SQL
2. Databricks
3. Cloud Azure
4. Azure Data Factory, Azure Synapse, Azure Data Lake/Delta Lake, Blob Storage
5. Python, PySpark
Management Skills: –
1. Ability to handle multiple tasks and projects simultaneously in an organized and timely manner.
Soft Skills: –
1. Good communication skills, verbal and written.
2. Attention to detail.
3. Positive attitude and confidence.