Role: PySpark Developer

Experience: 3-5 Years

Location: Hyderabad (Initially remote)

 

Technovert is not a typical IT services firm. We have two successful products to our credit, generating $2M+ in licensing/SaaS revenues, which is rare in the industry.

We are obsessed with technology and the infinite possibilities it creates for making this world a better place. Our clients find us at our best when we are challenged with their toughest problems, and we love chasing those problems. It thrills us and motivates us to deliver more. Our global delivery model has earned us the trust and reputation of a partner of choice.

We have a strong heritage built on great people who put customers first and deliver exceptional results with no surprises - every time. We partner with our clients to understand the interconnection of user experience, business goals, and information technology. It's the optimal fusing of these three drivers that delivers exceptional results.


 

Responsibilities:

  • Work closely with data engineers and data architects to develop end-to-end Databricks solutions.
  • Integrate Python/PySpark with Delta Lake and other databases and applications (a minimal sketch follows this list).
  • Analyze, modify, and standardize the existing Python/PySpark code base.
  • Understand stakeholder requirements: work closely with key business leaders to understand data science needs.
  • Work in an Azure environment with ADF, Databricks, Delta Lake, Azure DevOps, and SQL Server.
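
As a rough illustration of the Databricks work described above, here is a minimal sketch, assuming a Databricks environment and using hypothetical paths and table names, of reading raw data with PySpark, applying a light transformation, and writing it out as a Delta table:

```python
# Minimal sketch (hypothetical paths/table names) of typical PySpark + Delta Lake work:
# read raw data, apply a light transformation, and write a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("delta-lake-sketch").getOrCreate()

# Read raw CSV files from a (hypothetical) landing zone.
raw_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/landing/orders/")
)

# Standardize column names and add a load timestamp.
clean_df = (
    raw_df
    .toDF(*[c.strip().lower().replace(" ", "_") for c in raw_df.columns])
    .withColumn("load_ts", F.current_timestamp())
)

# Write the result as a Delta table (overwrite kept simple for this sketch).
clean_df.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")
```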

 

Must have:

  • Hands-on experience with Azure, including Databricks, PySpark, and ADF.
  • Excellent analytical thinking for implementing PySpark pipelines orchestrated through Azure Data Factory.
  • Strong knowledge of data warehousing and hands-on SQL experience.
  • Strong SQL knowledge and an understanding of database structures.
  • Excellent communication skills and experience dealing directly with customers.
  • Experience connecting to multiple source systems (see the sketch after this list).
  • Experience working with a variety of Python/PySpark libraries.
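
For connecting to source systems, a hedged sketch of reading from one such system (SQL Server over JDBC) into a PySpark DataFrame; the server, database, table, and credential names below are placeholders, not part of any specific project:

```python
# Hypothetical sketch of connecting to one of several source systems (SQL Server over JDBC).
# Server, database, table, and credential values are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqlserver-source-sketch").getOrCreate()

jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=sales"

customers_df = (
    spark.read
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.customers")
    .option("user", "etl_user")                 # placeholder; in practice, read credentials
    .option("password", "<from-secret-scope>")  # from a secret scope rather than hard-coding
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

customers_df.show(5)
```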


Nice to Have:

  • Hands-on working experience with Power BI.
  • Any knowledge of or experience with Snowflake.


Qualification: 

  • 3 to 5 years of experience in data processing.
  • Proven experience processing data with Python/PySpark.
  • Data enthusiast.