design complex procedures and process massive amounts of data to deliver business insights for the company, using advanced technologies and visualization platforms. You will use cutting-edge Big Data technologies such as Hadoop, Pig, Hive, Spark, Java, and MapReduce.
Responsibilities:
• Implement BI products in a Big Data environment end to end: gathering requirements, specifying data schemas, and developing ETL processes and reports
• Design and apply procedures to process raw data using Big Data technologies
• Implement modular, reusable, advanced reports and dashboards for business decision making and rapid visualization of key trends and metrics
• Research and build POCs for new Big Data tools and technologies
Requirements:
• 3+ years of ETL development experience with a proven record of high-quality deliverables
• 2+ years of experience with at least one of the following: Java, C#, Python, Scala, Bash
• Strong SQL skills – a must
• Excellent communication skills, both verbal and written
• A proactive, problem-solving mentality that thrives in an agile work environment
Advantages:
• Experience with large data sets and distributed computing (Pig/Hive/Spark/Storm)
• Experience programming data-processing models in Java
• Prior development experience with commercial or open-source reporting platforms (Tableau, QlikView, or others)