Pentaho and Big Data Architect @ ValueLabs

Job Description

Pentaho and Big Data Architect

Expires on: 2021-10-04
Job Code: VL/SD/0907/D/278
Experience:
Location: Hyderabad


Responsibilities
• 5+ years of experience with database development on Oracle, MS SQL Server, or PostgreSQL
• 3+ years with one specific ETL tool, such as Pentaho, Talend, Informatica, or DataStage
• 3+ years of data modeling experience, both logical and physical
• Strong communication and documentation skills are required, as you will work directly with both technical and non-technical teams
• Experience working closely with teams outside of IT (e.g., Business Intelligence, Finance, Marketing, Sales)
• Experience setting up infrastructure and architectural requirements
• Requires minimal or no direct supervision
• Working knowledge of big data databases such as Vertica, Snowflake, or Redshift
• Experience with the Hadoop ecosystem: has programmed or worked with key data components such as Hive, Spark, and Sqoop, moving and processing terabyte-scale data
• Web analytics or Business Intelligence experience a plus
• Understanding of the ad stack and its data (ad servers, DSM, programmatic, DMP, etc.)
• Knowledge of scripting languages such as Perl or Python
• Hadoop administration experience a plus, but not required
• Exposure to or understanding of scheduling tools such as Airflow
• Experience in a Linux environment preferred but not mandatory
Requirements
• Pentaho/Informatica/Talend, with knowledge of Spark/PySpark
• Big Data
• Vertica / relational DB / NoSQL
Apply



