Big Data Architecture Developer
REQUIREMENTS

  • 6-8 years of hands-on development experience in Big Data (DWH, MPP databases, data lake creation, data replication, and archiving)
  • Experience with technologies including ETL, ELT, data replication, and change data capture (CDC), primarily for data migration and data warehousing
  • Hands-on experience with SQL and PL/SQL on Oracle, MySQL, or SQL Server
  • Hands-on experience with at least one data warehouse or big data store, such as Snowflake, HBase, Impala, Amazon DynamoDB, or Azure SQL Data Warehouse
  • Hands-on experience with at least one RDBMS, such as Oracle, MySQL, PostgreSQL, SQL Server, or a cloud RDBMS
  • Nice to have: hands-on experience with Cloudera or Hortonworks Hadoop distributions
  • Experience with data replication tools such as Oracle GoldenGate, Sqoop, or AWS DMS

ROLES & RESPONSIBILITIES

  • Migrate data to cloud databases, for example MySQL to Microsoft Azure SQL Database, Oracle to Amazon RDS, or SQL Server to Amazon Redshift
  • Move on-premises data to the cloud to generate insights and benefit from on-demand elastic services
  • Use data movement solutions to shift data from transactional databases to big data or data lake environments, such as Hadoop, Snowflake, AWS, or Azure
  • Schedule archives of core data to proactively manage database growth and keep systems running at peak performance
  • Capture, transform, and replicate data from a range of sources to multiple target data systems (databases, data warehouses, big data, cloud, and digital platforms)
  • Replicate databases for disaster recovery, faster analytics across multiple locations, and more efficient use of distributed resources
Job Category: AWS Azure Hadoop MySQL Redshift Snowflake
Job Type: Full Time
Job Location: Remote

Apply for this position
