Data Architect

  • Location:

    Luton, Bedfordshire

  • Sector:

    BI, Big Data & Analytics

  • Job type:

    Contract

  • Salary:

    £650 - £700 per day

  • Contact:

    Amy Harris

  • Contact email:

    amyh@montash.com

  • Salary high:

    700

  • Salary low:

    650

  • Job ref:

    data16119_1555401506

  • Published:

    3 months ago

  • Duration:

    3 months +

  • Expiry date:

    2019-04-23

  • Startdate:

    ASAP

Montash have been engaged by a leading retailer to source a Big Data Architect for an initial 3-month contract, with the possibility of extension to 6 or 12 months.

The role involves working on a yield platform with links to multiple source markets. The client are moving from a low-level PySpark streaming solution to a Snowflake solution.

As a Modern Data Architect, you will be responsible for designing and delivering innovative solutions using modern data technology platforms such as Snowflake, Redshift, Kafka, Hadoop, Spark, NoSQL and other Big Data-related technologies on AWS.

Responsibilities

  • Analyse complex problems; architect, design and develop unique and innovative solutions using modern data technology platforms
  • Develop the solution to provide Commercial Analysts and Data Scientists with a modern yield engine to inform pricing.
  • Be part of a growing and dynamic team; contribute to the culture and growth of modern data architecture across the client.

What the client are looking for

  • Display passion for delivering high-quality products that meet customers' needs
  • Show a passion for individual improvement and learning new technologies
  • Solving data-oriented problems in an analytical and iterative fashion
  • Big data architectures and patterns in the cloud
  • Design, development & management in Snowflake, Redshift and NoSQL solutions
  • Design and development of EMR, Qubole and Data Lakes
  • ETL/ELT design development with tools such as Talend, Informatica, Spark or Airflow
  • Data ingestion and management concepts of cataloguing, lifecycle and lineage
  • Working with various kinds of data (streaming, structured, unstructured, metrics, logs, json, xml, parquet, etc.)
  • Working in various Agile methodologies (Scrum, Kanban)
Candidates should also be able to demonstrate some (but not necessarily all) of the following:

  • Experience with approach, platforms and best practice for reporting and visualisation tools
  • Proficiency in languages such as Python, R, Java, Scala and/or Go as well as source code management and testing
  • Proficiency with data solutions on any public cloud platforms (AWS, Azure, GCP)