Loading data to GCS using Kafka

Closed Posted 5 years ago Paid on delivery

I want a PoC built for the requirement below.

Source data files (in CSV format and SQL extract format) and schema files (in JSON format) will be placed in GCS. Based on the schema definition rules given in the JSON file, the data in the CSV file should be loaded to Kafka; using Kafka streaming, the data must then be transformed to 3NF form and written back to GCS. The main objective: when the schema changes, the Kafka pipeline should absorb the changes dynamically, without any modification to the code.
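The "schema changes without code changes" goal above boils down to driving the parsing and 3NF splitting entirely from the JSON schema file. A minimal sketch of that idea, in plain Python rather than Spark/Kafka, with an invented schema layout (the field names, types, and `table` routing key are illustrative assumptions, not the client's actual schema format):

```python
import csv
import io
import json

# Hypothetical schema file: each field declares its name, type, and the
# 3NF target table it belongs to. This layout is an assumption for the
# sketch; the real rules would come from the JSON file placed in GCS.
SCHEMA_JSON = """
{
  "fields": [
    {"name": "order_id", "type": "int",    "table": "orders"},
    {"name": "customer", "type": "string", "table": "customers"},
    {"name": "amount",   "type": "float",  "table": "orders"}
  ]
}
"""

# Sample CSV payload standing in for the source file in GCS.
CSV_DATA = "order_id,customer,amount\n1,Alice,9.99\n2,Bob,12.50\n"

CASTS = {"int": int, "float": float, "string": str}

def parse_rows(schema, csv_text):
    """Cast each CSV row according to the schema, so swapping in a new
    schema file changes the parsing behaviour without a code change."""
    fields = schema["fields"]
    reader = csv.DictReader(io.StringIO(csv_text))
    for row in reader:
        yield {f["name"]: CASTS[f["type"]](row[f["name"]]) for f in fields}

def split_3nf(schema, rows):
    """Route each typed record's columns to their schema-declared target
    tables, producing one record list per normalized table."""
    tables = {}
    for row in rows:
        per_table = {}
        for f in schema["fields"]:
            per_table.setdefault(f["table"], {})[f["name"]] = row[f["name"]]
        for name, record in per_table.items():
            tables.setdefault(name, []).append(record)
    return tables

schema = json.loads(SCHEMA_JSON)
typed = list(parse_rows(schema, CSV_DATA))
tables = split_3nf(schema, typed)
```

In the actual PoC the same schema-driven dispatch would sit inside the Kafka streaming job (e.g. building a Spark `StructType` from the JSON at startup), so a new schema file re-shapes the pipeline on the next run with no redeploy of the code itself.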

Skillsets: Big Data (Kafka), Spark/Scala, GCS

Data Analytics Google Cloud Storage Scala Spark

Project ID: #17599939

About the project

5 proposals Remote project Active 5 years ago

5 freelancers are bidding an average of £198 for this job

deytps86

Hello, I have been working in Big Data/Hadoop technologies for years and have experience with the latest Spark, Kafka, Cassandra, Hive/HBase and ELK stacks, using Java, Scala and Python. Can we talk? Thank you!

£220 GBP in 3 days
(5 reviews)
3.1
amitkumar0327

Hi, I am Amit. I have experience in Spark, Scala, Kafka and GCP. I will be able to do the PoC based on the requirement. I would be interested to know more about the schema files that will be placed in GCS. The schema fi More

£150 GBP in 3 days
(2 reviews)
2.7
gunjanchoudhary9

Key skill set and expertise from several Big Data projects: Spark, Kafka, Spark SQL, GCP, RDBMS, Scala. Proposed solution: use Spark and Scala to import the schema JSON file and create a dynamic schema. Check th More

£177 GBP in 5 days
(0 reviews)
0.0