Loading data to GCS using Kafka
£20-250 GBP
Paid on delivery
I want a PoC to be done for the below requirement.
Source data files (in CSV format and SQL extract format) and schema files (in JSON format) will be placed in GCS. Based on the schema definition rules given in the JSON file, the data in the CSV file should be loaded into Kafka; then, using Kafka streaming, the data should be transformed to third normal form (3NF) and written back to GCS. The main objective: when the schema changes, the code should absorb the changes in Kafka dynamically, without any code modification.
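To make the objective concrete, here is a minimal, language-agnostic sketch of the schema-driven idea (the actual PoC would use Spark/Scala, Kafka, and GCS; the field names, types, and `table` attribute below are illustrative assumptions, not part of the posting). The point is that the parsing and normalisation logic reads everything it needs from the JSON schema file, so a schema change requires no code change:

```python
import csv
import io
import json

# Hypothetical schema file contents: field name, type, and the 3NF target
# table for each column. All names here are assumptions for illustration.
SCHEMA_JSON = """{
  "fields": [
    {"name": "order_id", "type": "int",    "table": "orders"},
    {"name": "customer", "type": "string", "table": "customers"},
    {"name": "amount",   "type": "float",  "table": "orders"}
  ]
}"""

# Sample CSV data matching the schema above.
CSV_DATA = "1,alice,10.50\n2,bob,20.00\n"

CASTERS = {"int": int, "float": float, "string": str}

def load_schema(schema_json: str):
    """Read the field definitions from the JSON schema file."""
    return json.loads(schema_json)["fields"]

def parse_rows(schema, csv_text: str):
    """Apply the schema to each CSV row by position, so adding,
    removing, or retyping fields in the JSON needs no code change."""
    for row in csv.reader(io.StringIO(csv_text)):
        yield {f["name"]: CASTERS[f["type"]](v)
               for f, v in zip(schema, row)}

def split_3nf(schema, records):
    """Route each field to its target table -- a stand-in for the
    normalisation step the PoC would do in the Kafka streaming layer."""
    tables = {}
    for rec in records:
        for f in schema:
            tables.setdefault(f["table"], []).append(
                {f["name"]: rec[f["name"]]})
    return tables

schema = load_schema(SCHEMA_JSON)
records = list(parse_rows(schema, CSV_DATA))
tables = split_3nf(schema, records)
print(records[0])      # {'order_id': 1, 'customer': 'alice', 'amount': 10.5}
print(sorted(tables))  # ['customers', 'orders']
```

In the real pipeline the same principle applies: the Kafka producer and the Spark/Kafka Streams transformation would build their (de)serializers and table mappings from the JSON schema at runtime, e.g. via Spark's `StructType`, rather than hard-coding column names.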
Skillsets: Big Data (Kafka), Spark/Scala, GCS
Project ID: #17599939
About the project
5 freelancers are bidding an average of £198 for this job
Hello, I have been working with Big Data/Hadoop technologies for years and have experience with the latest Spark, Kafka, Cassandra, Hive/HBase, and ELK stacks using Java, Scala, and Python. Can we talk? Thank you!
Hi, I am Amit. I have experience in Spark, Scala, Kafka, and GCP. I will be able to do the PoC based on the requirement. I would be interested to know more about the schema files that will be placed in GCS. The schema fi… [More]
Key skill set and expertise from several Big Data projects: Spark, Kafka, Spark SQL, GCP, RDBMS, Scala. Proposed solution: use Spark and Scala to import the schema JSON file and create a dynamic schema. Check th… [More]