Stream Processing using Storm and Kafka

In my earlier post, we looked at how Kafka can be integrated with Spark Streaming to process loan data. In the Spark Streaming job, we cleanse the data to remove invalid records before we aggregate it. We could potentially cleanse the data in the pipeline prior to streaming the loan records in … Continue reading Stream Processing using Storm and Kafka
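
As a rough illustration of where such a cleansing step could sit in a Storm pipeline, here is a minimal sketch (not the post's actual code) of a topology that reads loan records from a hypothetical loan-records topic via the Kafka spout and drops records that don't have the expected number of columns before any downstream processing. It assumes the storm-kafka-client module and a broker on localhost:9092.

```java
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.kafka.spout.KafkaSpout;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.TopologyBuilder;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

public class LoanCleansingTopology {

    // Hypothetical bolt that filters out invalid loan records before they
    // reach any downstream aggregation bolt.
    public static class CleansingBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            String record = tuple.getStringByField("value");
            // Keep only records with the expected number of comma-separated columns
            // (the column count here is an assumption for the example).
            if (record != null && record.split(",").length == 5) {
                collector.emit(new Values(record));
            }
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("record"));
        }
    }

    public static void main(String[] args) throws Exception {
        // Kafka spout reading from a hypothetical "loan-records" topic.
        KafkaSpoutConfig<String, String> spoutConfig =
                KafkaSpoutConfig.builder("localhost:9092", "loan-records").build();

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", new KafkaSpout<>(spoutConfig));
        builder.setBolt("cleansing-bolt", new CleansingBolt())
               .shuffleGrouping("kafka-spout");

        // Run locally for a minute, then shut down (local testing only).
        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("loan-cleansing", new Config(), builder.createTopology());
        Thread.sleep(60_000);
        cluster.shutdown();
    }
}
```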

Financial Data Analysis using Kafka and Spark Streaming

In my earlier posts on Apache Spark Streaming, we looked at how data can be processed with Spark to compute aggregations and how it can be stored in a compressed format like Parquet for future analysis. We also looked at how data can be published and consumed using Apache Kafka, which is a distributed message … Continue reading Financial Data Analysis using Kafka and Spark Streaming
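
For reference, here is a minimal Structured Streaming sketch in Java of the Kafka-to-Parquet flow described above. The topic name, output paths, and local broker address are assumptions, and it requires the spark-sql-kafka connector on the classpath; the original post may well use a different API, so treat this only as an outline.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaToParquet {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("loan-stream")
                .master("local[*]")
                .getOrCreate();

        // Read loan records from a (hypothetical) Kafka topic as a streaming Dataset.
        Dataset<Row> loans = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")
                .option("subscribe", "loan-records")
                .load()
                .selectExpr("CAST(value AS STRING) AS record");

        // Persist the records in Parquet for later batch analysis (placeholder paths).
        StreamingQuery query = loans.writeStream()
                .format("parquet")
                .option("path", "/tmp/loans-parquet")
                .option("checkpointLocation", "/tmp/loans-checkpoint")
                .start();

        query.awaitTermination();
    }
}
```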

Apache Kafka – Producers and Consumers

This post provides a quick overview of how to write a Kafka producer and a Kafka consumer with a Kafka broker running locally. First, let's set up the Kafka broker locally by downloading the TAR file and running the required scripts. Another option is to run the Kafka broker locally using a Docker image; however, I'll stick to the … Continue reading Apache Kafka – Producers and Consumers
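
A bare-bones sketch of such a producer and consumer against a broker on localhost:9092 might look like the following; the topic and consumer group names are placeholders, not necessarily the ones used in the post.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerConsumerDemo {
    public static void main(String[] args) {
        // Producer: publish a single message to a topic on the local broker.
        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            producer.send(new ProducerRecord<>("demo-topic", "key-1", "hello kafka"));
        }

        // Consumer: subscribe to the same topic and poll for records.
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "demo-group");
        consumerProps.put("auto.offset.reset", "earliest");
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps)) {
            consumer.subscribe(Collections.singletonList("demo-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}
```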

Introduction to Stream Processing using Apache Spark

In my previous post, we looked at how Apache Spark can be used to ingest and aggregate data using Spark SQL in batch mode. There are different ways to create a Dataset from the raw data, depending on whether the schema of the ingested data is already known in advance (RDD of Java … Continue reading Introduction to Stream Processing using Apache Spark
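
As an illustration of the "schema known in advance" case, the sketch below builds a Dataset from a hypothetical Loan Java bean using a bean encoder, so Spark can infer the schema from the class; the class and field names are made up for the example.

```java
import java.io.Serializable;
import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;

public class DatasetFromBeans {

    // Hypothetical loan record whose schema is known up front.
    public static class Loan implements Serializable {
        private String id;
        private double amount;
        public Loan() {}
        public Loan(String id, double amount) { this.id = id; this.amount = amount; }
        public String getId() { return id; }
        public void setId(String id) { this.id = id; }
        public double getAmount() { return amount; }
        public void setAmount(double amount) { this.amount = amount; }
    }

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("dataset-demo").master("local[*]").getOrCreate();

        // The bean encoder derives the Dataset schema from the Loan class.
        List<Loan> loans = Arrays.asList(new Loan("L1", 1000.0), new Loan("L2", 2500.0));
        Dataset<Loan> ds = spark.createDataset(loans, Encoders.bean(Loan.class));
        ds.printSchema();
        ds.show();

        spark.stop();
    }
}
```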

Analyzing financial data with Apache Spark

With the rise of big data processing in the enterprise world, it's quite evident that Apache Spark has become one of the most popular frameworks for processing large amounts of data, both in batch mode and in real time. This article won't give an overview of Apache Spark since there are already many good … Continue reading Analyzing financial data with Apache Spark
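
For a flavor of the batch-mode processing discussed in the article, here is a small Spark SQL sketch in Java that loads a CSV of loan records and computes a simple aggregation; the file path and column names (grade, amount) are assumptions rather than the article's actual dataset.

```java
import static org.apache.spark.sql.functions.avg;
import static org.apache.spark.sql.functions.col;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class BatchAnalysis {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("financial-batch").master("local[*]").getOrCreate();

        // Load a CSV of loan records (placeholder path and columns) in batch mode.
        Dataset<Row> loans = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("/tmp/loans.csv");

        // Simple aggregation: average loan amount per grade.
        loans.groupBy(col("grade"))
             .agg(avg(col("amount")).alias("avg_amount"))
             .show();

        spark.stop();
    }
}
```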

JWT – Token Based Authentication

In my earlier post on Cryptography, we looked at some of the cryptographic techniques and functions that are commonly used to secure data. In this post, we'll discuss JSON Web Token (JWT), one of the most commonly used token-based authentication mechanisms. It has become quite popular since it allows distributed systems to … Continue reading JWT – Token Based Authentication
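
To make the token structure concrete, the sketch below signs and verifies an HS256 JWT using only JDK classes (three base64url-encoded segments: header.payload.signature). A real service would typically use a library such as jjwt, and the claims and secret here are placeholders for illustration only.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class JwtDemo {
    private static final Base64.Encoder B64 = Base64.getUrlEncoder().withoutPadding();

    // HMAC-SHA256 over the signing input, base64url-encoded.
    private static String hmac(String data, String secret) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
        return B64.encodeToString(mac.doFinal(data.getBytes(StandardCharsets.UTF_8)));
    }

    // Build a compact JWT: base64url(header).base64url(payload).signature
    static String sign(String payloadJson, String secret) throws Exception {
        String header = B64.encodeToString(
                "{\"alg\":\"HS256\",\"typ\":\"JWT\"}".getBytes(StandardCharsets.UTF_8));
        String payload = B64.encodeToString(payloadJson.getBytes(StandardCharsets.UTF_8));
        String signingInput = header + "." + payload;
        return signingInput + "." + hmac(signingInput, secret);
    }

    // Any service sharing the secret can verify the token without a database lookup.
    static boolean verify(String token, String secret) throws Exception {
        String[] parts = token.split("\\.");
        if (parts.length != 3) return false;
        String expected = hmac(parts[0] + "." + parts[1], secret);
        return MessageDigest.isEqual(expected.getBytes(StandardCharsets.UTF_8),
                                     parts[2].getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws Exception {
        String token = sign("{\"sub\":\"user-1\",\"role\":\"admin\"}", "change-this-secret");
        System.out.println(token);
        System.out.println("valid: " + verify(token, "change-this-secret"));
    }
}
```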

AWS – Relational Database Service

Amazon Relational Database Service (RDS) is a fully managed and cost-efficient database service that makes it easy to provision, manage, and scale a relational database in the cloud. Amazon RDS provides an option to choose from the six available relational database engines: commercial (Oracle, Microsoft SQL Server), open source (MySQL, PostgreSQL, MariaDB), and cloud … Continue reading AWS – Relational Database Service
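
Once an instance is provisioned, applications connect to it through its endpoint just like any other database. The JDBC sketch below assumes a MySQL engine, a placeholder endpoint and credentials, and the MySQL Connector/J driver on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RdsConnectionDemo {
    public static void main(String[] args) throws Exception {
        // The endpoint host comes from the RDS console; the values below are placeholders.
        String url = "jdbc:mysql://my-instance.abc123.us-east-1.rds.amazonaws.com:3306/demo_db";
        String user = "admin";
        String password = System.getenv("RDS_PASSWORD"); // avoid hard-coding credentials

        try (Connection conn = DriverManager.getConnection(url, user, password);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT NOW()")) {
            while (rs.next()) {
                System.out.println("Server time: " + rs.getString(1));
            }
        }
    }
}
```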