Learn how to perform anomaly detection using Kafka Streams, with the example of a loan payment website that needs to send an alert whenever a payment is too high.
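The core idea behind that kind of alert is a simple threshold filter over a stream of payments. Here is a minimal sketch of that idea in plain Python (the article itself uses Kafka Streams; the `THRESHOLD` value, field names, and sample records below are hypothetical, not taken from the article):

```python
# Threshold-based anomaly detection: flag any payment above a fixed limit.
# The limit and the payment records are illustrative assumptions.
THRESHOLD = 10_000  # hypothetical alert limit

def detect_anomalies(payments):
    """Yield payments whose amount exceeds THRESHOLD (the 'alert' stream)."""
    for payment in payments:
        if payment["amount"] > THRESHOLD:
            yield payment

stream = [
    {"user": "alice", "amount": 250},
    {"user": "bob", "amount": 42_000},   # suspiciously high
    {"user": "carol", "amount": 9_999},
]
alerts = list(detect_anomalies(stream))
print(alerts)  # only bob's high payment is flagged
```

In Kafka Streams the same stateless filter would run continuously over a topic of payment events rather than over an in-memory list.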
See how to get started with writing stream processing algorithms in Apache Flink by reading a stream of Wikipedia edits and extracting some meaningful data from it.
Some quick stats: 656 million tweets go out per day, and 15,220,700 texts are sent every minute. This makes for LOTS of data. Read on for more shocking stats!
If you've been following software development news recently, you've probably heard about a new project called Apache Flink. I've already written about it a bit...
Variable selection in credit score modeling is critical to finding key information. Learn how to do it and gain a solid understanding of your data!
If you've ever wondered about the difference between machine learning and deep learning, read on for a detailed comparison in simple, layman's terms.
More than a third of the Fortune 500 companies now use Kafka in production — and for good reason. In this article, learn how to track real-time activity using Kafka.
Data scientists are responsible for designing and developing accurate, useful, and stable models. This is especially important when it comes to credit risk models.
GPUs can accelerate the training of machine learning models. In this post, explore the setup of a GPU-enabled AWS instance to train a neural network in TensorFlow.
Learn how data is analyzed and boiled down to a single value — a credit score — using statistical, machine learning, and predictive analytics techniques.
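One common way data gets boiled down to a single score is a logistic-regression scorecard: a weighted sum of borrower features is passed through a logistic function and rescaled to a score range. The weights, feature names, and scaling below are made-up illustrations, not the article's actual model:

```python
import math

# Toy credit scorecard. All weights and feature names are hypothetical.
WEIGHTS = {"payment_history": 2.0, "utilization": -3.0, "account_age_years": 0.1}
BIAS = 0.5

def credit_score(features, lo=300, hi=850):
    """Map a weighted feature sum through a logistic curve to a score in [lo, hi]."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    p_good = 1.0 / (1.0 + math.exp(-z))   # estimated probability of repayment
    return round(lo + (hi - lo) * p_good)  # rescale probability to score range

score = credit_score({"payment_history": 1.0,
                      "utilization": 0.2,
                      "account_age_years": 7})
print(score)  # a single number summarizing the borrower's features
```

Real scorecards are trained on historical repayment data and validated carefully; this sketch only shows the shape of the computation.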
If you're looking to start an AI project but don't know where to begin, check out this article. We've listed the top 12 AI tools, libraries, and platforms, along with what they're typically used for, their pros and cons, and more!
Should you switch to Apache Flink? Should you stick with Apache Spark for a while? Or is Apache Flink just a new gimmick? Get the answers to these and other questions.
The goal of someone learning ML should be to use it to improve everyday tasks, whether work-related or personal. To do that, it's important to first understand the algorithms.