Photo by Deepanker Verma from Pexels

Implementing the PageRank algorithm used by Google in the MapReduce framework (Spark)

Early search engines would crawl the Web and build an inverted index of all the terms found on each page. When you searched for a term, an old search engine would return the pages containing that term and rank them by how frequently the term appeared on each page. Because of the naivety of this simple approach, spammers found an opportunity to exploit search engines and lead people to their pages; for example, a spammer would write an irrelevant term thousands of times (say, movie) where…
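To make the contrast with term-frequency ranking concrete, here is a minimal sketch of the iterative PageRank computation in plain Python on a made-up three-page link graph (the graph and parameter values are illustrative assumptions, not from the article; a Spark version would distribute the same contribution and update steps across a cluster):

```python
# Minimal PageRank sketch on a toy link graph (plain Python).
# DAMPING is the usual 0.85 from the original PageRank paper.
DAMPING = 0.85

def pagerank(links, iterations=20):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    ranks = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Each page sends its rank, split evenly, to the pages it links to.
        contribs = {p: 0.0 for p in pages}
        for page, targets in links.items():
            if targets:
                share = ranks[page] / len(targets)
                for t in targets:
                    contribs[t] += share
        # Damped update: random-jump term plus received contributions.
        ranks = {p: (1 - DAMPING) / len(pages) + DAMPING * c
                 for p, c in contribs.items()}
    return ranks

# Hypothetical link graph: A links to B and C, B links to C, C links to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

Unlike term frequency, a page's rank here depends only on who links to it, which is much harder for a spammer to inflate by stuffing keywords into their own page.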

Photo by Michael Dziedzic on Unsplash

Applying one of the most popular metaheuristic algorithms, Tabu Search, to solve a well-known NP-hard scheduling problem using Python.

We find optimization in all aspects of life around us, from engineering design and industrial manufacturing to choosing our dinner from the menu or planning a trip. In general, we always try to optimize one or more objectives related to the problem, such as profit, performance, cost, quality, or enjoyment. In computer science, AI, and mathematical optimization, a metaheuristic is a technique designed to find an approximate solution when classic methods fail to find an exact solution in a reasonable amount of time (NP-hard problems).
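As a rough illustration of the idea, here is a minimal Tabu Search sketch on a toy single-machine scheduling problem (minimize total tardiness). The job data, swap neighborhood, and tabu tenure below are illustrative assumptions, not the article's actual problem or implementation:

```python
from itertools import combinations

# Hypothetical jobs: job -> (processing_time, due_date)
jobs = {"J1": (4, 4), "J2": (2, 3), "J3": (6, 8), "J4": (3, 5)}

def total_tardiness(order):
    t, tard = 0, 0
    for j in order:
        p, d = jobs[j]
        t += p
        tard += max(0, t - d)
    return tard

def tabu_search(order, iters=50, tenure=3):
    best = list(order)
    current = list(order)
    tabu = {}  # swapped pair -> last iteration at which it is still tabu
    for it in range(iters):
        candidates = []
        for i, k in combinations(range(len(current)), 2):
            pair = tuple(sorted((current[i], current[k])))
            neighbor = list(current)
            neighbor[i], neighbor[k] = neighbor[k], neighbor[i]
            cost = total_tardiness(neighbor)
            # Aspiration: a tabu move is allowed if it beats the best found.
            if tabu.get(pair, -1) < it or cost < total_tardiness(best):
                candidates.append((cost, neighbor, pair))
        if not candidates:
            break
        cost, current, pair = min(candidates, key=lambda c: c[0])
        tabu[pair] = it + tenure  # forbid reversing this swap for a while
        if cost < total_tardiness(best):
            best = current
    return best, total_tardiness(best)

best_order, best_cost = tabu_search(["J3", "J1", "J4", "J2"])
```

The tabu list is what distinguishes this from plain local search: recently reversed swaps are forbidden for a few iterations, so the search can climb out of local optima instead of cycling back into them.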

Photo by Vitaly Vlasov from Pexels

In this tutorial, I will walk you through the fundamentals of data crawling using BeautifulSoup in Python as you write the code from scratch.

If you are a data scientist, engineer, analyst, or just someone who collects data as a hobby, you will often need to create your own dataset, despite the huge number of datasets available on the internet, by harvesting the messy, sprawling, wild web. To do so, you need to familiarize yourself with what we call web scraping, crawling, or harvesting.

Objective: Using the BeautifulSoup library in Python, create a bot that crawls the names of private universities, along with the URLs of their home pages, in a user-specified country and downloads them as an xlsx file.
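The core extraction step might look something like the sketch below. The HTML snippet and the `div.university` structure are made-up stand-ins for whatever page the bot actually targets, and the xlsx export (e.g. via openpyxl or pandas) is left out:

```python
from bs4 import BeautifulSoup

# Hypothetical page fragment listing universities as links.
html = """
<div class="university"><a href="https://uni-a.example.edu">University A</a></div>
<div class="university"><a href="https://uni-b.example.edu">University B</a></div>
"""

soup = BeautifulSoup(html, "html.parser")

# Collect (name, URL) pairs from each university block.
universities = [
    (a.get_text(strip=True), a["href"])
    for div in soup.find_all("div", class_="university")
    for a in div.find_all("a", href=True)
]
```

In the real bot, `html` would come from an HTTP response for the user's chosen country, and the resulting list of pairs would be written out row by row as the xlsx file.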

We will be…

My opinion on why Reinforcement Learning is superior to Supervised Learning when it comes to Financial Markets.

We can all agree that financial markets are at the heart of our modern economy, and there is no doubt that they provide an important avenue for the sale and purchase of assets such as bonds, stocks, foreign exchange, and derivatives. However, to make profits from such markets, investors must study the market's complicated and highly volatile environment. …

Microsoft Servers

In this article I will give you a gentle introduction to the programming model Hadoop uses for parallel processing.

Before we dive into MapReduce, let's talk a bit about parallel processing, which is the main purpose of using MapReduce, and about how this programming model eases the task of parallel processing.

Note: Throughout the article I may refer to a machine (computer) as a processor, node, or unit; just know that they all refer to the same thing.

Parallel Processing

Dealing with massive datasets requires efficient processing methods that reduce run time and cost, in addition to overcoming memory constraints. Parallel processing is one of the methods data scientists rely on most for compute- and data-intensive tasks…
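To give a feel for the model before the article proper, here is a word-count sketch of the map, shuffle, and reduce phases in plain Python; a real framework like Hadoop would run many mappers and reducers on different machines in parallel, and the input lines here are made up for illustration:

```python
from collections import defaultdict

lines = ["big data big compute", "data pipelines"]

# Map phase: emit a (word, 1) pair for every word in every line.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle phase: group the emitted values by key, as the framework
# does automatically between the map and reduce phases.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: sum the counts for each word.
counts = {word: sum(vals) for word, vals in groups.items()}
```

The programmer only writes the map and reduce functions; partitioning the input, shuffling intermediate pairs, and collecting results is what the framework handles, which is exactly why the model makes parallel processing easy.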

Taylan Kabbani

Data Scientist
