Anomaly detection is a set of methods for identifying points or trends that deviate unusually, in terms of variance or distance, from the distribution of observed data. Anomaly detection techniques have been widespread in industry since Shewhart introduced control charts into production processes in the early 1900s. A cluster of points or a pattern forming unusually on one side of the chart indicated a special-cause variation in the production process. With the advent of big data and high dimensional data, more refined methods such as time series analysis…
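The Shewhart-style control-chart idea can be sketched in a few lines of NumPy: flag any point falling outside the mean plus or minus three standard deviations. The simulated data and the injected anomaly below are illustrative assumptions, not from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=50.0, scale=2.0, size=200)  # simulated process measurements
data[120] = 65.0  # inject one obvious anomaly for illustration

# Shewhart-style 3-sigma limits: points outside mean +/- 3*std are flagged
mean, std = data.mean(), data.std()
upper, lower = mean + 3 * std, mean - 3 * std
anomalies = np.where((data > upper) | (data < lower))[0]
print(anomalies)
```

In a real control chart the limits would be estimated from in-control historical data rather than from the batch that contains the suspect points.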

Data Science is a vast field. Mastery of the discipline requires extensive study of its body of knowledge and sustained practice. Below are some of the core statistical ideas that any aspiring data scientist should know. All of the micro-ideas and toy examples below are illustrated using the NumPy Python library.

**Basic Matrix Operations. Linear Algebra is the foundation of machine learning. The basic matrix operations are listed below**

import numpy as np

MatrixA = np.random.rand(10, 10)
MatrixB = np.random.rand(10, 10)
MatrixC = np.zeros((10, 10))

def add_matrix(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Adds two NumPy matrices element-wise."""
    ar, ac = A.shape
    br, bc = B.shape
    assert (ar, ac) == (br, bc), "matrices must have the same shape"
    C = np.zeros((ar, ac))
    for i in range(ar):
        for j in range(ac):
            C[i, j] = A[i, j] + B[i, j]
    return C
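For comparison, NumPy's vectorized operators perform the basic matrix operations directly without explicit loops; a quick sketch (the small arrays here are illustrative):

```python
import numpy as np

A = np.arange(4).reshape(2, 2)   # [[0, 1], [2, 3]]
B = np.ones((2, 2), dtype=int)   # [[1, 1], [1, 1]]

print(A + B)   # elementwise addition
print(A * B)   # elementwise (Hadamard) product
print(A @ B)   # matrix multiplication
```

The vectorized forms are both shorter and far faster than hand-written Python loops, since the work happens in compiled code.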

Cellular computation, or cellular automata, has often been used to probe the question of whether randomness actually exists in nature as a principle or is simply the result of interactions between the most basic units of complexity.

Cellular automata can be thought of as the most basic representation of computation in a complex environment. Some of these representations have actually been shown to be Turing complete. Imagine creating the most basic form of computational life to understand and imitate the complex behavior of natural processes and the multi-level interactions between different entities. The initial ideas were worked out by Von Neumann; however, the…
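A one-dimensional elementary cellular automaton makes the idea concrete. The sketch below implements Rule 30, a rule often cited for producing seemingly random patterns from a trivially simple update; the grid size and step count are arbitrary choices for illustration.

```python
# Minimal elementary cellular automaton (Rule 30) on a ring of cells.
RULE = 30

def step(cells):
    """Update every cell from its (left, self, right) neighborhood."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        pattern = (left << 2) | (center << 1) | right  # neighborhood as a 3-bit index
        out.append((RULE >> pattern) & 1)              # look up that bit of the rule number
    return out

cells = [0] * 31
cells[15] = 1  # single live cell in the middle
for _ in range(8):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Swapping the `RULE` constant (0 to 255) gives any of the 256 elementary automata, including Rule 110, the one famously proven Turing complete.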

Much has been written and discussed about the truly world-changing possibilities of OpenAI’s GPT-3 for society. It is a powerful autoregressive language model which uses deep learning to produce human-like text, often indistinguishable from text written by human authors. *The New York Times* has described it as “by far the most powerful language model ever created.” An *MIT Technology Review* article by W. D. Heaven called it “shockingly good — and completely mindless.” GPT-3 stands for “Generative Pre-trained Transformer 3.” Transformers are language models trained on sequential data to encode and decode the language into an abstract…

Due to the component-based nature of React and the introduction of React hooks, the functional programming paradigm is becoming popular in the React community. By treating components more as functions than as state-carrying objects, and with the popularity of Redux, there is an accelerating shift toward fully functional approaches. The major advantages of this approach include writing less code overall and easier debugging and testing, since a pure function by definition produces no side effects. For large codebases, these properties are really useful. The approach also forces us to focus on writing pure functions over immutable data, which better supports asynchronous programming…

Network theory, a subset of graph theory, is at the heart of several recent computer science advancements, from neural networks to social networks to biology-related improvements in gene networks. Despite many advances, complex network analysis is still a relatively young field, and many problems arising from interactions between the systems and sub-systems of the future remain to be solved. This article discusses the most common types of problems solved in network analysis and graph theory and applies them to simple and complex scenarios. …

Management of inventory and stock items is one of the fundamental business processes in any organization. Because many sub-processes are linked to inventory, optimizing the inventory process becomes crucial. Naturally, businesses have different inventory management sub-processes depending on the industry, such as optimal selection of stock items, setting reorder levels, and setting up warehouse procedures. Still, Microsoft Excel remains one of the most favored tools for keeping track of inventory and setting up inventory modeling scenarios, especially for small to mid-sized businesses.
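As a toy illustration of one such sub-process, the classic reorder-point rule fits in a few lines of Python. The demand, lead-time, and safety-stock figures below are made-up assumptions, not values from the article.

```python
# Classic reorder-point formula:
#   ROP = average daily demand * lead time (in days) + safety stock
# All numbers here are illustrative assumptions.
def reorder_point(daily_demand: float, lead_time_days: float, safety_stock: float) -> float:
    """Stock level at which a replenishment order should be placed."""
    return daily_demand * lead_time_days + safety_stock

rop = reorder_point(daily_demand=40, lead_time_days=5, safety_stock=60)
print(rop)  # reorder when on-hand stock falls to this level → 260.0
```

The same formula is what typically sits behind an Excel reorder-level cell; the spreadsheet and the code differ only in where the arithmetic lives.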

Continuing from my last article on inventory simulation, we will develop a Python-based SimPy tool…

Production simulation for industrial processes has been around since the 1950s and is a powerful technique for testing business conditions, logic, and process changes. Traditionally, simulations were done using spreadsheet tools like Excel or statistics packages such as SPSS or Minitab. In this blog, we will use the discrete event simulation technique to model and test inventory scenarios. Discrete event simulation, or DES, creates events at specific points within a given time frame, which are then processed by blocks of processes according to business conditions; this distinguishes it from continuous simulation…
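The article builds its simulation with SimPy; as a minimal standard-library sketch of the underlying event-queue idea, events can sit in a priority queue keyed by simulation time and be processed in order. The demand times and quantities below are illustrative assumptions.

```python
import heapq

# Discrete-event core: a priority queue of (time, kind, quantity) events.
# The clock jumps from event to event instead of ticking continuously.
events = []
for t, qty in [(2, 5), (5, 3), (9, 7)]:  # illustrative demand events
    heapq.heappush(events, (t, "demand", qty))

stock, log = 10, []
while events:
    time, kind, qty = heapq.heappop(events)  # advance the clock to the next event
    stock -= qty                             # process the demand against inventory
    log.append((time, stock))

print(log)  # → [(2, 5), (5, 2), (9, -5)]
```

SimPy wraps this same pattern in generator-based processes and an `Environment` clock, so model logic reads as ordinary sequential code.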

Mathematicians seem to be fascinated by prime numbers, and by the count of prime numbers in particular. Prime numbers are said to have been studied by the ancient Greeks around 300 BC. Interestingly, one of the best early methods for finding prime numbers was discovered by a Greek known as Eratosthenes. Gauss, Legendre, and a myriad of other mathematicians later expanded on these works.
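Eratosthenes' method, the sieve, is short enough to sketch directly: cross out every multiple of each prime, and whatever survives is prime.

```python
def sieve(n: int) -> list[int]:
    """Sieve of Eratosthenes: return all primes <= n."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # cross out every multiple of p, starting at p*p
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [i for i, flag in enumerate(is_prime) if flag]

print(sieve(30))  # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Starting the crossing-out at p*p works because any smaller multiple of p has a prime factor below p and was already removed.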

The prime number theorem provides a way to approximate the number of primes less than or equal to a given number n. This value is called π(n), where π is the “prime counting function.” For example, π(10) = 4, since four primes…
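The theorem's approximation, π(n) ≈ n / ln(n), can be checked numerically against an exact count; a quick sketch using trial division (fine for small n):

```python
import math

def prime_count(n: int) -> int:
    """Exact pi(n) by trial division (adequate for small n)."""
    count = 0
    for k in range(2, n + 1):
        if all(k % d for d in range(2, int(k ** 0.5) + 1)):
            count += 1
    return count

n = 1000
exact = prime_count(n)       # pi(1000) = 168
approx = n / math.log(n)     # prime number theorem estimate, ~144.8
print(exact, round(approx, 1))
```

The estimate undershoots at small n; the ratio of the two quantities tends to 1 as n grows, which is the content of the theorem.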