Lecture notes of Professor Stéphane Mallat - Collège de France - Paris
Resilience and prospective resilience of complex networks
Arithmetic coding library with statistical models such as PPM and context mixing (demonstrating core principles of probabilistic inference and ensemble learning in AI)
Signal analysis tool featuring FFT (fast Fourier transform), fractal dimension, entropy, and numerical differentiation and integration
Algorithm implementations & theory summaries for Cryptography - fit@hcmus
A tool for analysing how malware and/or goodware samples differ from each other, using Shannon entropy, Hausdorff distance, and Jaro-Winkler distance
Content-aware scaling using the seam carving method, with different algorithms for energy mapping
Passphrase generator
This application calculates the entropy of a string. Its core is a function called "entropy", which takes a text sequence as a parameter and returns its entropy value. Entropy is a measure of the uncertainty in a random variable.
Calculate the Shannon entropy of the provided file.
Network analysis of Twitter trends.
Splicing Diversity Analysis for Transcriptome Data
A sophisticated web application for text analysis and Shannon Entropy calculation.
Information-theoretic social neuroscience analyses, particularly for hyperscanning paradigms
Uniswap Transaction Analysis Repository: Layer-1 and Layer-2 Transaction Measures
A simple Python script to compute and compare the entropy of WhatsApp chats
This application calculates the entropy of text; in information theory, "entropy" refers to the Shannon entropy, a measure of the uncertainty in a random variable. The included example computes the entropy of the sequence "TTTAAGCC" (see the sketch below, after the listing).
entro.py calculates basic informational parameters for any input string.
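
Several of the entries above estimate the Shannon entropy of a piece of text from its character frequencies. The following is a minimal Python sketch of that computation, not the code of any particular repository listed here; the function name shannon_entropy is illustrative, and the example input is the "TTTAAGCC" sequence mentioned above.

import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    # Estimate the Shannon entropy, in bits per symbol, from the
    # empirical character frequencies of the input string.
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Frequencies in "TTTAAGCC": T = 3/8, A = 2/8, C = 2/8, G = 1/8,
# giving an entropy of about 1.906 bits per symbol.
print(shannon_entropy("TTTAAGCC"))

The same estimator applies to a whole file (as in the file-oriented entries above) by passing the bytes returned by open(path, "rb").read() to Counter instead of a string.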