Monday, January 26, 2026
Links for Information Lecture 2
Information Basics
"The heart of his theory is a simple but very general model of communication: A transmitter encodes information into a signal, which is corrupted by noise and then decoded by the receiver. Despite its simplicity, Shannon’s model incorporates two key insights: isolating the information and noise sources from the communication system to be designed, and modeling both of these sources probabilistically. He imagined the information source generating one of many possible messages to communicate, each of which had a certain probability. The probabilistic noise added further randomness for the receiver to disentangle." Full article @ Quanta Magazine.
Predicting Entropy Game Demo. Refining the Estimated Entropy of English by Shannon Game Simulation. Shannon's calculations of entropy of English.
Prediction and Entropy of Languages Wolfram Demo
An experimental estimation of the entropy of English, in 50 lines of Python code
Letter frequency in English
Word and Letter Frequency in English
Entropy of English.
Text Mechanic - Text Manipulation Tools
Shuffle Letters and Narakeet.
Using Information Theory to Solve Wordle.
1952 – “Theseus” Maze-Solving Mouse @ cyberneticzoo.com
Claude Shannon Demonstrating Theseus
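As a companion to the entropy-of-English links above, here is a minimal sketch of a unigram (letter-frequency) entropy estimate, computing H = -Σ p·log₂p over the letters of a text. This is not the linked 50-line implementation; the function name and sample string are illustrative only. Shannon's own figures put unigram entropy of English around 4 bits per letter, falling toward roughly 1 bit per letter once longer-range context is taken into account.

```python
from collections import Counter
import math

def unigram_entropy(text):
    """Estimate per-letter (unigram) entropy in bits: H = -sum p * log2(p)."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    # Sum over observed letters only; unseen letters contribute zero.
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sample = "the quick brown fox jumps over the lazy dog"
print(unigram_entropy(sample))
```

A longer, more representative corpus gives a better estimate; a short sample like the pangram above overstates uniformity, since every letter appears at least once.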
Labels: #entropy, #information, #life
Sunday, January 25, 2026
Lecture notes (chapter 1): What is Life?
Welcome to Life-Inspired, ISE483/SSIE583 Spring 2026 Class. On this blog you will see many types of posts related to the class, from links to multimedia materials used in class, to breaking research on related topics. The updated first chapter of the course's lecture notes is now available:
ISE483/SSIE583: What is Life?
You can also listen to the AI-produced podcast or video of this chapter via Google NotebookLM. Note: the AI podcast and video are no substitute for the lecture notes! They bring up many connections that are not really in the argument, and are provided for fun only.

Labels: #artificiallife, #bioinspiredcomputing, #ComplexSystems, #evolutionarysystems, #life