Markov Chains
A Markov chain, as I understand it, is a graph that carries probabilities in its edges. If you’re at a node, the directed edges will tell you the probability of going to each of your neighbors.
The best-known application of this is to generate random text. Load a text file one character at a time and create a node in your graph for each unique letter you see. If you just loaded letter x and now read letter y, add a directed edge from node x to node y. If the edge already exists, increment the count associated with it. At the end you’ll have a graph that encodes each letter pair and its frequency of occurrence. (You can think of normalizing each node’s outgoing edges so that those numbers represent probabilities of going to each of its neighbors, and the sum of each node’s outgoing edge weights is one.)
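The post doesn’t give code, but the counting step above can be sketched in Python (the names `build_chain` and `chain` are my own):

```python
import collections

def build_chain(text):
    # Map each character to a Counter of the characters that follow it.
    # Each count is the weight of a directed edge in the graph.
    chain = collections.defaultdict(collections.Counter)
    for prev, curr in zip(text, text[1:]):
        chain[prev][curr] += 1
    return chain

chain = build_chain("TOTAL")
# T is followed once by O and once by A; A leads to L; L has no outgoing edges.
```

Keeping raw counts rather than normalized probabilities makes incremental updates cheap; you can divide by the node’s total outgoing count whenever you need an actual probability.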
For example, with the input string TOTAL you’d generate a graph with four nodes (T, O, A, L). The T node would have two edges (probability 50% each), one to O and one to A. O would have a single edge to T, A only to L, and L would have no outgoing edges. You could then create text by traveling through this graph, following edges according to their probabilities. The output would sound a lot like the input, and the more input you had the more varied the output would be.
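The random walk described above might look like this, reusing the counting function from before (again a sketch, with names of my own choosing):

```python
import collections
import random

def build_chain(text):
    # Map each character to a Counter of the characters that follow it.
    chain = collections.defaultdict(collections.Counter)
    for prev, curr in zip(text, text[1:]):
        chain[prev][curr] += 1
    return chain

def generate(chain, start, max_length):
    # Walk the graph, picking each next letter with probability
    # proportional to its edge count; stop early at a dead end.
    out = [start]
    node = start
    for _ in range(max_length - 1):
        successors = chain.get(node)
        if not successors:  # no outgoing edges, e.g. L in TOTAL
            break
        node = random.choices(list(successors),
                              weights=list(successors.values()))[0]
        out.append(node)
    return "".join(out)

print(generate(build_chain("TOTAL"), "T", 10))
```

From `T` the walk picks O or A with equal probability, so possible outputs include `TAL`, `TOTAL`, `TOTOTAL`, and so on; every adjacent pair in the output is a pair that occurred in the input.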
Read full article from Markov Chains