Markov Model To Predict The Weather In Python
Introduction
A first-order Markov process is a stochastic process in which the future state depends only on the current state, not on the sequence of states that preceded it. A first-order Markov process is often simply called a Markov process, and when the states take values in a discrete set, the process is called a Markov chain.
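Formally, the first-order Markov property states that P(X(t+1) | X(t), X(t-1), …, X(0)) = P(X(t+1) | X(t)): conditioning on the full history of states gives the same distribution over the next state as conditioning on the current state alone.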
In practice, the Markov assumption is often a useful approximation when modelling complex machine learning and reinforcement learning problems.
In this article, we will implement a Markov model to predict tomorrow's weather based on today's weather.
Background / Interest
This report is a part of the “Artificial Intelligence” course at City University, Dhaka, Bangladesh, conducted by Nuruzzaman Faruqui. This is the best AI course in Bangladesh.
In this course, we learned AI from scratch, starting with basic Python and ending with natural language processing. We covered the theoretical concepts and essential mathematics in the “CSE 417: Artificial Intelligence” course and then applied that knowledge in the lab course “CSE 418: Artificial Intelligence Laboratory”. Through many lab sessions we gradually mastered each of the core concepts of Artificial Intelligence.
Now we can build our own machine learning models, and even neural networks, to solve complex problems.
Problem Statement
To start constructing a Markov chain, we need a transition model that specifies the probability distribution of the next event for each possible value of the current event.
The probability of tomorrow being sunny, given that today is sunny, is 0.8 (and therefore 0.2 for rain). This is reasonable because a sunny day is more likely than not to be followed by another sunny day.
However, if it is rainy today, the probability of rain tomorrow is 0.7 (and 0.3 for sun), since rainy days also tend to follow each other.
Using this transition model, it is possible to sample a Markov chain: start with a day that is either rainy or sunny, sample the next day from the transition probabilities given today's weather, then condition the day after tomorrow on tomorrow, and so on. The result is a Markov chain.
Given this Markov chain, we can now answer questions such as “what is the probability of having four rainy days in a row?”
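Before turning to the library-based implementation, here is a minimal pure-Python sketch of the same transition model. The names (start, transition, sample_next) are illustrative only and not part of any library; the probabilities are the ones given above.

import random

# Starting probabilities and transition model P(tomorrow | today)
start = {"sun": 0.5, "rain": 0.5}
transition = {
    "sun": {"sun": 0.8, "rain": 0.2},
    "rain": {"sun": 0.3, "rain": 0.7},
}

def sample_next(today):
    # Draw tomorrow's weather from today's row of the transition model
    outcomes = list(transition[today])
    weights = list(transition[today].values())
    return random.choices(outcomes, weights=weights)[0]

# Sample a short chain starting from a sunny day
chain = ["sun"]
for _ in range(9):
    chain.append(sample_next(chain[-1]))
print(chain)

# Exact answer to the question above: probability of four rainy days in a row
print(start["rain"] * transition["rain"]["rain"] ** 3)  # 0.5 * 0.7**3 = 0.1715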
Below is an example of how the same Markov chain can be implemented using the pomegranate library.
The Python code to implement the Markov chain
from pomegranate import *

# Define starting probabilities
start = DiscreteDistribution({
    "sun": 0.5,
    "rain": 0.5
})

# Define transition model: P(tomorrow | today)
transitions = ConditionalProbabilityTable([
    ["sun", "sun", 0.8],
    ["sun", "rain", 0.2],
    ["rain", "sun", 0.3],
    ["rain", "rain", 0.7]
], [start])

# Create Markov chain
model = MarkovChain([start, transitions])

# Sample 50 states from the chain
print(model.sample(50))
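If you want the model itself to answer the question posed earlier, pomegranate's MarkovChain (in the 0.x API used above) also provides a log_probability method. The snippet below assumes that method is available and that the sequence probability includes the starting distribution, so treat the exact value as an assumption.

import math

# Probability of four rainy days in a row under the chain defined above
logp = model.log_probability(["rain", "rain", "rain", "rain"])
print(math.exp(logp))  # expected to be about 0.17, i.e. 0.5 * 0.7 ** 3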
Result
alpha@alpha-Inspiron-5570:~/Desktop/AI_LAB$ /usr/bin/python3 /home/alpha/Desktop/AI_LAB/markovmodel.py
['rain', 'rain', 'rain', 'rain', 'sun', 'sun', 'sun', 'sun', 'sun', 'sun', 'sun', 'sun', 'sun', 'sun', 'sun', 'rain', 'rain', 'rain', 'rain', 'rain', 'rain', 'sun', 'sun', 'sun', 'rain', 'rain', 'rain', 'sun', 'sun', 'sun', 'rain', 'sun', 'sun', 'sun', 'sun', 'sun', 'sun', 'rain', 'rain', 'rain', 'rain', 'rain', 'rain', 'rain', 'sun',
'sun', 'sun', 'sun', 'sun', 'sun']
alpha@alpha-Inspiron-5570:~/Desktop/AI_LAB$
We obtained 50 sampled states from this Markov chain.
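If you want to summarize the sample rather than read the raw list, counting the states is straightforward. This is a small illustrative addition, assuming model.sample(50) returns a plain list of state names, as the output above suggests.

from collections import Counter

sample = model.sample(50)
# Count how many of the 50 sampled days are sunny vs. rainy
print(Counter(sample))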
Conclusion
In this lab report, we learned how a Markov model (Markov chain) can be implemented to predict future events based on current information. You can build your own weather-forecasting model, or adapt the approach to other problems, by starting from the code snippet given above. This is a simple implementation from the “CSE 418: Artificial Intelligence Laboratory” course at City University, Dhaka, Bangladesh, conducted by Nuruzzaman Faruqui Sir, which is the best AI course in Bangladesh.
You are free to copy the code from this report and use it, along with these concluding remarks, in your own projects.