
Naive Bayes

Naïve Bayes classifiers are based on Bayes' theorem, which describes the probability of an event based on prior knowledge of conditions related to the event.
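
Stated formally, the theorem gives the probability of an event A given evidence B in terms of the reverse conditional probability and the priors:

```latex
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}
```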

The classifier is called ‘naïve’ because it assumes that the predictor features are independent of one another given the class, which is rarely a realistic assumption, although the classifier often performs well in practice anyway.
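
A toy calculation shows how the independence assumption and Bayes' theorem combine; all of the probabilities below are made-up values for a hypothetical two-word spam filter, not figures from any real dataset:

```python
# Hypothetical spam-filter probabilities (illustrative assumptions only).
p_spam = 0.4                      # prior P(spam)
p_ham = 0.6                       # prior P(ham)

# Naive assumption: word likelihoods multiply, as if independent given the class.
p_words_given_spam = 0.8 * 0.3    # P("free" | spam) * P("win" | spam)
p_words_given_ham = 0.1 * 0.05    # P("free" | ham) * P("win" | ham)

# Bayes' theorem: posterior = likelihood * prior / evidence.
evidence = p_words_given_spam * p_spam + p_words_given_ham * p_ham
posterior_spam = p_words_given_spam * p_spam / evidence
print(round(posterior_spam, 3))
```

Even though the word probabilities are treated as independent, the resulting posterior still ranks the classes sensibly, which is why the naive assumption is tolerated.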

The theorem is named after Thomas Bayes (1701-1761).

Naive Bayes algorithms are used to perform Classification: each input record is assigned to its most probable class.

Naive Bayes algorithms can also be used in conjunction with Decision Trees, for example in an ensemble, for improved accuracy.
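
One way to combine the two is a soft-voting ensemble that averages the predicted class probabilities of a Naive Bayes model and a decision tree; the sketch below uses scikit-learn's VotingClassifier on an illustrative random dataset (the data and tree depth are assumptions, not from this article):

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.tree import DecisionTreeClassifier

# Illustrative discrete-feature data: 60 records, 10 features, 2 classes.
rng = np.random.RandomState(0)
X = rng.randint(6, size=(60, 10))
y = rng.randint(2, size=60)

# Soft voting averages each estimator's predicted probabilities.
ensemble = VotingClassifier(
    estimators=[("nb", MultinomialNB()),
                ("tree", DecisionTreeClassifier(max_depth=3, random_state=0))],
    voting="soft")
ensemble.fit(X, y)
print(ensemble.predict(X[:5]))
```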

Key Aspects

  • processing is non-iterative, so the algorithm is fast

  • the algorithm isn’t a ‘black box’; we can know how it operates internally

  • the algorithm can be used to determine the impact of individual features on results

  • the algorithm depends on using known prior probabilities

  • the algorithm depends on assuming the independence of predictor features
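
The third point can be seen directly in scikit-learn: a fitted MultinomialNB exposes its per-class, per-feature log-probabilities through the feature_log_prob_ attribute. The tiny dataset below is an illustrative assumption:

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Four records of three count features, two classes.
X = np.array([[3, 0, 1],
              [4, 1, 0],
              [0, 5, 2],
              [1, 4, 3]])
y = np.array([0, 0, 1, 1])

model = MultinomialNB().fit(X, y)

# feature_log_prob_[c, i] is the smoothed log P(feature i | class c);
# larger values mean feature i counts more heavily toward class c.
print(model.feature_log_prob_.shape)   # one row per class, one column per feature
```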

Python Example


"""
naive_bayes_multinomial_with_scikit_learn.py
performs classification with discrete features
"""

# Import needed packages.
import numpy as np
from sklearn.naive_bayes import MultinomialNB

# Set parameters.
random_number_seed = 5
maximum_feature_value = 6  # exclusive upper bound: features take values 0-5
number_of_training_feature_records = 6
number_of_prediction_feature_records = 1
number_of_features = 100

# Create a random number generator.
generator = np.random.RandomState(random_number_seed)

# Create model training feature data of 6 groups of 100 numeric features each.
X = generator.randint(maximum_feature_value,
                      size=(number_of_training_feature_records,
                            number_of_features))

# Print the training data.
print("X Feature Training Data:")
print(X)

# Define the classes for training.
y = np.arange(1, number_of_training_feature_records + 1)

# Print the training classes.
print("y Training Classes:")
print(y)

# Instantiate a Naive Bayes model.
model = MultinomialNB()

# Train the model.
model.fit(X, y)

# Create a prediction input.
prediction_input = generator.randint(maximum_feature_value,
                                     size=(number_of_prediction_feature_records,
                                           number_of_features))

# Print the prediction input.
print("Prediction Input:")
print(prediction_input)

# Make a prediction.
prediction = model.predict(prediction_input)

# Print the prediction.
print("Predicted Class:")
print(prediction)

Outputs are below:

X Feature Training Data:
[[3 5 0 1 0 4 3 0 0 4 1 5 0 3 4 5 3 1 4 5 2 1 1 2 1 1 1 2 0 5 2 0 0 4 4 1
  3 3 2 4 1 3 3 2 1 5 4 4 5 3 3 3 4 1 3 3 3 5 1 1 5 0 2 1 0 5 2 5 3 0 5 3
  0 0 4 4 5 2 0 3 0 0 0 2 4 5 3 5 1 4 5 2 4 3 5 0 0 1 4 3]
 [4 1 0 0 2 5 4 3 2 4 1 2 3 4 3 4 3 1 4 2 3 4 1 4 0 2 4 1 2 2 1 3 0 0 0 3
  1 4 4 3 0 2 4 0 0 5 3 3 3 4 0 2 2 1 3 1 5 1 2 3 0 0 5 1 1 1 0 0 1 4 1 3
  4 2 1 5 4 4 2 2 5 1 2 3 5 1 2 4 1 0 1 2 3 0 2 5 2 5 4 3]
 [2 1 5 1 1 5 1 1 0 4 0 5 0 5 5 2 1 3 4 3 3 0 3 3 3 2 5 2 0 3 4 5 1 3 5 3
  3 5 1 1 2 4 2 5 2 4 0 0 1 4 5 3 1 0 3 2 1 0 3 5 4 4 2 1 1 1 3 0 2 4 4 5
  1 3 1 3 5 4 3 3 5 1 0 0 2 5 2 2 0 1 1 0 3 2 2 2 1 5 1 0]
 [2 2 4 4 1 5 4 1 1 5 0 4 1 5 1 3 0 0 5 0 3 2 3 2 1 0 3 0 2 3 4 4 2 5 0 5
  4 5 3 2 1 5 1 0 3 4 4 4 0 0 3 0 2 5 5 2 4 5 5 5 2 2 5 4 5 1 2 5 0 2 3 5
  5 5 2 4 4 3 1 2 0 0 2 1 0 1 4 4 5 5 3 1 1 5 5 4 1 1 2 0]
 [3 5 5 2 4 3 3 3 3 1 1 5 0 1 5 5 2 4 2 2 2 4 4 5 3 1 1 5 0 1 2 1 2 0 1 1
  5 0 1 4 0 1 3 4 5 4 1 0 4 5 1 2 5 5 2 2 3 5 2 2 3 5 3 2 1 5 5 5 0 3 3 1
  5 5 2 0 5 1 5 0 0 3 5 4 4 0 3 1 1 1 1 2 0 4 5 1 3 5 5 2]
 [2 1 4 2 2 2 2 2 5 3 0 4 2 5 3 0 4 4 0 1 0 0 2 4 1 4 1 4 2 5 3 3 2 3 2 0
  3 4 2 0 0 0 1 3 3 2 5 0 3 4 2 3 1 1 0 4 1 5 5 1 4 3 1 2 4 0 3 5 3 1 0 2
  4 1 1 2 2 1 3 4 1 1 1 3 0 4 2 5 1 2 3 2 1 2 5 0 1 0 3 1]]
y Training Classes:
[1 2 3 4 5 6]
Prediction Input:
[[0 2 1 5 4 1 0 5 5 1 4 5 1 3 2 4 0 2 4 0 3 2 4 5 5 1 1 5 4 5 5 0 4 4 1 5
  3 3 2 5 0 4 1 1 1 5 2 1 2 4 0 4 0 2 1 2 3 2 4 0 4 3 4 2 5 1 2 0 0 2 3 5
  0 3 1 4 1 3 0 4 5 0 1 1 4 5 2 0 3 5 3 2 4 3 4 4 5 4 1 4]]
Predicted Class:
[3]
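
Beyond the hard class label, MultinomialNB can also report a probability for every class via predict_proba. The sketch below regenerates the same kind of random data as the example above (same seed and draw order, which is an assumption about reproducibility across NumPy versions):

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB

rng = np.random.RandomState(5)
X = rng.randint(6, size=(6, 100))
y = np.arange(1, 7)

model = MultinomialNB().fit(X, y)
query = rng.randint(6, size=(1, 100))

# One probability per class, summing to 1; the predicted class is the argmax.
probabilities = model.predict_proba(query)
print(probabilities.round(3))
print(model.classes_[probabilities.argmax()])
```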

References