
Conversation

Adhithya-Laxman

Describe your change:

  • Add an algorithm

This pull request adds a Deep Belief Network (DBN) implementation constructed by stacking Restricted Boltzmann Machines (RBMs) trained via contrastive divergence. The implementation uses:

  • Pure NumPy with no external deep learning frameworks.
  • Layer-wise unsupervised pretraining with Contrastive Divergence (CD-k).
  • Gibbs sampling for binary units and manual weight updates for transparency.
  • Comprehensive docstrings including doctests to verify functionality.
  • Usage example demonstrating training on synthetic data and input reconstruction.
  • Conformance with repository coding standards including PEP8, variable naming, and formatting.
  • Type hints added for better static analysis and maintainability.

This generative probabilistic model adds a fundamental unsupervised feature-learning algorithm to the repository, complementing the existing discriminative models.
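
For reviewers unfamiliar with the technique, here is a minimal sketch of the core building block, not the submitted file: a single RBM with binary units trained by CD-1 in plain NumPy. The class name TinyRBM and the hyperparameters are illustrative only.

```python
import numpy as np


class TinyRBM:
    """Illustrative RBM with binary units trained by CD-1 (not the PR's class)."""

    def __init__(
        self, n_visible: int, n_hidden: int, learning_rate: float = 0.1
    ) -> None:
        self.rng = np.random.default_rng(0)
        self.weights = self.rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.visible_bias = np.zeros(n_visible)
        self.hidden_bias = np.zeros(n_hidden)
        self.learning_rate = learning_rate

    @staticmethod
    def sigmoid(input_array: np.ndarray) -> np.ndarray:
        return 1.0 / (1.0 + np.exp(-input_array))

    def sample(self, probabilities: np.ndarray) -> np.ndarray:
        # Bernoulli sampling: 1.0 where a uniform draw falls below the probability.
        return (self.rng.random(probabilities.shape) < probabilities).astype(float)

    def contrastive_divergence(self, visible_zero: np.ndarray) -> float:
        # Positive phase: hidden probabilities and samples driven by the data.
        hidden_probs = self.sigmoid(visible_zero @ self.weights + self.hidden_bias)
        hidden_samples = self.sample(hidden_probs)
        # Negative phase: one Gibbs step down to the visible layer and back up.
        recon_probs = self.sigmoid(hidden_samples @ self.weights.T + self.visible_bias)
        recon_samples = self.sample(recon_probs)
        recon_hidden_probs = self.sigmoid(
            recon_samples @ self.weights + self.hidden_bias
        )
        # CD-1 update: <v h>_data minus <v h>_model, averaged over the batch.
        batch_size = visible_zero.shape[0]
        positive_grad = visible_zero.T @ hidden_probs
        negative_grad = recon_samples.T @ recon_hidden_probs
        self.weights += self.learning_rate * (positive_grad - negative_grad) / batch_size
        self.visible_bias += self.learning_rate * np.mean(
            visible_zero - recon_samples, axis=0
        )
        self.hidden_bias += self.learning_rate * np.mean(
            hidden_probs - recon_hidden_probs, axis=0
        )
        # Mean squared reconstruction error as a rough progress signal.
        return float(np.mean((visible_zero - recon_probs) ** 2))
```

Training then reduces to repeated calls to contrastive_divergence on batches of binary data, watching the reconstruction error fall.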

Fixes #13643


Checklist:



Implement a multi-layer DBN constructed by stacking Restricted Boltzmann Machines trained with contrastive divergence. The implementation uses Gibbs sampling for binary units and manual weight updates with NumPy, without external deep learning frameworks. Includes layer-wise pretraining, a reconstruction method, and visualization of original vs reconstructed samples. This code serves as an educational and foundational contribution for unsupervised feature learning and can be extended for fine-tuning deep neural networks.
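
A sketch of the greedy layer-wise pretraining and the up/down reconstruction pass described above, reusing the illustrative TinyRBM from the sketch in the PR description, so this block is not standalone. The stacking here feeds hidden probabilities upward, which is a simplification of the sampling-and-averaging used in the actual file, and none of the names are the PR's API.

```python
import numpy as np  # requires TinyRBM from the sketch above in the same module


class TinyDBN:
    """Illustrative stack of RBMs pretrained greedily, layer by layer."""

    def __init__(self, layer_sizes: list[int], learning_rate: float = 0.1) -> None:
        # layer_sizes = [n_visible, n_hidden_1, n_hidden_2, ...]
        self.rbms = [
            TinyRBM(n_in, n_out, learning_rate)
            for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:])
        ]

    def pretrain(self, data: np.ndarray, epochs: int = 10) -> None:
        layer_input = data
        for index, rbm in enumerate(self.rbms):
            for _ in range(epochs):
                error = rbm.contrastive_divergence(layer_input)
            print(f"Layer {index + 1}/{len(self.rbms)} reconstruction error: {error:.4f}")
            # Hidden probabilities become the training data for the next RBM.
            layer_input = rbm.sigmoid(layer_input @ rbm.weights + rbm.hidden_bias)

    def reconstruct(self, data: np.ndarray) -> np.ndarray:
        # Deterministic up-pass to the deepest layer, then down-pass back to visibles.
        activations = data
        for rbm in self.rbms:
            activations = rbm.sigmoid(activations @ rbm.weights + rbm.hidden_bias)
        for rbm in reversed(self.rbms):
            activations = rbm.sigmoid(activations @ rbm.weights.T + rbm.visible_bias)
        return activations
```

Comparing data against reconstruct(data) row by row gives the original-versus-reconstructed visualization mentioned above.
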
…iance Performed extensive refactoring to conform to PEP 8 and Ruff linting rules across the entire DBN-RBM implementation.

  • Fixed line lengths and wrapped docstrings for readability.
  • Replaced legacy NumPy random calls with numpy.random.Generator for modern style.
  • Marked unused variables by prefixing them with an underscore to eliminate warnings.
  • Sorted and cleaned import statements.
  • Renamed variables and arguments for proper casing to adhere to style guidelines.
  • Improved code formatting, spacing, and consistency.
  • Added doctests.

No functional changes were introduced, only stylistic and maintainability improvements.
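
For context on the numpy.random.Generator point above, the migration replaces module-level random calls with an explicit, seedable generator object; a small before/after sketch with arbitrary shapes:

```python
import numpy as np

# Legacy module-level API (global hidden state):
legacy_noise = np.random.rand(4, 3)
legacy_weights = np.random.normal(0.0, 0.01, size=(4, 3))

# Modern Generator API: explicit, seedable state carried by an object.
rng = np.random.default_rng(seed=42)
noise = rng.random((4, 3))
weights = rng.normal(0.0, 0.01, size=(4, 3))

# Bernoulli sampling in the RBM style, driven by the same generator.
probabilities = np.full((4, 3), 0.5)
samples = (rng.random(probabilities.shape) < probabilities).astype(float)
```
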
@algorithms-keeper bot added the require descriptive names (This PR needs descriptive function and/or variable names) and require tests (Tests [doctest/unittest/pytest] are required) labels on Oct 21, 2025

@algorithms-keeper bot left a comment


Automated review generated by algorithms-keeper. If there's any problem regarding this review, please open an issue about it.

algorithms-keeper commands and options

algorithms-keeper actions can be triggered by commenting on this PR:

  • @algorithms-keeper review to trigger the checks for only added pull request files
  • @algorithms-keeper review-all to trigger the checks for all the pull request files, including the modified files. As we cannot post review comments on lines not part of the diff, this command will post all the messages in one comment.

NOTE: Commands are in beta and so this feature is restricted only to a member or owner of the organization.

n_visible: int,
n_hidden: int,
learning_rate: float = 0.01,
k: int = 1,

Please provide descriptive name for the parameter: k

self.hidden_bias = np.zeros(n_hidden)
self.visible_bias = np.zeros(n_visible)

def sigmoid(self, x: np.ndarray) -> np.ndarray:

Please provide descriptive name for the parameter: x
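
A rename plus a deterministic doctest would address both this comment and the doctest requests below; input_array matches the name used in the later revision shown further down this page, while the RBM class name in the doctest is an assumption:

```python
def sigmoid(self, input_array: np.ndarray) -> np.ndarray:
    """
    Logistic activation applied elementwise.

    >>> rbm = RBM(n_visible=2, n_hidden=2)
    >>> rbm.sigmoid(np.array([0.0]))
    array([0.5])
    """
    return 1.0 / (1.0 + np.exp(-input_array))
```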

"""
return (self.rng.random(probs.shape) < probs).astype(float)

def sample_hidden_given_visible(

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function sample_hidden_given_visible
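
A doctest that asserts shapes and the binary range of the samples stays deterministic without fixing a seed; the RBM constructor call and the visible parameter name below are assumptions inferred from the snippets in this review:

```python
def sample_hidden_given_visible(
    self, visible: np.ndarray
) -> tuple[np.ndarray, np.ndarray]:
    """
    Return hidden probabilities and binary samples for a batch of visible units.

    >>> rbm = RBM(n_visible=4, n_hidden=3)
    >>> probs, samples = rbm.sample_hidden_given_visible(np.zeros((2, 4)))
    >>> probs.shape == samples.shape == (2, 3)
    True
    >>> bool(np.isin(samples, (0.0, 1.0)).all())
    True
    """
```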

return (self.rng.random(probs.shape) < probs).astype(float)

def sample_hidden_given_visible(
self, v: np.ndarray

Please provide descriptive name for the parameter: v

hid_samples = self.sample_prob(hid_probs)
return hid_probs, hid_samples

def sample_visible_given_hidden(

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function sample_visible_given_hidden

samples = self.sample_prob(probs)
return probs, samples

def sample_v(

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function sample_v

return probs, samples

def sample_v(
self, y: np.ndarray, w: np.ndarray, vb: np.ndarray

Please provide descriptive name for the parameter: y

Please provide descriptive name for the parameter: w

samples = self.sample_prob(probs)
return probs, samples

def generate_input_for_layer(self, layer_index: int, x: np.ndarray) -> np.ndarray:

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function generate_input_for_layer

Please provide descriptive name for the parameter: x

samples.append(x_dash)
return np.mean(np.stack(samples, axis=0), axis=0)

def train_dbn(self, x: np.ndarray) -> None:

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function train_dbn

Please provide descriptive name for the parameter: x

self.layer_params[idx]["vb"] = rbm.visible_bias
print(f"Finished training layer {idx + 1}/{len(self.layers)}")

def reconstruct(self, x: np.ndarray) -> tuple[np.ndarray, np.ndarray, float]:

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function reconstruct

Please provide descriptive name for the parameter: x
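
For reconstruct, checking shapes and return types keeps the doctest deterministic as well, and renaming x to input_data would answer the comment above. The DeepBeliefNetwork name, the layer_sizes argument, the meaning of the three returned values, and the assumption that reconstruction works on an untrained network are all guesses based on this diff, not the PR's exact API:

```python
def reconstruct(
    self, input_data: np.ndarray
) -> tuple[np.ndarray, np.ndarray, float]:
    """
    Pass inputs up the stack of RBMs and back down to reconstruct them.

    >>> dbn = DeepBeliefNetwork(layer_sizes=[6, 4, 2])
    >>> original, reconstruction, error = dbn.reconstruct(np.zeros((3, 6)))
    >>> reconstruction.shape
    (3, 6)
    >>> isinstance(error, float)
    True
    """
```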

@algorithms-keeper bot added the awaiting reviews (This PR is ready to be reviewed) label on Oct 21, 2025
@algorithms-keeper bot removed the require descriptive names (This PR needs descriptive function and/or variable names) label on Oct 21, 2025

@algorithms-keeper bot left a comment


"""
return (self.rng.random(probabilities.shape) < probabilities).astype(float)

def sample_hidden_given_visible(

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function sample_hidden_given_visible

hid_samples = self.sample_prob(hid_probs)
return hid_probs, hid_samples

def sample_visible_given_hidden(

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function sample_visible_given_hidden

vis_samples = self.sample_prob(vis_probs)
return vis_probs, vis_samples

def contrastive_divergence(self, visible_zero: np.ndarray) -> float:

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function contrastive_divergence

"""
return 1.0 / (1.0 + np.exp(-input_array))

def sample_prob(self, probabilities: np.ndarray) -> np.ndarray:

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function sample_prob
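
sample_prob can be doctested deterministically because probabilities of exactly 0.0 and 1.0 fix the outcome regardless of the uniform draw, which always lies in [0, 1); the RBM constructor call is again an assumption:

```python
def sample_prob(self, probabilities: np.ndarray) -> np.ndarray:
    """
    Draw binary samples: 1.0 where a uniform draw falls below the probability.

    >>> rbm = RBM(n_visible=2, n_hidden=2)
    >>> rbm.sample_prob(np.array([0.0, 1.0]))
    array([0., 1.])
    """
```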

rng = np.random.default_rng()
return (rng.random(probabilities.shape) < probabilities).astype(float)

def sample_h(

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function sample_h

samples = self.sample_prob(probs)
return probs, samples

def sample_v(

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function sample_v

samples = self.sample_prob(probs)
return probs, samples

def generate_input_for_layer(

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function generate_input_for_layer

samples.append(x_dash)
return np.mean(np.stack(samples, axis=0), axis=0)

def train_dbn(self, training_data: np.ndarray) -> None:

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function train_dbn

self.layer_params[idx]["vb"] = rbm.visible_bias
print(f"Finished training layer {idx + 1}/{len(self.layers)}")

def reconstruct(

As there is no test file in this pull request nor any test function or class in the file neural_network/deep_belief_network.py, please provide doctest for the function reconstruct

@algorithms-keeper bot added the tests are failing (Do not merge until tests pass) label on Oct 21, 2025
@algorithms-keeper bot removed the require tests (Tests [doctest/unittest/pytest] are required) label on Oct 21, 2025
@algorithms-keeper bot removed the tests are failing (Do not merge until tests pass) label on Oct 21, 2025

Labels

awaiting reviews (This PR is ready to be reviewed)
