
hunkim/word-rnn-tensorflow

Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow.


Top Related Projects

  • karpathy/char-rnn: Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch
  • tensorflow/models: Models and examples built with TensorFlow
  • jcjohnson/torch-rnn: Efficient, reusable RNNs and LSTMs for torch
  • sherjilozair/char-rnn-tensorflow: Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using TensorFlow
  • minimaxir/textgenrnn: Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code.

Quick Overview

The hunkim/word-rnn-tensorflow repository is a TensorFlow implementation of a word-level language model built on multi-layer Recurrent Neural Networks (LSTM, RNN). It lets users train a model on a text corpus and then generate new text from the learned patterns, which makes it useful for natural language processing experiments and creative text generation.

Pros

  • Easy to use and understand implementation of RNNs for text generation
  • Supports both training and sampling (text generation) modes
  • Ships with the tinyshakespeare sample corpus for quick experimentation
  • Built on TensorFlow, a widely used deep learning framework

Cons

  • Limited to word-level modeling, which cannot generate out-of-vocabulary words and needs a large vocabulary (and embedding matrix) for big corpora
  • Requires significant computational resources for training on large datasets
  • May produce nonsensical or repetitive text without careful tuning
  • Not actively maintained, so newer TensorFlow versions may break it (a version-pinning workaround is sketched below)
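
The code relies on pre-2.0 TensorFlow APIs (sessions, tf.get_variable), so on a modern setup you may need an older release. Pinning a 1.x version in a virtual environment is a reasonable workaround (the exact supported version is an assumption; the repo does not document one):

pip install "tensorflow>=1.0,<2"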

Code Examples

  1. Training a model:
python train.py --data_dir=./data/shakespeare --rnn_size=128 --num_layers=2 --batch_size=50 --seq_length=50 --num_epochs=50 --save_every=1000 --learning_rate=0.002

This command trains a model on Shakespeare's text with the given hyperparameters.

  2. Sampling from a trained model:
python sample.py --save_dir=./save --prime="The " -n 1000

This command generates 1000 words of text starting with "The " using the trained model.

  3. Fine-tuning a pre-trained model:
python train.py --init_from=save --data_dir=./data/new_data --rnn_size=128 --num_layers=2 --batch_size=50 --seq_length=50 --num_epochs=10

This command continues training a previously saved model on new data.

Getting Started

  1. Clone the repository:

    git clone https://github.com/hunkim/word-rnn-tensorflow.git
    cd word-rnn-tensorflow
    
  2. Install dependencies:

    pip install -r requirements.txt
    
  3. Prepare your data: place your corpus as a plain-text file in a subdirectory of data (for example data/your_data/input.txt, mirroring the bundled data/tinyshakespeare); a sketch of the word-level preprocessing this implies follows this list.

  4. Train the model:

    python train.py --data_dir=./data/your_data --rnn_size=128 --num_layers=2 --batch_size=50 --seq_length=50 --num_epochs=50
    
  5. Generate text:

    python sample.py --save_dir=./save --prime="Your starting text" -n 1000
    
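As mentioned in step 3, word-level preprocessing amounts to splitting the corpus into words and mapping each word to an integer id. The snippet below is an independent, minimal sketch of that idea (build_vocab is a hypothetical helper, not the repo's own data loader):

    import collections

    def build_vocab(path="data/your_data/input.txt"):
        """Map each word to an integer id, most frequent words first."""
        with open(path) as f:
            words = f.read().split()             # word-level tokens
        counts = collections.Counter(words)      # word frequencies
        vocab = {w: i for i, (w, _) in enumerate(counts.most_common())}
        return vocab, [vocab[w] for w in words]  # lookup table + encoded corpus

    vocab, data = build_vocab()
    print(len(vocab), "unique words,", len(data), "tokens")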

Competitor Comparisons

karpathy/char-rnn

Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch

Pros of char-rnn

  • Character-level modeling allows for more flexibility in generating diverse text
  • Simpler implementation, easier to understand and modify
  • Can generate text in any language without preprocessing

Cons of char-rnn

  • May produce less coherent output compared to word-level models
  • Requires more training data and computational resources
  • Limited context understanding due to character-level processing

Code Comparison

char-rnn (Lua):

local model = nn.Sequential()
model:add(nn.LookupTable(vocab_size, rnn_size))
model:add(nn.LSTM(rnn_size, rnn_size, num_layers))
model:add(nn.Linear(rnn_size, vocab_size))
model:add(nn.LogSoftMax())

word-rnn-tensorflow (Python):

cell = rnn_cell.BasicLSTMCell(rnn_size, state_is_tuple=True)
self.cell = rnn_cell.MultiRNNCell([cell] * num_layers, state_is_tuple=True)
self.initial_state = self.cell.zero_state(batch_size, tf.float32)
softmax_w = tf.get_variable("softmax_w", [rnn_size, vocab_size])
softmax_b = tf.get_variable("softmax_b", [vocab_size])

The code snippets show the core model architecture for both projects. char-rnn uses Lua with the Torch framework, while word-rnn-tensorflow uses Python with TensorFlow. The main difference is the level of abstraction, with word-rnn-tensorflow using higher-level TensorFlow constructs.

tensorflow/models

Models and examples built with TensorFlow

Pros of models

  • Comprehensive collection of various ML models and examples
  • Actively maintained by TensorFlow team with frequent updates
  • Extensive documentation and tutorials for each model

Cons of models

  • Large repository size, potentially overwhelming for beginners
  • May require more setup and configuration for specific tasks
  • Not focused solely on RNN text generation like word-rnn-tensorflow

Code Comparison

word-rnn-tensorflow:

def sample(self, sess, words, vocab, num=200, prime='first all', sampling_type=1, pick=0):
    state = self.cell.zero_state(1, tf.float32).eval()
    for word in prime.split()[:-1]:
        x = np.zeros((1, 1))
        x[0, 0] = vocab.get(word, 0)
        feed = {self.input_data: x, self.initial_state: state}
        [state] = sess.run([self.final_state], feed)

models (text generation example):

def generate_text(model, start_string):
    num_generate = 1000
    input_eval = [char2idx[s] for s in start_string]
    input_eval = tf.expand_dims(input_eval, 0)
    text_generated = []
    temperature = 1.0
    model.reset_states()
    for i in range(num_generate):
        predictions = model(input_eval)

jcjohnson/torch-rnn

Efficient, reusable RNNs and LSTMs for torch

Pros of torch-rnn

  • Utilizes Torch, which is known for its efficiency in deep learning tasks
  • Supports both CPU and CUDA, allowing for GPU acceleration
  • Offers more customization options for network architecture

Cons of torch-rnn

  • Less active maintenance and updates compared to word-rnn-tensorflow
  • Requires Lua knowledge, which may be less common among developers
  • More complex setup process, especially for users new to Torch

Code Comparison

word-rnn-tensorflow:

def sample(self, sess, words, vocab, num=200, prime='first all', sampling_type=1, pick=0):
    state = self.cell.zero_state(1, tf.float32).eval()
    for word in prime.split()[:-1]:
        x = np.zeros((1, 1))
        x[0, 0] = vocab.get(word, 0)
        feed = {self.input_data: x, self.initial_state: state}
        [state] = sess.run([self.final_state], feed)

torch-rnn:

function LSTM:updateOutput(input)
  local h_prev = self.output[{{}, {1, self.hidden_dim}}]
  local c_prev = self.cell
  local i2h = self.i2h:updateOutput(input)
  local h2h = self.h2h:updateOutput(h_prev)
  local all_input_sums = i2h:add(h2h)

The code snippets show differences in syntax and structure between TensorFlow (Python) and Torch (Lua) implementations. word-rnn-tensorflow uses TensorFlow's session-based approach, while torch-rnn employs Torch's object-oriented style.

sherjilozair/char-rnn-tensorflow

Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using TensorFlow

Pros of char-rnn-tensorflow

  • Character-level modeling allows for more fine-grained text generation
  • Simpler implementation, easier to understand and modify
  • Can handle out-of-vocabulary words and generate novel words

Cons of char-rnn-tensorflow

  • May require more training data and computational resources
  • Less coherent output at the word level compared to word-based models
  • Potentially slower inference due to character-by-character generation

Code Comparison

word-rnn-tensorflow:

def sample(self, sess, words, vocab, num=200, prime='first all', sampling_type=1, pick=0):
    state = self.cell.zero_state(1, tf.float32).eval()
    for word in prime.split()[:-1]:
        x = np.zeros((1, 1))
        x[0, 0] = vocab.get(word, 0)
        feed = {self.input_data: x, self.initial_state: state}
        [state] = sess.run([self.final_state], feed)

char-rnn-tensorflow:

def sample(self, sess, chars, vocab, num=200, prime='The ', sampling_type=1):
    state = sess.run(self.cell.zero_state(1, tf.float32))
    for char in prime[:-1]:
        x = np.zeros((1, 1))
        x[0, 0] = vocab[char]
        feed = {self.input_data: x, self.initial_state: state}
        [state] = sess.run([self.final_state], feed)

Both implementations use similar approaches for sampling, but char-rnn-tensorflow operates on individual characters, while word-rnn-tensorflow processes whole words.
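
The practical difference is easiest to see in how the prime text is tokenized before being fed to the network; a minimal illustration (not code from either repo):

prime = "The quick brown"
char_tokens = list(prime)    # char-rnn-tensorflow: one RNN step per character
word_tokens = prime.split()  # word-rnn-tensorflow: one RNN step per word
print(char_tokens)           # ['T', 'h', 'e', ' ', 'q', ...]
print(word_tokens)           # ['The', 'quick', 'brown']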

minimaxir/textgenrnn

Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code.

Pros of textgenrnn

  • Easier to use with a higher-level API
  • Supports both word-level and character-level text generation
  • Includes pre-trained models for quick start

Cons of textgenrnn

  • Less flexible for customization compared to word-rnn-tensorflow
  • May have lower performance on specific tasks due to generalization

Code Comparison

textgenrnn:

from textgenrnn import textgenrnn

textgen = textgenrnn()
textgen.train_from_file('path/to/file.txt', num_epochs=1)
textgen.generate()

word-rnn-tensorflow:

python train.py --data_dir=data/tinyshakespeare --save_dir=save

The textgenrnn API is more concise and user-friendly, while word-rnn-tensorflow is driven from the command line and offers more control over the training process, at the cost of more setup and configuration.


README

word-rnn-tensorflow


Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow.

Mostly reuses code from https://github.com/sherjilozair/char-rnn-tensorflow, which was inspired by Andrej Karpathy's char-rnn.

Requirements

TensorFlow and the other packages listed in requirements.txt (pip install -r requirements.txt).

Basic Usage

To train with default parameters on the tinyshakespeare corpus, run:

python train.py

To sample from a trained model, run:

python sample.py

To sample with beam search, pass --pick 2. Beam search can be further tuned with the --width parameter, which sets the number of beams (candidate sequences) kept at each step. For example:

python sample.py --pick 2 --width 4

Sample output

Word-RNN

LEONTES:
Why, my Irish time?
And argue in the lord; the man mad, must be deserved a spirit as drown the warlike Pray him, how seven in.

KING would be made that, methoughts I may married a Lord dishonour
Than thou that be mine kites and sinew for his honour
In reason prettily the sudden night upon all shalt bid him thus again. times than one from mine unaccustom'd sir.

LARTIUS:
O,'tis aediles, fight!
Farewell, it himself have saw.

SLY:
Now gods have their VINCENTIO:
Whipt fearing but first I know you you, hinder truths.

ANGELO:
This are entitle up my dearest state but deliver'd.

DUKE look dissolved: seemeth brands
That He being and
full of toad, they knew me to joy.

Char-RNN

ESCALUS:
What is our honours, such a Richard story
Which you mark with bloody been Thilld we'll adverses:
That thou, Aurtructs a greques' great
Jmander may to save it not shif theseen my news
Clisters it take us?
Say the dulterout apy showd. They hance!

AnBESS OF GUCESTER:
Now, glarding far it prick me with this queen.
And if thou met were with revil, sir?

KATHW:
I must not my naturation disery,
And six nor's mighty wind, I fairs, if?

Messenger:
My lank, nobles arms;

Beam search

Beam search differs from the other --pick options in that it does not greedily pick single words; rather, it expands the most promising nodes and keeps a running score for each beam.
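
As an illustration of that idea (a generic sketch under simplified assumptions, not this repo's implementation), each step extends every live beam with its candidate next words, scores each extension by cumulative log-probability, and prunes back down to --width hypotheses:

import math

def beam_search(step_fn, start, width=4, length=10):
    # step_fn(seq) must return (token, probability) pairs for the next position;
    # in this repo's setting it would wrap a session run returning the softmax
    # distribution over the next word.
    beams = [(0.0, [start])]  # (cumulative log-prob, token sequence)
    for _ in range(length):
        candidates = []
        for score, seq in beams:
            for tok, p in step_fn(seq):
                candidates.append((score + math.log(p), seq + [tok]))
        # keep only the `width` highest-scoring hypotheses
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:width]
    return max(beams, key=lambda c: c[0])[1]  # best-scoring sequence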

Word-RNN (with beam search)

# python sample.py --prime "KING RICHARD III:" -n 100 --pick 2 --width 4

KING RICHARD III:
you, and and and and have been to be hanged, I am not to be touched?

Provost:
A Bohemian born, for tying his own train,
Forthwith by all that converses more with a crow-keeper;
I have drunk, Broach'd with the acorn cradled. Follow.

FERDINAND:
Who would not be conducted.

BISHOP OF ELY:
If you have been a-bed an acre of barren ground, hath holy;
I warrant, my lord restored of noon.

ISABELLA:
'Save my master and his shortness whisper me to the pedlar;
Money's a medler.
That I will pamper it to complain.

VOLUMNIA:
Indeed, I am

Word-RNN (without beam search)

# python sample.py --prime "KING RICHARD III:" -n 100

KING RICHARD III:
marry, so and unto the wind have yours;
And thou Juliet, sir?

JULIET:
Well, wherefore speak your disposition cousin;
May thee flatter.
My hand will answer him;
e not to your Mariana Below these those and take this life,
That stir not light of reason.
The time Lucentio keeps a root from you.
Cursed be his potency,
It was my neighbour till the birth and I drank stay.

MENENIUS:
Here's the matter,
I know take this sour place,
they know allegiance Had made you guilty.
You do her bear comfort him between him or our noble bosom he did Bolingbroke's

Projects

If you have any project using this word-rnn, please let us know; we'll list it here.

Contribution

Your comments (issues) and PRs are always welcome.