Mushroom Gravy Recipe from Chat GPT
From Chat GPT. To make a lightweight, thin mushroom gravy in a crockpot, you're likely aiming for a smooth, savory mushroom-based gravy that can be used as a sauce or topping. Here's a simple recipe to make mushroom gravy in a crockpot:

Ingredients:
- 2 cups fresh mushrooms (such as cremini or button), sliced
- 1 medium onion, finely chopped
- 2 cloves garlic, minced
- 2 tablespoons butter or olive oil
- 1 cup vegetable broth (or chicken broth for a non-vegetarian option)
- 1 cup milk or heavy cream (for a lighter version, use milk or a non-dairy option)
- 2 tablespoons flour (or cornstarch for a gluten-free version)
- 1 teaspoon soy sauce (optional, for umami)
- Salt and pepper to taste
- Fresh thyme or rosemary (optional)

Instructions:
1. Sauté the Aromatics (Optional): In a pan, melt the butter or heat the olive oil over medium heat. Add the chopped onions and cook for 3-4 minutes until soft. Add the garlic and cook for another 1-2 minutes until fragrant.
2. Prepare the Mushroom Base: Place the sl...
Calculating parameters and what that means - straight from Chat GPT
This is a question put to Chat GPT about Sebastian's book on building large language models from scratch. First, this data is interesting.

124 million parameters:

GPT_CONFIG_124M = {
    "vocab_size": 50257,     # Vocabulary size
    "context_length": 1024,  # Context length
    "emb_dim": 768,          # Embedding dimension
    "n_heads": 12,           # Number of attention heads
    "n_layers": 12,          # Number of layers
    "drop_rate": 0.1,        # Dropout rate
    "qkv_bias": False        # Query-Key-Value bias
}

The 1.5 billion parameter GPT model config:

GPT_CONFIG_1558M = {
    "vocab_size": 50257,     # Vocabulary size
    "context_length": 1024,  # Context length
    "emb_dim": 1600,         # ...
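To see where the 124M figure comes from, here is a minimal sketch (my own tally, not code from the book) that adds up the parameters implied by GPT_CONFIG_124M, assuming the standard GPT-2 layout: learned token and position embeddings, transformer blocks that each hold Q/K/V projections, an attention output projection, a 4x-wide feed-forward layer, and two LayerNorms, plus a final LayerNorm and an output head.

# Tally the parameter count implied by a GPT-2-style config.
GPT_CONFIG_124M = {
    "vocab_size": 50257, "context_length": 1024, "emb_dim": 768,
    "n_heads": 12, "n_layers": 12, "drop_rate": 0.1, "qkv_bias": False,
}

def count_params(cfg):
    V, C, E, L = (cfg["vocab_size"], cfg["context_length"],
                  cfg["emb_dim"], cfg["n_layers"])
    tok_emb = V * E                       # token embedding table
    pos_emb = C * E                       # learned positional embeddings
    qkv = 3 * E * E + (3 * E if cfg["qkv_bias"] else 0)  # Q, K, V projections
    attn_out = E * E + E                  # attention output projection (+ bias)
    ffn = (E * 4 * E + 4 * E) + (4 * E * E + E)  # 4x-wide feed-forward (+ biases)
    norms = 2 * (2 * E)                   # two LayerNorms (scale + shift) per block
    block = qkv + attn_out + ffn + norms
    final_norm = 2 * E                    # LayerNorm before the output head
    out_head = V * E                      # untied output projection
    return tok_emb + pos_emb + L * block + final_norm + out_head

total = count_params(GPT_CONFIG_124M)
tied = total - GPT_CONFIG_124M["vocab_size"] * GPT_CONFIG_124M["emb_dim"]
print(f"untied: {total:,}")   # 163,009,536
print(f"tied:   {tied:,}")    # 124,412,160

With an untied output head this comes to about 163 million parameters; sharing the output head's weights with the token embedding, as GPT-2 does, drops it to about 124.4 million, which is where the "124M" name comes from.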
Testing a multi-machine Hadoop cluster
So I was testing the setup of a Hadoop cluster and running MapReduce. There are preliminary notes in the first post. One key point: I ran all commands over SSH as that user.

Set up the user

Main machine:

sudo usermod --shell /bin/bash mainhdfs

Start OpenSSH and switch to that user:

sudo apt-get install openssh-server
sudo systemctl enable ssh --now
sudo systemctl start ssh
su - mainhdfs

Put Hadoop in place:

sudo mkdir /usr/local/hadoop
sudo mv hadoop-3.3.6 /usr/local/hadoop/
sudo chown -R mainhdfs:hadoop /usr/local/hadoop

Set up more on SSH (passwordless login for that user):

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
chmod 700 ~/.ssh

Write the following in .bashrc and .bash_profile for that user:

export HADOOP_HOME=/usr/local/hadoop/hadoop-3.3.6
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export HADOOP_CONF_...
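The post doesn't show which MapReduce job was used for the test, but a common smoke test on a fresh cluster is a Hadoop Streaming word count. Here is a minimal Python sketch (wordcount.py and the HDFS paths below are my own hypothetical names, not from the post):

#!/usr/bin/env python3
# wordcount.py -- run as "wordcount.py map" (mapper) or "wordcount.py reduce" (reducer)
import sys

def mapper():
    # Emit "word<TAB>1" for every whitespace-separated token on stdin.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop Streaming sorts mapper output by key, so equal words arrive adjacent.
    current, count = None, 0
    for line in sys.stdin:
        word, _, n = line.rstrip("\n").rpartition("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()

It can be submitted with something like mapred streaming -files wordcount.py -input /user/mainhdfs/in -output /user/mainhdfs/out -mapper "python3 wordcount.py map" -reducer "python3 wordcount.py reduce" (the streaming jar also lives under $HADOOP_HOME/share/hadoop/tools/lib if you prefer invoking it via hadoop jar).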