Posts

Showing posts from 2025

Updates on AGI Library Numenta and Thousand Brains

 Review of projects:
 - Java version: https://github.com/berlinbrown/htm.java
 - HTM community: https://github.com/htm-community
 - htm.core: https://github.com/htm-community/htm.core
 - Thousand Brains Project: https://www.youtube.com/watch?v=VqDVUWgJQPI and https://github.com/thousandbrainsproject/feat.dynamic_resizing
 - Jobs: https://thousandbrains.org/company/careers/open-source-software-engineer/

Let's go for AGI

 Posed a question to ChatGPT; here is the response from OpenAI's ChatGPT. ... That is really interesting — you’re describing something like a "human-like AI operator" that interacts with the world only through the same physical channels a person would. Instead of APIs, direct memory access, or digital hooks, it would:
 - See through a camera (screen scraping instead of data feeds).
 - Hear through microphones (parsing audio instead of reading text).
 - Act by pressing keys, moving a mouse, or even flipping physical switches.
 This would create an AI that is forced to deal with the world in the same messy way humans do, where perception and action are mediated by physical hardware. Some implications:
 Advantages:
 - Robustness — it can interact with any system humans use, regardless of APIs or locked digital environments.
 - Security sandboxing — because it only uses physical I/O, it’s less likely to bypass boundaries in unsafe ways.
 - Human-aligned learning — its training match...
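The perceive/act restriction above can be sketched in a few lines. This is a toy illustration only; all class and method names here are hypothetical (no real screen-capture or input library is used), and a real operator would parse camera frames rather than a character grid:

```python
# Hypothetical sketch of a "human-like AI operator": the agent may only
# perceive a rendered screen and act through simulated key presses.
from dataclasses import dataclass, field

@dataclass
class Screen:
    """Stand-in for a camera frame of a monitor (here: a grid of chars)."""
    pixels: list

@dataclass
class HumanIOOperator:
    """An agent restricted to human I/O channels: see pixels, press keys."""
    key_log: list = field(default_factory=list)

    def perceive(self, screen: Screen) -> str:
        # "Screen scraping": the agent parses pixels, never reads an API.
        return "".join("".join(row) for row in screen.pixels)

    def act(self, key: str) -> None:
        # The only output channel is a simulated key press.
        self.key_log.append(key)

op = HumanIOOperator()
frame = Screen(pixels=[["O", "K"]])
text = op.perceive(frame)
if text == "OK":            # decide based only on what was "seen"
    op.act("ENTER")
print(op.key_log)           # record of physical-style actions: ['ENTER']
```

The point of the sketch is the interface boundary: everything the agent knows arrives through `perceive`, and everything it does leaves through `act`.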

Good Meeting Ajug - Apache Iceberg

 Good meeting of the Atlanta Java Users Group: https://www.meetup.com/atlantajug/events/310453214/ "Pratik Patel is a Java Champion, developer advocate at Azul Systems, and has written three books on programming." The talk: "AI-Powered Data Exploration: Interacting with Apache Iceberg via Spark and LLMs."

More AGI Links Like Thousand Brains Project and Numenta Links

 Have to read the book: Jeff Hawkins, A Thousand Brains: A New Theory of Intelligence. Explains the "Thousand Brains Theory" in an accessible but deep way; great for grounding AGI ideas in cortical theory.
 Architectures, AGI projects, and platforms:
 - OpenCog Hyperon: an open-source AGI framework using symbolic + probabilistic reasoning.
 - SingularityNET / Cogito: an AGI-focused decentralized AI project.
 - Nengo (by Applied Brain Research): a Python framework for large-scale brain modeling, often used with Eliasmith's work.

Curious on LLM from Scratch - Chapter 5 Training Example

 Running the Chapter 5 training example. This is an interesting example, and without fully understanding it, I asked GitHub Copilot and ChatGPT about the code. ... Here it is running, along with updates to the code.
 The model has approximately 124 million parameters (hence the name), primarily from:
 - Position embeddings: context_length × emb_dim = 256 × 768 parameters
 - Transformer layers: 12 layers, each with attention heads and feed-forward networks
 Even though "The Verdict" is only 3,600 words:
 - The model is designed to learn general language patterns, not just memorize the text.
 - The vocabulary size (50,257 tokens) matches GPT-2's full vocabulary.
 - The architecture follows the smaller GPT-2 specification (768-dim embeddings, 12 layers).
 This is the total number of unique tokens the model can recognize and generate. The full vocabulary (50,257 tokens) gives it the capacity to represent many words it never sees in training. It's bas...
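As a sanity check on the ~124M figure, the components listed above can be tallied directly. A rough sketch (the bias conventions per layer are an assumption, so the exact book total may differ slightly):

```python
# Back-of-the-envelope parameter count for a GPT-2-small-style config:
# emb_dim=768, 12 layers, vocab 50,257, context_length 256.
vocab_size, context_len, emb, n_layers = 50_257, 256, 768, 12
ffn = 4 * emb                                   # feed-forward hidden size

tok_emb = vocab_size * emb                      # token embedding table
pos_emb = context_len * emb                     # learned position embeddings

attn = 3 * emb * emb + (emb * emb + emb)        # Q,K,V (no bias) + out proj
mlp = (emb * ffn + ffn) + (ffn * emb + emb)     # two linear layers with bias
norms = 2 * 2 * emb                             # two LayerNorms (scale+shift)
per_block = attn + mlp + norms

total = tok_emb + pos_emb + n_layers * per_block + 2 * emb  # + final norm
print(f"{total:,} parameters")                  # ~124M with the output head
                                                # tied to the token embedding
```

Untying the output head would add another `vocab_size * emb` (about 38.6M) parameters, which is why published totals for this architecture vary.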

Bela

 Bela. Thank you for your attention to this matter.

Random Code GPT from Scratch Book

 Looking at the code from Sebastian Raschka on LLMs, here is a small snippet: training a model from scratch and a basic test. The key is understanding the basics. Here is the text to train on: "I HAD always thought Jack Gisburn rather a cheap genius--though a good fellow enough--so it was no great surprise to me to hear that, in the height of his glory, he had dropped his painting, married a rich widow, and established himself in a villa on the Riviera. (Though I rather thought it would have been Rome or Florence.)" See: https://pytorch.org/ https://github.com/openai/tiktoken

    import matplotlib.pyplot as plt
    import os
    import torch
    import urllib.request
    import tiktoken
    from torch.utils.data import Dataset

    class GPTDatasetV1(Dataset):
        def __init__(self, txt, tokenizer, max_length, stride):
            self.input_ids = []
            self.target_ids = []
            # Tokenize the entire text
            token_ids = tokenizer.enco...
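The idea behind that dataset class is a sliding window over the token stream, where each target sequence is the input shifted right by one token. A minimal pure-Python sketch of just that windowing logic (no torch or tiktoken; the whitespace "tokenizer" and function name are illustrative, not Raschka's):

```python
def make_windows(token_ids, max_length, stride):
    """Return (input, target) chunks; target is input shifted by one token."""
    inputs, targets = [], []
    for i in range(0, len(token_ids) - max_length, stride):
        inputs.append(token_ids[i : i + max_length])
        targets.append(token_ids[i + 1 : i + max_length + 1])
    return inputs, targets

text = "I HAD always thought Jack Gisburn rather a cheap genius"
vocab = {w: i for i, w in enumerate(sorted(set(text.split())))}
ids = [vocab[w] for w in text.split()]       # toy whitespace tokenization

x, y = make_windows(ids, max_length=4, stride=4)
print(x[0], y[0])   # target window = input window shifted by one position
```

With `stride` equal to `max_length` the windows don't overlap; a smaller stride produces overlapping training examples from the same text, which is how a tiny corpus like "The Verdict" can still yield many batches.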

Random Finds from Blogger Search

 So, just random searches: who is still using Blogger/Blogspot? https://returntoatl.blogspot.com/ https://diego-pacheco.blogspot.com/

Dark Souls from FromSoftware is great

 Dark Souls 1 and 2 are great games.

From Chat GPT on Distributed AI

 Yes, absolutely — the idea of building and training AI models using distributed computing, especially with tiny or resource-constrained devices like Raspberry Pis, is possible in concept and increasingly viable with the right technology stack. There are a few real-world technologies and approaches that relate to what you're describing.
 🔹 What You’re Describing: You're imagining a world where:
 - A small, local device (like a Raspberry Pi) can train or fine-tune a small model.
 - These devices are networked, forming a distributed system.
 - They collaboratively train a model or share knowledge (like a federated system).
 This is an intersection of: distributed computing, federated learning, TinyML, and edge AI.
 🔧 Technologies and Concepts to Look Into
 1. Federated Learning: a decentralized approach to training models where each device trains on its own data locally, and only updates (not raw data) are sent to a central server or aggregator. Used by: G...
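The federated-learning loop described above (local training, then server-side averaging of updates only) can be shown with a toy FedAvg-style sketch. This is a deliberately tiny illustration with one scalar "weight" per device and a quadratic local loss; function names are illustrative, and real frameworks handle full model tensors, sampling, and secure aggregation:

```python
def local_update(weight, data, lr=0.1):
    """One local step: nudge the weight toward this device's data mean
    (gradient descent on the loss 0.5 * (w - mean)^2)."""
    grad = weight - sum(data) / len(data)
    return weight - lr * grad

def fed_avg(weights):
    """Server step: aggregate by simple averaging. Note that only the
    trained weights travel; the raw data never leaves the devices."""
    return sum(weights) / len(weights)

global_w = 0.0
device_data = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]  # stays on each device

for _ in range(200):                        # communication rounds
    local = [local_update(global_w, d) for d in device_data]
    global_w = fed_avg(local)

print(round(global_w, 2))                   # -> 3.5, the overall data mean
```

Each Raspberry Pi would play the role of one `device_data` entry: it trains locally and ships back only its updated weight, yet the averaged global model still converges toward what training on the pooled data would give.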

Thousand Brains Project AI

 https://github.com/thousandbrainsproject ... Reverse engineering the neocortex 🧠 to revolutionize AI 🤖. An open-source initiative.

Asked Chat GPT - What is next for AI LLMS?

 I asked what is next for AI LLMs; here is the response:
 1. Better Search and Retrieval (RAG). Holy grail: turn every piece of company knowledge into something easily searchable, understandable, and actionable. LLMs are increasingly being integrated with retrieval-augmented generation (RAG) systems, which combine a company's internal documents (PDFs, wiki pages, logs) with conversational LLMs. Goal: create “company copilots” — tools that answer employee/customer questions using internal knowledge.
 🧠 2. Enterprise-Grade Agents and Automation. Holy grail: use AI agents to autonomously take action on behalf of employees or customers. Example use cases:
 - Automate Tier-1 and Tier-2 support tickets
 - Handle complex workflows in finance, HR, legal, IT
 Companies want agents that are safe, traceable, and reliable, not just “smart.”
 💬 3. Natural Language Interfaces to Software. Holy grail: let employees use natural language to interact with business systems (SQL databases, CRMs, analytics, etc.). Think: ...
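The RAG pattern from point 1 fits in a few lines: retrieve the internal documents most relevant to the question, then prepend them to the LLM prompt. In this toy sketch the retriever is naive word overlap and the LLM call is omitted; a production system would use embeddings, a vector store, and chunking (all names here are illustrative):

```python
def score(question, doc):
    """Crude relevance: count of shared lowercase words."""
    q, d = set(question.lower().split()), set(doc.lower().split())
    return len(q & d)

def retrieve(question, docs, k=2):
    """Return the k documents most similar to the question."""
    return sorted(docs, key=lambda d: score(question, d), reverse=True)[:k]

docs = [
    "Expense reports are due by the 5th of each month.",
    "The VPN requires multi-factor authentication.",
    "Holiday schedule: the office closes for one week in December.",
]

question = "When are expense reports due each month?"
context = "\n".join(retrieve(question, docs))
prompt = f"Answer using only this context:\n{context}\n\nQ: {question}"
print(prompt)   # the retrieved policy text now grounds the model's answer
```

This is the whole "company copilot" idea in miniature: the model answers from retrieved internal text rather than from its frozen training data, which is what makes the answers current and auditable.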

Went down the AI LLM rabbit hole, asking questions. All responses from Chat GPT

 Here are responses from ChatGPT, just random questions and responses from chatgpt.com.
 What's Already in the Model? When you ask me a question, I first try to answer it based on what I already "know." I was trained on a huge amount of data up until mid-2024, which includes: Wikipedia, gaming wikis (like Fextralife and others), walkthroughs, forums (like Reddit, GameFAQs), and guides and FAQs. So for a lot of questions—especially about older games like Dark Souls 1 (2011)—the answer is already in the model. That means I can often recall details like boss names, soul drops, recommended level ranges, item and spell descriptions, and lore connections. If that had been enough, I would’ve just given you the info directly.
 🌐 Step 2: Using the Web (Real-Time Search). In your case, the question was very detailed—you wanted:
 - All bosses & mini-bosses
 - Who give 4,000+ souls
 - Exact soul amounts
 - Recommended player level to fight them
 Even though I remembered some of that from training, it wasn’t en...