How to Work with AI-Generated Code You Don't Understand with ghx and Gemini

AI is amazing for generating code, but often leaves us staring at functions and libraries we've never seen before. This creates a huge problem: When things break (and they will break), we're lost. We don't understand the individual pieces, let alone how they interact. Knowing one part isn't enough; we need to see how unfamiliar components work together.

The solution? Real-world examples.

This lesson introduces ghx, a tool I built to do exactly that. ghx searches GitHub, finding actual code examples that use the specific combinations of functions and libraries that are tripping you up. Instead of generic documentation, you get concrete implementations. You can find the ghx repository at https://github.com/johnlindquist/ghx.

Here's the workflow we cover:

Identify the Unknown: Pinpoint the AI-generated code you don't understand – specific functions, libraries, or their interactions.

Search with ghx: Use ghx to search GitHub for examples that use those terms together. ghx finds repos where those elements coexist (see the command sketch after this list).

Consolidate Examples: ghx combines the relevant code snippets from those examples into a single, digestible file.

Ask the AI (Again, but Better!): Feed this consolidated code to an AI (like a coding-focused model in an AI studio). Now, instead of asking general questions, you can ask targeted questions based on real-world usage:

"Explain how these functions are used together in these examples."

"What are some common use cases for this combination of libraries?"

"Show me alternative code snippets that achieve the same result."

"Can you identify potential problems or areas for improvement, based on these examples?"

Iterate: Use your improved understanding to make changes, then repeat the process to debug.
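As a minimal sketch of what steps 2 and 3 look like in practice (the query terms here are hypothetical stand-ins for whatever appears in your generated code, the `-e` flag comes from the transcript below, and `ghx --help` lists the exact options on your install):

```sh
# Search GitHub for files that use two unfamiliar terms together,
# limited here to TypeScript files. "createClient revalidatePath" is an
# illustrative query; substitute the terms from your own generated code.
ghx "createClient revalidatePath" -e .ts

# ghx gathers every matching snippet into a single markdown file and
# opens it in your editor, ready to paste into an AI chat along with
# the targeted questions listed above.
```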

By leveraging ghx and targeted AI prompting, we go from being lost in a sea of unfamiliar code to actively learning from how others have solved similar problems. We gain a deeper understanding of our AI-generated codebase, making debugging, modification, and future development far more manageable, and giving us the tools to understand, debug, and improve our own code.


Transcript

[00:00] Install ghx globally: pnpm i (or npm, or whatever) @johnlindquist/ghx. Then once that's installed, you'll have access to ghx, and what this can do is take code you're completely unfamiliar with. Say, for example, this is code I've generated to help script ways of interacting with the menus on Mac. But this is all generated by AI and I have no idea how any of it works. So I can grab any of these terms, like this one, paste that in there, and maybe let's grab this, paste that in, and then I'm going to limit this with "-e" to an extension, .m for Objective-C, since this is Objective-C code. You could do that with .ts for TypeScript or any other file extension.
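As a rough sketch of the commands being run here (the API names below are illustrative macOS Accessibility identifiers standing in for the terms grabbed on screen, and whether `-e` wants the leading dot may vary, so check `ghx --help`):

```sh
# Install ghx globally once (pnpm shown; npm i -g works the same way)
pnpm i -g @johnlindquist/ghx

# Search for two unfamiliar terms together, limited to Objective-C files.
# AXUIElementRef and AXUIElementCopyAttributeValue are stand-ins for
# whatever terms your generated code actually contains.
ghx "AXUIElementRef AXUIElementCopyAttributeValue" -e .m
```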

[00:39] Hit enter, and then it'll go out and search GitHub for examples that have those words in them. Once it's done finding all of those examples, it creates a giant markdown file (by default it automatically opens in your editor) containing all of the examples it could find with those terms in them. So if I search for this term, you'll see there are 99 results showing how other people have used this keyword, function name, whatever, in their code. And you can see all of these snippets and all of these examples of how other people have used it. What I'll almost always do is take all of this, open AI Studio, and paste it all in.

[01:17] Remember, we get 2 million tokens in here, and we did find 50 results, which is the default limit; I'll show how to bump that up in a second. But for right now, just scroll down to the bottom. I'll copy this term (again, these are terms I'm super unfamiliar with), and this one, and ask it to explain how they work together, and I can start dictating now that I've pasted these in: "I'm brand new to Objective-C, I don't know how these work, I'm building some Mac utilities, and I'd like to have a better understanding of how these things work together. So based on all these examples I provided, please provide code snippets with comments explaining what's going on, help me avoid common mistakes, show me some other scenarios I can explore, and maybe come up with some cool ideas of how these could work together."

[02:00] And then on and on; you can brainstorm and throw out any ideas that come to mind. I always like to turn grounding on, just so it can search the web for the most recent stuff. I'll leave it on Gemini 2 Pro and hit run. This was streaming in, and after 37 seconds it's all done, and you can see some of the code examples it spat out.

[02:17] Just going back to the beginning here, it lists the core concepts: this represents a UI element, like a handle or a pointer to a window, button, or text field, and this can read information about the UI element. Then some examples of how they work together, breaking down the frontmost application, focused windows, window titles, and on and on, and some ideas: a window information tool, window management, cross-application scripting. So just from those two basic terms and from real user examples, we can make our understanding much more real and based on what people are actually building, instead of what the AI thinks people are building based on how it was trained. Now, if you run ghx --help, you'll see all of the options we have.

[03:00] The main ones you'll use are basically the limit, which defaults to 50 results; the extension, like the .m I showed for Objective-C (you could use .ts for TypeScript and so on); and the filename, which is super helpful. For example, pass framer-motion as the search term, set the filename to package.json with a limit of 200, and then just let that run. It'll sit there and work for quite a while, finding all these examples and then putting them all together. And one thing I haven't mentioned: this is just based on the GitHub CLI, so this is all free.
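Roughly, those options look like this on the command line. The long flag names here are assumptions based on what's described in the video, not confirmed spellings, so run `ghx --help` for the exact options on your version:

```sh
# List every available option
ghx --help

# Bump the result limit above the default of 50 and filter by extension
# (query terms are illustrative; flag names assumed, confirm with --help)
ghx "NSMenu NSStatusItem" -e .m --limit 100

# Find package.json files that mention framer-motion, up to 200 results,
# to see which libraries people commonly use alongside it
ghx framer-motion --filename package.json --limit 200
```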

[03:35] It just uses your GitHub token from the GitHub CLI. This opened in our editor here, and you can see all these package.json files which have framer-motion inside of them. So you can see what sort of packages people are using alongside it, like Zod, Chakra UI, Emotion, and clsx, to help you explore what other people are using for their projects. And this can give you some really good context for how certain libraries operate together, so your own generated projects make way more sense. Maybe your AI picked a certain library, and you could ask it, "Well, have you tried or considered these as alternatives?"

[04:09] I found they work with framer-motion more often. And as you find more and more real-world examples, you get much stronger feedback to give to the AIs that generate code for you.