NotebookLM started as a Google Labs experiment, so I tested other Labs projects to see how they stack up

NotebookLM started off as a Google Labs experiment, under the code name “Project Tailwind” in 2023. If you aren’t familiar with Google Labs, it’s basically Google’s experimental playground, where the company tests out early-stage AI tools and features before deciding whether they’re worth rolling out widely.

Though NotebookLM is one of the experiments that survived and is now widely available for anyone to try out, Google has a long list of other Labs projects that the company is experimenting with. After testing out NotebookLM’s competitors, I figured it was only fair to see how the rest of Google’s Labs experiments stack up.


Illuminate

Like NotebookLM’s Audio Overviews, but better

For the first Labs experiment I properly tried out, I figured it made sense to go with something that felt close to NotebookLM. That’s where Illuminate, a Google Labs experiment that’s “dedicated to fostering learning” and can turn research papers into AI-generated audio summaries (like NotebookLM’s Audio Overviews), comes in.

When you first head to Illuminate’s website, you’ll have the option to listen to a conversation about a few research papers curated by Google. A play icon, along with the duration of the Audio Overview, is displayed right below each title. Upon clicking it, you’ll hear an AI-generated conversation between two virtual hosts, one male and one female.

There’s also a small hand icon at the bottom (like the Raise Hand icon in Meet). Clicking it pulls up a Q&A section where you can ask questions related to the paper being discussed. It also shows example prompts to help you get started, and gives three follow-up questions after each query to keep things going.

The answers are clear and to the point. Like NotebookLM, Illuminate only references the source (the research paper, in this case) to answer your queries. This means the chances of the AI simply telling you what you want to hear, even if it’s inaccurate, are slim to none. For instance, I asked Illuminate what XDA is, and it said:

Thanks for the question. I can only answer questions directly related to the content.

Other than listening to the already-created Audio Overviews on Illuminate, you can create your own by switching to the Generate section. You can paste the URL of any web content, and Illuminate will convert it into an AI podcast, as long as it isn’t paywalled. Illuminate gives you a lot more control here than NotebookLM does, which I appreciate.


Learn About

An actual AI study buddy

Google's Learn About tool

Though NotebookLM isn’t limited to students, and I use it for various other tasks like preparing for job interviews, I’d be lying if I said I don’t primarily use it for studying. I’m a full-time student, and studying isn’t easy; NotebookLM makes it a lot more manageable.

Another Google Labs experiment focused squarely on learning is Learn About. It’s an AI-powered learning companion that’s meant to act like a personal tutor. When you head to the experiment’s website, the first page asks, “What would you like to learn today?”, and you can type in any topic you’d like to study.

To test it out, I figured the best approach was to see how it would teach me a topic I was already familiar with. Since I’m a computer science major, I typed in “teach me the basic syntax of Python.” Within seconds, the AI tool responded with an in-depth answer containing multiple interactive elements.

In this example, the answer included an interactive list of the “Key Elements of Python Syntax,” which showed a relevant image of each element and a short snippet.

Google Labs Key Elements of Python Syntax

And if I wanted to learn more about a certain element, all I had to do was click on it. The answer included examples of basic Python syntax, and even a learning card called “Stop and Think.”

The learning card was probably what I liked most about Learn About. It basically presented a fact about Python syntax and then posed a question. After you stopped and thought about it, clicking the Tap and reveal button would display the answer.

Google Labs Stop and Think card

As with NotebookLM and Gemini, citations appeared next to each claim, and hovering over a citation revealed the exact text Learn About pulled the information from.

I decided to click on one of the interactive elements from the list I mentioned earlier, and it explained that particular element (indentation) in depth with a table (that included the rule, explanation, and a conceptual example) and images.
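To give a sense of the rule Learn About was explaining, here’s a minimal sketch of my own (not taken from the tool’s answer) showing why Python indentation matters: correctly indented code runs, while mis-indented code fails before it even executes.

```python
# Python uses indentation, not braces, to mark blocks.
# The body of this function must be indented under `def`:
def greet(name):
    return f"Hello, {name}!"

print(greet("Ada"))

# The same logic with the body flush against the margin won't even compile.
# Using compile() lets us catch the error without crashing the script:
bad_source = "def greet(name):\nreturn name\n"
try:
    compile(bad_source, "<example>", "exec")
except IndentationError as err:
    print("IndentationError:", err.msg)
```

This mirrors the “consequence of incorrect indentation” Learn About asks about in its comprehension check: the interpreter rejects the code outright rather than guessing where the block was supposed to end.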

Here’s another tidbit I loved: the explanation included a “Common misconception” learning card, which, as the name suggests, called out a common misconception.

There was also a Comprehension check button at the bottom, which said:

Comprehension check

Now that we’ve covered the rules of Python indentation, try explaining in your own words why it’s so important and what happens if it’s incorrect.

Why is indentation critical in Python, and what is the consequence of incorrect indentation?

Below the question was a text box where I could type my answer and get feedback on it; Learn About assessed my response and highlighted its strengths and weaknesses. After every question I asked, it suggested related content, which I found really helpful. There was also the option to simplify the explanation, get a more in-depth answer, or view related images.

What I loved about Learn About is how interactive and aesthetically pleasing the tool was. It made learning new information a lot more intuitive. And it’s easy to tell it’s made for the sole purpose of learning something new and supporting active learning, rather than just spitting out information like other AI chatbots.

The ability to adjust the complexity of the explanation, get instant feedback, and explore visuals all in one place made it feel less like I was using a chatbot and more like I was inside a digital study room built just for me.


Little Language Lessons

Like Duolingo, but with an AI twist

Google's Little Language Lessons

The Little Language Lessons collection contains three “bite-sized learning experiments,” all powered by Google’s multimodal large language model (LLM), Gemini. The best part is that each one feels genuinely fresh; I haven’t seen anything quite like them before.

The first bite-sized experiment within this collection is Tiny Lessons, where you can describe a situation (like ordering coffee) and it’ll compile useful vocabulary, phrases, and grammar tips in a language of your choice.

For instance, some vocabulary words it suggested for me, both in English and the language I chose, were: coffee, milk, sugar, hot, cold, and more. It suggested genuinely useful phrases like “Give me a coffee,” “How much does it cost?,” and “Do you have iced coffee?”

Google labs hindi for ordering coffee

Lastly, it offered a grammar tip: the AI explained that politeness is key when ordering coffee, and that adding “please” in the language I had chosen would be the way to go.

If you travel often and feel like you sound too formal when speaking a foreign language, the next bite-sized experiment, Slang Hang, is something you’ll love. It lets you “generate a realistic conversation between native speakers” based on a randomized scenario. For instance, this is the scenario I got:

Setting: A bustling outdoor farmers market in Moscow on a crisp autumn morning. Anya, a young art student, is sketching the vendors when she overhears a conversation between Dmitri, a middle-aged farmer, and a customer.

Though the entire conversation will be in the language you’re learning, you can translate it to your native language as well. You can also click the speaker icon to listen to the pronunciation, which I found incredibly helpful.

The last experiment, Word Cam, is best for when you can’t think of words in another language for things right in front of you. All you need to do is snap a photo, and Gemini will detect the objects in the image and label them in your target language.

Word Cam Google Labs experiment

It’ll also include additional words you can use to describe the objects, which can really help when learning new languages.


Google Labs is definitely up to something

The three experiments I walked through above are just some of the projects Google is currently working on. You can find many more in the Experiments section of the Google Labs website. After trying these and several others, I can say with complete confidence that Google is up to something, and that it didn’t just get lucky with NotebookLM.

There’s real thought and experimentation going into each of these tools. Even if some feel a bit early or niche, they’re all pushing toward the same goal: making learning and information access smarter, more interactive, and a lot less boring.
